Science.gov

Sample records for 2 log likelihood

  1. Transfer Entropy as a Log-Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
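
    A minimal sketch of the Gaussian special case mentioned above (my own illustration, not the paper's code): transfer entropy from Y to X is estimated as half the log ratio of residual variances of a restricted autoregression of X on its own past versus a full regression that also includes the past of Y; 2N times that estimate is the log-likelihood ratio statistic, compared against a chi-square null. The lag order and simulated data are assumptions.

```python
import numpy as np
from scipy import stats

def transfer_entropy_gaussian(x, y, p=1):
    """Estimate TE(Y -> X) for lag order p via nested least-squares regressions."""
    n = len(x)
    X_past = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    Y_past = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    target = x[p:]
    ones = np.ones((len(target), 1))
    restricted = np.hstack([ones, X_past])        # past of X only
    full = np.hstack([ones, X_past, Y_past])      # past of X and Y
    rss_r = np.sum((target - restricted @ np.linalg.lstsq(restricted, target, rcond=None)[0]) ** 2)
    rss_f = np.sum((target - full @ np.linalg.lstsq(full, target, rcond=None)[0]) ** 2)
    te = 0.5 * np.log(rss_r / rss_f)              # transfer entropy estimate (nats)
    llr = 2.0 * len(target) * te                  # log-likelihood ratio statistic
    pval = stats.chi2.sf(llr, df=p)               # degrees of freedom = number of added lags
    return te, llr, pval

rng = np.random.default_rng(0)
y = rng.standard_normal(2000)
x = np.zeros(2000)
for t in range(1, 2000):                          # X is driven by the past of Y
    x[t] = 0.4 * x[t - 1] + 0.5 * y[t - 1] + rng.standard_normal()
print(transfer_entropy_gaussian(x, y, p=1))
```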

  2. Fast inference in generalized linear models via expected log-likelihoods.

    PubMed

    Ramirez, Alexandro D; Paninski, Liam

    2014-04-01

    Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting "expected log-likelihood" can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally-challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina.
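
    A rough sketch of the core idea as I read it, under assumptions of my own (a Poisson GLM with exponential link and zero-mean Gaussian covariates, which are not a transcription of the paper): the data-dependent sum of exp(x_i·θ) in the exact log-likelihood is replaced by N times its expectation over the covariate distribution, which has a closed form in the Gaussian case.

```python
import numpy as np

def exact_loglik(theta, X, y):
    eta = X @ theta
    return y @ eta - np.sum(np.exp(eta))          # Poisson log-likelihood up to a constant in y

def expected_loglik(theta, X, y, cov):
    # Replace sum_i exp(x_i . theta) with N * E[exp(x . theta)];
    # for zero-mean Gaussian covariates, E[exp(x . theta)] = exp(theta' C theta / 2).
    n = X.shape[0]
    return y @ (X @ theta) - n * np.exp(0.5 * theta @ cov @ theta)

rng = np.random.default_rng(1)
d, n = 5, 20000
C = np.eye(d) * 0.05
X = rng.multivariate_normal(np.zeros(d), C, size=n)
theta_true = rng.normal(scale=0.5, size=d)
y = rng.poisson(np.exp(X @ theta_true))
print(exact_loglik(theta_true, X, y), expected_loglik(theta_true, X, y, C))
```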

  3. Fast inference in generalized linear models via expected log-likelihoods

    PubMed Central

    Ramirez, Alexandro D.; Paninski, Liam

    2015-01-01

    Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting “expected log-likelihood” can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally-challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina. PMID:23832289

  4. A sampling approach to estimate the log determinant used in spatial likelihood problems

    NASA Astrophysics Data System (ADS)

    Pace, R. Kelley; Lesage, James P.

    2009-09-01

Likelihood-based methods for modeling multivariate Gaussian spatial data have desirable statistical characteristics, but the practicality of these methods for massive georeferenced data sets is often questioned. A sampling algorithm is proposed that exploits a relationship involving log-pivots arising from matrix decompositions used to compute the log determinant term that appears in the model likelihood. We demonstrate that the method can be used to successfully estimate log-determinants for large numbers of observations. Specifically, we produce a log-determinant estimate for a 3,954,400 by 3,954,400 matrix in less than two minutes on a desktop computer. The proposed method involves computations that are independent, making it amenable to out-of-core computation as well as to coarse-grained parallel or distributed processing. The proposed technique yields an estimated log-determinant and associated confidence interval.
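
    The log-pivot relationship the abstract refers to can be stated compactly: for a symmetric positive-definite matrix A with Cholesky factor L (A = LL'), log det(A) = 2·Σ log L_ii. The paper's contribution is estimating that sum from a sample of pivots for very large spatial matrices; the small sketch below only verifies the underlying identity and is not the sampling algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
W = rng.standard_normal((n, n))
A = W @ W.T + n * np.eye(n)                 # symmetric positive-definite test matrix
L = np.linalg.cholesky(A)
logdet_pivots = 2.0 * np.sum(np.log(np.diag(L)))   # sum of log-pivots
_, logdet_exact = np.linalg.slogdet(A)
print(logdet_pivots, logdet_exact)          # agree to numerical precision
```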

  5. Utterance Verification Using State-Level Log-Likelihood Ratio with Frame and State Selection

    NASA Astrophysics Data System (ADS)

    Kwon, Suk-Bong; Kim, Hoirin

This paper proposes an utterance verification system using state-level log-likelihood ratios with frame and state selection. Hidden Markov models serve both as acoustic models for speech recognition and as anti-phone models for utterance verification. Each hidden Markov model has three states, and each state represents different characteristics of a phone. We therefore propose an algorithm that computes state-level log-likelihood ratios and weights the states to obtain a more reliable confidence measure for recognized phones. Additionally, we propose a frame selection algorithm that computes the confidence measure only on frames of the input that contain actual speech. In general, the phone segmentation produced by a speaker-independent speech recognition system is not accurate, because triphone-based acoustic models are difficult to train well enough to cover diverse pronunciations and coarticulation effects; this makes it even harder to identify correctly matched states when obtaining state segmentation information. A state selection algorithm is therefore suggested for finding valid states. The proposed method, using state-level log-likelihood ratios with frame and state selection, achieves a relative reduction in equal error rate of 18.1% compared to the baseline system using simple phone-level log-likelihood ratios.

  6. Binary Detection using Multi-Hypothesis Log-Likelihood, Image Processing

    DTIC Science & Technology

    2014-03-27

[Excerpt; the front matter and introduction were garbled in extraction.] The report concerns binary detection using multi-hypothesis log-likelihood image processing; among its stated objectives is a comparison of the results with another modern image detection method.

  7. Posterior propriety for hierarchical models with log-likelihoods that have norm bounds

    SciTech Connect

    Michalak, Sarah E.; Morris, Carl N.

    2015-07-17

    Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, which is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).

  8. Posterior propriety for hierarchical models with log-likelihoods that have norm bounds

    DOE PAGES

    Michalak, Sarah E.; Morris, Carl N.

    2015-07-17

Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, which is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).

  9. Maximum likelihood estimation of the mixture of log-concave densities.

    PubMed

    Hu, Hao; Wu, Yichao; Yao, Weixin

    2016-09-01

Finite mixture models are useful tools and can be estimated via the EM algorithm. A main drawback is the strong parametric assumption about the component densities. In this paper, a much more flexible mixture model is considered, which assumes each component density to be log-concave. Under fairly general conditions, the log-concave maximum likelihood estimator (LCMLE) exists and is consistent. Numerical examples demonstrate that the LCMLE improves clustering results compared with the traditional MLE for parametric mixture models.

  10. 2. One-room log cabin (right), log root cellar (center), two-room ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

2. One-room log cabin (right), log root cellar (center), two-room log cabin (left), and post-and-beam garage (background). View to southwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  11. A Planar Approximation for the Least Reliable Bit Log-likelihood Ratio of 8-PSK Modulation

    NASA Technical Reports Server (NTRS)

    Thesling, William H.; Vanderaar, Mark J.

    1994-01-01

The optimum decoding of component codes in block coded modulation (BCM) schemes requires the use of the log-likelihood ratio (LLR) as the signal metric. An approximation to the LLR for the least reliable bit (LRB) in an 8-PSK modulation based on planar equations with fixed-point arithmetic is developed that is both accurate and easily realizable for practical BCM schemes. Through an error power analysis and an example simulation it is shown that the approximation results in 0.06 dB of degradation relative to the exact expression at an E_s/N_0 of 10 dB. It is also shown that the approximation can be realized in combinatorial logic using roughly 7300 transistors. This compares favorably to a look-up table approach in typical systems.
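
    For orientation, the exact per-bit LLR that such a planar approximation targets can be written down directly; the sketch below computes it for one received 8-PSK symbol. The planar fixed-point form from the paper is not reproduced, and the Gray bit mapping and noise level are assumptions of mine.

```python
import numpy as np

PHASES = np.exp(1j * (2 * np.pi * np.arange(8) / 8))          # 8-PSK constellation
GRAY = np.array([0b000, 0b001, 0b011, 0b010, 0b110, 0b111, 0b101, 0b100])  # assumed mapping

def bit_llrs(r, noise_var):
    """Exact LLRs of the three bits of one received 8-PSK symbol r (complex baseband)."""
    metrics = np.exp(-np.abs(r - PHASES) ** 2 / noise_var)
    llrs = []
    for bit in range(3):
        mask = (GRAY >> bit) & 1
        llrs.append(np.log(metrics[mask == 1].sum() / metrics[mask == 0].sum()))
    return llrs

rng = np.random.default_rng(3)
s = PHASES[5]
r = s + rng.normal(scale=0.2) + 1j * rng.normal(scale=0.2)    # noisy received symbol
print(bit_llrs(r, noise_var=2 * 0.2 ** 2))
```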

  12. A log-likelihood-gain intensity target for crystallographic phasing that accounts for experimental error

    PubMed Central

    Read, Randy J.; McCoy, Airlie J.

    2016-01-01

The crystallographic diffraction experiment measures Bragg intensities; crystallographic electron-density maps and other crystallographic calculations in phasing require structure-factor amplitudes. If data were measured with no errors, the structure-factor amplitudes would be trivially proportional to the square roots of the intensities. When the experimental errors are large, and especially when random errors yield negative net intensities, the conversion of intensities and their error estimates into amplitudes and associated error estimates becomes nontrivial. Although this problem has been addressed intermittently in the history of crystallographic phasing, current approaches to accounting for experimental errors in macromolecular crystallography have numerous significant defects. These have been addressed with the formulation of LLGI, a log-likelihood-gain function in terms of the Bragg intensities and their associated experimental error estimates. LLGI has the correct asymptotic behaviour for data with large experimental error, appropriately downweighting these reflections without introducing bias. LLGI abrogates the need for the conversion of intensity data to amplitudes, which is usually performed with the French and Wilson method [French & Wilson (1978), Acta Cryst. A35, 517–525], wherever likelihood target functions are required. It has general applicability for a wide variety of algorithms in macromolecular crystallography, including scaling, characterizing anisotropy and translational noncrystallographic symmetry, detecting outliers, experimental phasing, molecular replacement and refinement. Because it is impossible to reliably recover the original intensity data from amplitudes, it is suggested that crystallographers should always deposit the intensity data in the Protein Data Bank. PMID:26960124

  13. Limit Distribution Theory for Maximum Likelihood Estimation of a Log-Concave Density.

    PubMed

    Balabdaoui, Fadoua; Rufibach, Kaspar; Wellner, Jon A

    2009-06-01

We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, i.e. a density of the form f_0 = exp(φ_0), where φ_0 is a concave function on R. Existence, form, characterizations and uniform rates of convergence of the MLE are given by Rufibach (2006) and Dümbgen and Rufibach (2007). The characterization of the log-concave MLE in terms of distribution functions is the same (up to sign) as the characterization of the least squares estimator of a convex density on [0, ∞) as studied by Groeneboom, Jongbloed and Wellner (2001b). We use this connection to show that the limiting distributions of the MLE and its derivative are, under comparable smoothness assumptions, the same (up to sign) as in the convex density estimation problem. In particular, changing the smoothness assumptions of Groeneboom, Jongbloed and Wellner (2001b) slightly by allowing some higher derivatives to vanish at the point of interest, we find that the pointwise limiting distributions depend on the second and third derivatives at 0 of H_k, the "lower invelope" of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of φ_0 = log f_0 at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode M(f_0) and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values.

  14. Bayesian penalized log-likelihood ratio approach for dose response clinical trial studies.

    PubMed

    Tang, Yuanyuan; Cai, Chunyan; Sun, Liangrui; He, Jianghua

    2017-02-13

In the literature, there are a few unified approaches to testing proof of concept and estimating a target dose, including the multiple comparison procedure using a modeling approach, and the permutation approach proposed by Klingenberg. We discuss and compare the operating characteristics of these unified approaches and further develop an alternative approach in a Bayesian framework based on the posterior distribution of a penalized log-likelihood ratio test statistic. Our Bayesian approach is much more flexible in handling linear or nonlinear dose-response relationships and is more efficient than the permutation approach. Its operating characteristics are comparable to, and sometimes better than, both approaches across a wide range of dose-response relationships. It yields credible intervals as well as a predictive distribution for the response rate at a specific dose level for target dose estimation. Our Bayesian approach can be easily extended to continuous, categorical, and time-to-event responses. We illustrate the performance of our proposed method with extensive simulations and Phase II clinical trial data examples.

  15. Standardized likelihood ratio test for comparing several log-normal means and confidence interval for the common mean.

    PubMed

    Krishnamoorthy, K; Oral, Evrim

    2015-11-26

A standardized likelihood ratio test (SLRT) for testing the equality of means of several log-normal distributions is proposed. The properties of the SLRT, an available modified likelihood ratio test (MLRT), and a generalized variable (GV) test are evaluated by Monte Carlo simulation and compared. Evaluation studies indicate that the SLRT is accurate even for small samples, whereas the MLRT can be quite liberal for some parameter values, and the GV test is in general conservative and less powerful than the SLRT. Furthermore, a closed-form approximate confidence interval for the common mean of several log-normal distributions is developed using the method of variance estimate recovery, and compared with the generalized confidence interval with respect to coverage probabilities and precision. Simulation studies indicate that the proposed confidence interval is accurate and better than the generalized confidence interval in terms of coverage probabilities. The methods are illustrated using two examples.

  16. Segmentation of diesel spray images with log-likelihood ratio test algorithm for non-Gaussian distributions.

    PubMed

    Pastor, José V; Arrègle, Jean; García, José M; Zapata, L Daniel

    2007-02-20

    A methodology for processing images of diesel sprays under different experimental situations is presented. The new approach has been developed for cases where the background does not follow a Gaussian distribution but a positive bias appears. In such cases, the lognormal and the gamma probability density functions have been considered for the background digital level distributions. Two different algorithms have been compared with the standard log-likelihood ratio test (LRT): a threshold defined from the cumulative probability density function of the background shows a sensitive improvement, but the best results are obtained with modified versions of the LRT algorithm adapted to non-Gaussian cases.
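
    A toy illustration of the kind of likelihood-ratio classification described above (the distributions, parameters, and Gaussian spray model are my assumptions, not the paper's calibrated models): each pixel is assigned to spray or background according to the log-likelihood ratio between a gamma-distributed background model and a brighter spray model.

```python
import numpy as np
from scipy import stats

def lrt_segment(img, bg_shape, bg_scale, spray_mean, spray_std, threshold=0.0):
    ll_bg = stats.gamma.logpdf(img, a=bg_shape, scale=bg_scale)      # background model
    ll_spray = stats.norm.logpdf(img, loc=spray_mean, scale=spray_std)  # spray model
    return (ll_spray - ll_bg) > threshold        # True where spray is more likely

rng = np.random.default_rng(4)
background = rng.gamma(shape=2.0, scale=5.0, size=(64, 64))
img = background.copy()
img[20:40, 20:40] += 80.0                        # synthetic spray region
mask = lrt_segment(img, bg_shape=2.0, bg_scale=5.0, spray_mean=100.0, spray_std=20.0)
print(mask[30, 30], mask[5, 5])                  # typically True, False
```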

  17. 2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

  18. log(M_Pl/m_3/2)

    SciTech Connect

    Loaiza-Brito, Oscar; Martin, Johannes; Nilles, Hans Peter; Ratz, Michael

    2005-12-02

Flux compactifications of string theory seem to require the presence of a fine-tuned constant in the superpotential. We discuss a scheme where this constant is replaced by a dynamical quantity which we argue to be a 'continuous Chern-Simons term'. In such a scheme, the gaugino condensate generates the hierarchically small scale of supersymmetry breakdown rather than adjusting its size to a constant. A crucial ingredient is the appearance of the hierarchically small quantity exp(-) which corresponds to the scale of gaugino condensation. Under rather general circumstances, this leads to a scenario of moduli stabilization, which is endowed with a hierarchy between the mass of the lightest modulus, the gravitino mass and the scale of the soft terms, m_modulus ≈ m_3/2 ≈ 2 m_soft. The 'little hierarchy' is given by the logarithm of the ratio of the Planck scale and the gravitino mass, ≈ log(M_Pl/m_3/2) ≈ 4π². This exhibits a new mediation scheme of supersymmetry breakdown, called mirage mediation. We highlight the special properties of the scheme, and their consequences for phenomenology and cosmology.

  19. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  20. Continuous Tamper-proof Logging using TPM2.0

    DTIC Science & Technology

    2014-06-16

Continuous Tamper-proof Logging using TPM2.0. Paul England, Limin Jia, James Lorch, and Arunesh Sinha. July 9, 2013 (revised June 16, 2014). [Report documentation page fields omitted; abstract excerpt truncated in extraction:] Auditing system logs is an important means of ensuring systems' security in situations where run-time...

  1. 2D Log-Gabor Wavelet Based Action Recognition

    NASA Astrophysics Data System (ADS)

    Li, Ning; Xu, De

    The frequency response of log-Gabor function matches well the frequency response of primate visual neurons. In this letter, motion-salient regions are extracted based on the 2D log-Gabor wavelet transform of the spatio-temporal form of actions. A supervised classification technique is then used to classify the actions. The proposed method is robust to the irregular segmentation of actors. Moreover, the 2D log-Gabor wavelet permits more compact representation of actions than the recent neurobiological models using Gabor wavelet.
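
    For reference, the standard radial log-Gabor transfer function the letter builds on is G(f) = exp(−(log(f/f₀))² / (2 (log σ)²)). The sketch below constructs a 2D log-Gabor filter in the frequency domain and applies it to a frame; the parameter values are illustrative and are not taken from the letter.

```python
import numpy as np

def log_gabor_2d(size, f0=0.1, sigma_ratio=0.55, theta0=0.0, sigma_theta=np.pi / 8):
    fy, fx = np.meshgrid(np.fft.fftfreq(size), np.fft.fftfreq(size), indexing="ij")
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                                # avoid log(0) at DC
    radial = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
    radial[0, 0] = 0.0                                # zero DC response
    angle = np.arctan2(fy, fx)
    dtheta = np.arctan2(np.sin(angle - theta0), np.cos(angle - theta0))
    angular = np.exp(-(dtheta ** 2) / (2 * sigma_theta ** 2))
    return radial * angular                           # frequency-domain transfer function

rng = np.random.default_rng(5)
frame = rng.standard_normal((128, 128))               # stand-in for one spatio-temporal slice
response = np.fft.ifft2(np.fft.fft2(frame) * log_gabor_2d(128))
print(np.abs(response).mean())
```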

  2. Core log: Valles caldera No. 2A, New Mexico

    SciTech Connect

    Starguist, V.L.

    1988-01-01

Scientific core hole VC-2A was drilled into the western ring-fracture zone at Sulphur Springs in the Valles caldera, New Mexico. VC-2A, the second scientific core hole in the caldera, was cored through a faulted and brecciated sequence of intracauldron tuffs and volcaniclastic rocks to a depth of 528 m. As of November 1, 1986, the unequilibrated bottom-hole temperature was 212 °C. The rocks penetrated are intensely altered and host sub-ore grade stockwork molybdenite mineralization between 25 and 125 m. This report contains a detailed core log to aid researchers in their studies of the Valles caldera magma hydrothermal system. 3 refs., 2 figs.

  3. A complete X-ray sample of the high latitude sky from HEAO-1 A-2: log N lo S and luminosity functions

    NASA Technical Reports Server (NTRS)

    Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.

    1981-01-01

A complete X-ray survey was performed over 8.2 steradians of the sky at galactic latitudes |b| > 20 deg, down to a limiting sensitivity of 3.1 × 10^-11 ergs/(cm^2 s) in the 2-10 keV band. Of the 85 detected sources, 17 were identified with galactic objects, 61 were identified with extragalactic objects, and 7 remain unidentified. The log N - log S relation for the non-galactic objects is well fit by the Euclidean relationship. The X-ray spectra of these objects were used to construct log N - log S in physical units. The complete sample of identified sources was used to construct X-ray luminosity functions, using the absolute maximum likelihood method, for clusters of galaxies and active galactic nuclei.

  4. Functional conservation between mammalian MGRN1 and plant LOG2 ubiquitin ligases.

    PubMed

    Guerra, Damian D; Pratelli, Réjane; Kraft, Edward; Callis, Judy; Pilot, Guillaume

    2013-11-01

    Plant LOSS OF GDU 2 (LOG2) and Mammalian Mahogunin Ring Finger 1 (MGRN1) proteins are RING-type E3 ligases sharing similarity N-terminal to the RING domain. Deletion of this region disrupts the interaction of LOG2 with the plant membrane protein GLUTAMINE DUMPER1 (GDU1). Phylogenetic analysis identified two clades of LOG2/MGRN1-like proteins in vertebrates and plants. The ability of MGRN1 to functionally replace LOG2 was tested. MGRN1 ubiquitylates GDU1 in vitro and can partially substitute for LOG2 in the plant, partially restoring amino acid resistance to a GDU1-myc over-expression, log2-2 background. Altogether, these results suggest a conserved function for the N-terminal domain in evolution.

  5. Assessment of geophysical logs from borehole USW G-2, Yucca Mountain, Nevada

    SciTech Connect

    Nelson, P.H.; Schimschal, U.

    1993-05-01

Commercial logging contractors Western Atlas, Schlumberger, and Edcon obtained borehole geophysical logs at the site of a potential high-level nuclear waste repository at Yucca Mountain, Nevada. Drill hole USW-G2 was picked for this test of suitable logging tools and logging technology, both representing state-of-the-art technology from these commercial companies. Experience gained from analysis of existing core data and a variety of logs obtained earlier by Birdwell and Dresser Atlas served as a guide in choosing the logs to be obtained. Logs were obtained in a water-filled borehole in zeolitized tuff (saturated zone) and in an air-filled borehole, largely in unaltered welded tuff (unsaturated zone).

  6. 2-Step Maximum Likelihood Channel Estimation for Multicode DS-CDMA with Frequency-Domain Equalization

    NASA Astrophysics Data System (ADS)

    Kojima, Yohei; Takeda, Kazuaki; Adachi, Fumiyuki

    Frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion can provide better downlink bit error rate (BER) performance of direct sequence code division multiple access (DS-CDMA) than the conventional rake combining in a frequency-selective fading channel. FDE requires accurate channel estimation. In this paper, we propose a new 2-step maximum likelihood channel estimation (MLCE) for DS-CDMA with FDE in a very slow frequency-selective fading environment. The 1st step uses the conventional pilot-assisted MMSE-CE and the 2nd step carries out the MLCE using decision feedback from the 1st step. The BER performance improvement achieved by 2-step MLCE over pilot assisted MMSE-CE is confirmed by computer simulation.
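
    The sketch below shows only the MMSE-FDE component that the proposed 2-step MLCE supports (per-subcarrier weights W_k = H_k* / (|H_k|² + σ²/E_s)); the pilot-assisted channel estimation and decision-feedback refinement themselves are not reproduced, and the channel taps, block length, and noise level are assumptions.

```python
import numpy as np

def mmse_fde(received_block, channel_freq, noise_to_signal):
    R = np.fft.fft(received_block)
    W = np.conj(channel_freq) / (np.abs(channel_freq) ** 2 + noise_to_signal)  # MMSE weights
    return np.fft.ifft(W * R)                 # equalized time-domain block

rng = np.random.default_rng(6)
Nc = 64
symbols = rng.choice([-1.0, 1.0], size=Nc)             # BPSK block
h = np.array([0.8, 0.5, 0.3])                           # assumed 3-tap channel
H = np.fft.fft(h, Nc)
noise = 0.05 * (rng.standard_normal(Nc) + 1j * rng.standard_normal(Nc))
received = np.fft.ifft(H * np.fft.fft(symbols)) + noise  # cyclic-prefix channel model
equalized = mmse_fde(received, H, noise_to_signal=2 * 0.05 ** 2)
print(np.mean(np.sign(equalized.real) == symbols))       # close to 1.0 at this SNR
```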

  7. Logging Report 1 – Temperature/Gamma Ray/Collar Locator Log, EE-2A, 2450 psig Shut-in Log, August 8, 1991

    SciTech Connect

    Dreesen, Donald S.; Anderson, David W.

    1991-08-14

    Oil Well Perforators (OWP) was mobilized on August 5th and arrived on August 6th. OWP's grease injection head was installed on the LANL wireline and the wireline was re-headed with a LANL cable head. OWP provided the following pressure control equipment: a 40 ft long 5-1/2" API 5000 psig WP (working pressure) casing lubricator, a 5-1/2" API 3000 psig WP wireline blow-out-preventer, a 15000 psig WP grease injection head with 0.425-0.430" flow tubes and a grease injection system, and the 15000 psig WP high pressure hoses needed to connect the grease system to the grease head. The grease head included three flow sections, two active and one reserve, and two injection hoses were run to the head. All of the instruments operated throughout the logging operation. There were no problems with lowering or raising the tools. The pressure control equipment performed well and leakage at the grease head was insignificant.

  8. Geomechanical Reservoir Characterization for Prediction of the Likelihood of Faulting Induced by CO2 Sequestration

    NASA Astrophysics Data System (ADS)

    Chiaramonte, L.; Zoback, M.; Friedmann, J.; Stamp, V.

    2006-12-01

Geologic sequestration of CO2 has been proposed as a key technology for reducing greenhouse gas in the atmosphere. However, leakage of CO2 from any potential reservoir could seriously compromise the goal of long-term storage. Therefore, understanding the likelihood of leakage is a key step toward the feasibility of this technology. Because it has long been known that fluid injection can lead to fault reactivation, assessing the potential of induced fault slip and leakage of CO2 from any potential storage reservoir is critical prior to injection of large quantities of CO2. We have developed a geomechanical model in advance of a proposed injection experiment at Teapot Dome, WY in order to study in detail the processes of a CO2 leak through predicted faults. Teapot Dome is an elongated asymmetrical, basement-cored anticline with a north-northeast axis. It is part of the Salt Creek structural trend, located in the southwestern edge of the Powder River Basin. The anticline is compartmentalized by a series of oblique-slip faults, generally oriented NE-SW, although varying in geometry, displacement, and complexity. In particular, the fault network referred to as S2 shows a greater complexity and a range of geometries and azimuths that enhances its potential for leakage. Furthermore, its surface expression shows alkali springs and hydrocarbon samples within the fault veins and gouge. The S2 fault intersects the oil-bearing 2nd Wall Creek around 650 m, which is close in terms of temperature and pressure to conditions of supercritical CO2 phase injection and oil miscibility. We developed a preliminary geomechanical model of the deeper Tensleep Fm. previously, utilizing observations of wellbore breakouts, drilling induced fractures, leak-off or mini-frac tests and other available geophysical data. We used this information to evaluate the potential for injection to induce slip on the S1 fault, approximately 2 km south of the present study area. These results are compared and

  9. Predicting the likelihood of an isocitrate dehydrogenase 1 or 2 mutation in diagnoses of infiltrative glioma

    PubMed Central

    Chen, Li; Voronovich, Zoya; Clark, Kenneth; Hands, Isaac; Mannas, Jonathan; Walsh, Meggen; Nikiforova, Marina N.; Durbin, Eric B.; Weiss, Heidi; Horbinski, Craig

    2014-01-01

    Background Several variables are associated with the likelihood of isocitrate dehydrogenase 1 or 2 (IDH1/2) mutation in gliomas, though no guidelines yet exist for when testing is warranted, especially when an R132H IDH1 immunostain is negative. Methods A cohort of 89 patients was used to build IDH1/2 mutation prediction models in World Health Organization grades II–IV gliomas, and an external cohort of 100 patients was used for validation. Logistic regression and backward model selection with the Akaike information criterion were used to develop prediction models. Results A multivariable model, incorporating patient age, glioblastoma multiforme diagnosis, and prior history of grade II or III glioma, was developed to predict IDH1/2 mutation probability. This model generated an area under the curve (AUC) of 0.934 (95% CI: 0.878, 0.978) in the external validation cohort and 0.941 (95% CI: 0.918, 0.962) in the cohort of The Cancer Genome Atlas. When R132H IDH1 immunostain information was added, AUC increased to 0.986 (95% CI: 0.967, 0.998). This model had an AUC of 0.947 (95% CI: 0.891, 0.995) in predicting whether an R132H IDH1 immunonegative case harbored a less common IDH1 or IDH2 mutation. The models were also 94% accurate in predicting IDH1/2 mutation status in gliomas from The Cancer Genome Atlas. An interactive web-based application for calculating the probability of an IDH1/2 mutation is now available using these models. Conclusions We have integrated multiple variables to generate a probability of an IDH1/2 mutation. The associated web-based application can help triage diffuse gliomas that would benefit from mutation testing in both clinical and research settings. PMID:24860178

  10. In situ fluid typing and quantification with 1D and 2D NMR logging.

    PubMed

    Sun, Boqin

    2007-05-01

In situ nuclear magnetic resonance (NMR) fluid typing has recently gained momentum due to data acquisition and inversion algorithm enhancement of NMR logging tools. T2 distributions derived from NMR logging contain information on bulk fluids and pore size distributions. However, the accuracy of fluid typing is greatly overshadowed by the overlap between T2 peaks arising from different fluids with similar apparent T2 relaxation times. Nevertheless, the shapes of T2 distributions from different fluid components are often different and can be predetermined. Inversion with predetermined T2 distributions allows us to perform fluid component decomposition to yield individual fluid volume ratios. Another effective method for in situ fluid typing is two-dimensional (2D) NMR logging, which results in proton population distribution as a function of T2 relaxation time and fluid diffusion coefficient (or T1 relaxation time). Since diffusion coefficients (or T1 relaxation time) for different fluid components can be very different, it is relatively easy to separate oil (especially heavy oil) from water signal in a 2D NMR map and to perform accurate fluid typing. Combining NMR logging with resistivity and/or neutron/density logs provides a third method for in situ fluid typing. We shall describe these techniques with field examples.
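
    A hedged sketch of the fluid-component decomposition idea: given predetermined T2 distribution shapes for two fluids, recover their volume fractions from a measured T2 distribution by non-negative least squares. The log-normal component shapes and T2 values here are illustrative assumptions, not tool-specific models.

```python
import numpy as np
from scipy.optimize import nnls

t2 = np.logspace(-3, 1, 100)                         # T2 grid, seconds

def lognorm_shape(center, width):
    s = np.exp(-(np.log(t2 / center) ** 2) / (2 * width ** 2))
    return s / s.sum()

water = lognorm_shape(0.05, 0.5)                     # fast-relaxing component (assumed)
oil = lognorm_shape(1.0, 0.4)                        # slow-relaxing component (assumed)
measured = 0.7 * water + 0.3 * oil + 0.002 * np.random.default_rng(7).standard_normal(100)
fractions, _ = nnls(np.column_stack([water, oil]), measured)   # non-negative volume fractions
print(fractions)                                     # approximately [0.7, 0.3]
```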

  11. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, 2

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Walker, H. F.

    1976-01-01

    The problem of obtaining numerically maximum likelihood estimates of the parameters for a mixture of normal distributions is addressed. In recent literature, a certain successive approximations procedure, based on the likelihood equations, is shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. With probability 1 as the sample size grows large, it is shown that this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.
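
    A minimal sketch of the idea, not the paper's exact algorithm: the step-size-1 procedure is the familiar EM-style fixed-point update for a normal mixture, and a step-size ω simply scales the move toward the fixed-point value, with ω = 1 recovering ordinary EM. Applying the relaxation directly to the full parameter vector, as done here, is my simplification.

```python
import numpy as np

def em_step(x, w, mu, var):
    """One fixed-point (EM) update for a two-component normal mixture."""
    p1 = w * np.exp(-(x - mu[0]) ** 2 / (2 * var[0])) / np.sqrt(var[0])
    p2 = (1 - w) * np.exp(-(x - mu[1]) ** 2 / (2 * var[1])) / np.sqrt(var[1])
    r = p1 / (p1 + p2)                                   # responsibilities
    w_new = r.mean()
    mu_new = np.array([np.sum(r * x) / r.sum(), np.sum((1 - r) * x) / (1 - r).sum()])
    var_new = np.array([np.sum(r * (x - mu_new[0]) ** 2) / r.sum(),
                        np.sum((1 - r) * (x - mu_new[1]) ** 2) / (1 - r).sum()])
    return w_new, mu_new, var_new

rng = np.random.default_rng(8)
x = np.concatenate([rng.normal(-2, 1, 3000), rng.normal(2, 1, 7000)])
w, mu, var, omega = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0]), 1.5
for _ in range(50):                                      # relaxed iteration, 0 < omega < 2
    w_s, mu_s, var_s = em_step(x, w, mu, var)
    w, mu, var = w + omega * (w_s - w), mu + omega * (mu_s - mu), var + omega * (var_s - var)
print(w, mu, var)                                        # approx 0.3, [-2, 2], [1, 1]
```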

  12. Optimization of a Nucleic Acids united-RESidue 2-Point model (NARES-2P) with a maximum-likelihood approach

    SciTech Connect

    He, Yi; Scheraga, Harold A.; Liwo, Adam

    2015-12-28

    Coarse-grained models are useful tools to investigate the structural and thermodynamic properties of biomolecules. They are obtained by merging several atoms into one interaction site. Such simplified models try to capture as much as possible information of the original biomolecular system in all-atom representation but the resulting parameters of these coarse-grained force fields still need further optimization. In this paper, a force field optimization method, which is based on maximum-likelihood fitting of the simulated to the experimental conformational ensembles and least-squares fitting of the simulated to the experimental heat-capacity curves, is applied to optimize the Nucleic Acid united-RESidue 2-point (NARES-2P) model for coarse-grained simulations of nucleic acids recently developed in our laboratory. The optimized NARES-2P force field reproduces the structural and thermodynamic data of small DNA molecules much better than the original force field.

  13. Optimization of a Nucleic Acids united-RESidue 2-Point model (NARES-2P) with a maximum-likelihood approach

    PubMed Central

    He, Yi; Liwo, Adam; Scheraga, Harold A.

    2015-01-01

    Coarse-grained models are useful tools to investigate the structural and thermodynamic properties of biomolecules. They are obtained by merging several atoms into one interaction site. Such simplified models try to capture as much as possible information of the original biomolecular system in all-atom representation but the resulting parameters of these coarse-grained force fields still need further optimization. In this paper, a force field optimization method, which is based on maximum-likelihood fitting of the simulated to the experimental conformational ensembles and least-squares fitting of the simulated to the experimental heat-capacity curves, is applied to optimize the Nucleic Acid united-RESidue 2-point (NARES-2P) model for coarse-grained simulations of nucleic acids recently developed in our laboratory. The optimized NARES-2P force field reproduces the structural and thermodynamic data of small DNA molecules much better than the original force field. PMID:26723596

  14. The phylogenetic likelihood library.

    PubMed

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter and branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. Using two likelihood-based phylogenetic codes as examples, we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL).

  15. Approximate likelihood for large irregularly spaced spatial data

    PubMed Central

    Fuentes, Montserrat

    2008-01-01

Likelihood approaches for large irregularly spaced spatial datasets are often very difficult, if not infeasible, to implement due to computational limitations. Even when we can assume normality, exact calculations of the likelihood for a Gaussian spatial process observed at n locations require O(n³) operations. We present a version of Whittle's approximation to the Gaussian log likelihood for spatial regular lattices with missing values and for irregularly spaced datasets. This method requires O(n log₂ n) operations and does not involve calculating determinants. We present simulations and theoretical results to show the benefits and the performance of the spatial likelihood approximation method presented here for spatial irregularly spaced datasets and lattices with missing values. We apply these methods to estimate the spatial structure of sea surface temperatures (SST) using satellite data with missing values. PMID:19079638
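
    A one-dimensional sketch of the Whittle log-likelihood the paper builds on, −Σ_k [log f(ω_k) + I(ω_k)/f(ω_k)] over Fourier frequencies, where I is the periodogram and f the model spectral density. The paper's 2D extension with missing values is not reproduced; the AR(1) spectral density and grid search below are assumptions for illustration.

```python
import numpy as np

def whittle_loglik(x, spectral_density):
    n = len(x)
    I = np.abs(np.fft.fft(x)) ** 2 / (2 * np.pi * n)          # periodogram
    omegas = 2 * np.pi * np.arange(1, n // 2) / n              # positive Fourier frequencies
    f = spectral_density(omegas)
    return -np.sum(np.log(f) + I[1:n // 2] / f)

def ar1_spectrum(phi, sigma2):
    return lambda w: sigma2 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * w)) ** 2)

rng = np.random.default_rng(9)
n, phi_true = 4096, 0.6
x = np.zeros(n)
for t in range(1, n):                                          # simulate an AR(1) series
    x[t] = phi_true * x[t - 1] + rng.standard_normal()
grid = np.linspace(0.1, 0.9, 81)
lls = [whittle_loglik(x, ar1_spectrum(p, 1.0)) for p in grid]
print(grid[int(np.argmax(lls))])                               # maximizer near 0.6
```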

  16. Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods

    PubMed Central

    McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando

    2010-01-01

    Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper the data is two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown and whether or not the first Lamé parameter, λ, is known the pressure p = λ∇ · u which is present in the plane strain model cannot be measured and is unreliably computed from measured data and can be shown to be an order one quantity in the units kPa. So here we present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20] which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349

  17. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

43 Public Lands: Interior, Part 2 (2014-10-01 edition). Regulations Relating to Public Lands (Continued), Bureau of Land Management, Department of the Interior, Minerals Management: § 3162.2-11 How soon after I know of the likelihood of drainage must I take protective action? (a)...

  18. The Likelihood Function and Likelihood Statistics

    NASA Astrophysics Data System (ADS)

    Robinson, Edward L.

    2016-01-01

The likelihood function is a necessary component of Bayesian statistics but not of frequentist statistics. The likelihood function can, however, serve as the foundation for an attractive variant of frequentist statistics sometimes called likelihood statistics. We will first discuss the definition and meaning of the likelihood function, giving some examples of its use and abuse - most notably in the so-called prosecutor's fallacy. Maximum likelihood estimation is the aspect of likelihood statistics familiar to most people. When data points are known to have Gaussian probability distributions, maximum likelihood parameter estimation leads directly to least-squares estimation. When the data points have non-Gaussian distributions, least-squares estimation is no longer appropriate. We will show how the maximum likelihood principle leads to logical alternatives to least squares estimation for non-Gaussian distributions, taking the Poisson distribution as an example. The likelihood ratio is the ratio of the likelihoods of, for example, two hypotheses or two parameters. Likelihood ratios can be treated much like un-normalized probability distributions, greatly extending the applicability and utility of likelihood statistics. Likelihood ratios are prone to the same complexities that afflict posterior probability distributions in Bayesian statistics. We will show how meaningful information can be extracted from likelihood ratios by the Laplace approximation, by marginalizing, or by Markov chain Monte Carlo sampling.
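
    A small worked example of the Poisson point made above (my own toy data, not from the talk): fitting a linear rate model λ_i = a + b·t_i to Poisson counts by maximizing the Poisson log-likelihood versus by ordinary least squares; with low counts the two estimates differ noticeably.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)
t = np.linspace(0, 10, 40)
a_true, b_true = 0.5, 0.8
y = rng.poisson(a_true + b_true * t)

def neg_poisson_loglik(params):
    lam = params[0] + params[1] * t
    if np.any(lam <= 0):
        return np.inf
    return -np.sum(y * np.log(lam) - lam)              # drop the log(y!) constant

ml = minimize(neg_poisson_loglik, x0=[1.0, 1.0], method="Nelder-Mead").x
ls = np.polyfit(t, y, 1)[::-1]                         # least-squares (intercept, slope)
print("ML:", ml, "LS:", ls)
```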

  19. Comprehensive study of LASL Well C/T-2 Roosevelt Hot Springs KGRA, Utah, and applications to geothermal well logging

    SciTech Connect

    Glenn, W.E.; Hulen, J.B.; Nielson, D.L.

    1981-02-01

    Utah State Geothermal Well 9-1 in the Roosevelt Hot Springs KGRA, Beaver County, Utah, has been donated by Phillips Petroleum Company for calibration and testing of well-logging equipment in the hot, corrosive, geothermal environment. It is the second Calibration/Test Well (C/T-2) in the Geothermal Log Interpretation Program. A study of cuttings and well logs from Well C/T-2 was completed. This synthesis and data presentation contains most of the subsurface geologic information needed to effect the total evaluation of geophysical logs acquired in this geothermal calibration/test well, C/T-2.

  20. Risk algorithms that include pathology adjustment for HER2 amplification need to make further downward adjustments in likelihood scores.

    PubMed

    Evans, D G; Woodward, E R; Howell, S J; Verhoef, S; Howell, A; Lalloo, F

    2017-04-01

We assessed the need for adjustment in the likelihood of germline BRCA1/2 mutations in women with HER2+ breast cancers. We analysed primary mutation screens on women with breast cancer with unequivocal HER2 overexpression and assessed the likelihood of BRCA1/BRCA2 mutations by age, oestrogen receptor status and Manchester score. Of 1111 primary BRCA screens with confirmed HER2 status, only 4/161 (2.5%) of women with HER2 amplification had a BRCA1 mutation identified and 5/161 (3.1%) a BRCA2 mutation. The pathology-adjusted Manchester score yielded detection rates of only 6.5% and 15% at the 10-19% and 20%+ thresholds, respectively. BOADICEA examples appeared to make even less downward adjustment. There is a very low detection rate of BRCA1 and BRCA2 mutations in women with HER2-amplified breast cancers. The Manchester score and BOADICEA do not make sufficient downward adjustment for HER2 amplification. For unaffected women, assessment of breast cancer risk and BRCA1/2 probability should take into account the pathology of the most relevant close relative. Unaffected women undergoing mutation testing for BRCA1/2 should be advised that there is limited reassurance from a negative test result if their close relative had a HER2+ breast cancer.

  1. Log-Tool

    SciTech Connect

    Goodall, John

    2012-05-21

Log files are typically semi- or un-structured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g. intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input – it will input data from files, standard input, and syslog, (2) Parser – it will parse the log file based on regular expressions into structured data (JSON format), (3) Output – it will output structured data into commonly used formats, including Redis (a database), standard output, and syslog.
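
    A minimal sketch of the parse step described above; the regular expression, the syslog-like line format, and the field names are hypothetical and are not Log-Tool's actual expressions or JSON schema.

```python
import json
import re
import sys

# Example pattern for a syslog-like line: "Jan 12 03:04:05 host1 sshd[123]: message"
PATTERN = re.compile(
    r"(?P<timestamp>\w{3}\s+\d+\s[\d:]{8})\s(?P<host>\S+)\s"
    r"(?P<process>[^\[:]+)(?:\[(?P<pid>\d+)\])?:\s(?P<message>.*)"
)

def parse_line(line):
    """Return a structured dict for one log line, or None if it does not match."""
    m = PATTERN.match(line.strip())
    return m.groupdict() if m else None

if __name__ == "__main__":
    for raw in sys.stdin:                     # input from standard input
        record = parse_line(raw)
        if record is not None:
            print(json.dumps(record))         # structured JSON output
```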

  2. Maximum Likelihood, Profile Likelihood, and Penalized Likelihood: A Primer

    PubMed Central

    Cole, Stephen R.; Chu, Haitao; Greenland, Sander

    2014-01-01

    The method of maximum likelihood is widely used in epidemiology, yet many epidemiologists receive little or no education in the conceptual underpinnings of the approach. Here we provide a primer on maximum likelihood and some important extensions which have proven useful in epidemiologic research, and which reveal connections between maximum likelihood and Bayesian methods. For a given data set and probability model, maximum likelihood finds values of the model parameters that give the observed data the highest probability. As with all inferential statistical methods, maximum likelihood is based on an assumed model and cannot account for bias sources that are not controlled by the model or the study design. Maximum likelihood is nonetheless popular, because it is computationally straightforward and intuitive and because maximum likelihood estimators have desirable large-sample properties in the (largely fictitious) case in which the model has been correctly specified. Here, we work through an example to illustrate the mechanics of maximum likelihood estimation and indicate how improvements can be made easily with commercial software. We then describe recent extensions and generalizations which are better suited to observational health research and which should arguably replace standard maximum likelihood as the default method. PMID:24173548
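
    A small illustration of two of the ideas named in the title (my own toy example, not the primer's worked example): the maximum likelihood estimate of a normal mean with unknown variance, and a 95% profile-likelihood interval obtained by maximizing out the variance and cutting the profile at the chi-square(1) value 3.84.

```python
import numpy as np

rng = np.random.default_rng(11)
x = rng.normal(loc=2.0, scale=3.0, size=200)
n = len(x)

def profile_loglik(mu):
    sigma2_hat = np.mean((x - mu) ** 2)        # variance maximized out for this mu
    return -0.5 * n * (np.log(2 * np.pi * sigma2_hat) + 1.0)

grid = np.linspace(x.mean() - 2, x.mean() + 2, 2001)
ll = np.array([profile_loglik(m) for m in grid])
inside = 2 * (ll.max() - ll) <= 3.84           # 95% profile-likelihood region
print("MLE:", grid[ll.argmax()], "CI:", grid[inside][0], grid[inside][-1])
```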

  3. The likelihood ratio and frequency of DQ2/DQ8 haplotypes in Iranian patients with celiac disease

    PubMed Central

    Khosravi, Asghar; Mansouri, Masoume; Rostami-Nejad, Mohammad; Shahbazkhani, Bijan; Ekhlasi, Golnaz; Kalantari, Ebrahim

    2016-01-01

Aim: The aim of this study was to evaluate the likelihood ratio and frequency of DQ2 and DQ8 in Iranian patients with celiac disease (CD). Background: HLA DQ2 and HLA DQ8 are important mediators in the development of celiac disease. A few studies have evaluated the frequency of HLA DQ2 and HLA DQ8 haplotypes among the Iranian population, with low sample sizes. Patients and methods: In this cross-sectional study, to predict HLA-DQ2 and DQ8 haplotypes, 141 (73 male, 78 female) confirmed CD patients were compared to 151 healthy controls enrolled during 2013-2014. HLA DQ2/DQ8 haplotypes were determined in cases and controls using the PCR-SSP technique. Results: DQ2 and DQ8 were positive in 80% (n=111) and 49% (n=69) of CD patients and in 36% (n=61) and 13% (n=21) of the control group, respectively. Moreover, 32% (n=45) of CD patients and 5.3% (n=8) of the control group were carriers of both haplotypes. In the case group about one-third of patients (32.2%) were positive for both DQ2 and DQ8 heterodimers, while only 5.3% (n=8) of the control group were carriers. In addition, the positive likelihood ratios of DQ2 and DQ8 were 1.74 (CI: 1.4-2.1) and 2.6 (CI: 1.8-2.7), respectively. Conclusion: The results of this study showed that the frequency of DQ8 in our population is higher than that reported in European countries, but close to that found in South America and the Middle East. This suggests that the higher prevalence of the HLA DQ8 pattern in Iranian CD patients is similar to non-European patients. PMID:26744610

  4. Augmented Likelihood Image Reconstruction.

    PubMed

    Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M

    2016-01-01

    The presence of high-density objects remains an open problem in medical CT imaging. Data of projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim to reduce these artifacts by incorporating information about shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The afore-mentioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During iterations, temporally appearing artifacts are reduced with a bilateral filter and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction.

  5. Well Log ETL tool

    SciTech Connect

    Good, Jessica

    2013-08-01

    This is an executable python script which offers two different conversions for well log data: 1) Conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file. 2) Conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.

  6. Analysis of geophysical well logs obtained in the State 2-14 borehole, Salton Sea geothermal area, California

    USGS Publications Warehouse

    Paillet, Frederick L.; Morin, R.H.

    1988-01-01

    A complete suite of conventional geophysical well logs was obtained in the upper part of a 3220-m-deep borehole drilled into geothermally altered alluvial sediments on the southeastern edge of the Salton Sea. Geophysical logs obtained in the State 2-14 borehole indicate that neutron porosity, gamma-gamma, and deep-induction logs provide useful information on lithologic trends with depth. The natural gamma log contains almost continuous, high-frequency fluctuations that obscure lithologic trends and that may be related to recent radioisotope redistribution and departure from radiometric equilibrium. Acoustic transit time logs give unrealistically low in situ compressional velocities ranging from 1.8 to 3.0 km/s, whereas acoustic waveform logs indicate that sediment compressional velocities range from less than 3.0 km/s shallower than 1000 m in depth to almost 5.0 km/s at depths greater than 2000 m. Analyses indicate that most log values lie between two lithologic end points: an electrically conductive claystone with moderate neutron porosity, but no effective porosity, and an electrically nonconductive, fully cemented siltstone that has small but finite porosity. -from Authors

  7. Odds ratio for 2 × 2 tables: Mantel-Haenszel estimator, profile likelihood, and presence of surrogate responses.

    PubMed

    Banerjee, Buddhananda; Biswas, Atanu

    2014-01-01

    Use of surrogate outcome to improve the inference in biomedical problems is an area of growing interest. Here, we consider a setup where both the true and surrogate endpoints are binary and we observe all the surrogate endpoints along with a few true endpoints. In a two-treatment setup we study the surrogate-augmented Mantel-Haenszel estimator based on observations from different groups when the group effect is present. We compare the Mantel-Haenszel estimator with the one obtained by maximizing profile likelihood in a surrogate augmented setup. We observe that the performances of these estimators are very close.
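
    For reference, the plain (unaugmented) Mantel-Haenszel common odds-ratio estimator discussed above is OR_MH = Σ_i (a_i d_i / n_i) / Σ_i (b_i c_i / n_i) over strata i. The sketch below computes it for illustrative counts; the surrogate-augmented version studied in the paper is not reproduced.

```python
def mantel_haenszel_or(tables):
    """tables: iterable of (a, b, c, d) = (exposed cases, exposed controls,
    unexposed cases, unexposed controls) for each stratum."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

strata = [(12, 30, 5, 40), (20, 50, 9, 60), (7, 15, 4, 25)]   # illustrative 2x2 tables
print(mantel_haenszel_or(strata))
```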

  8. Log N-log S is inconclusive

    NASA Technical Reports Server (NTRS)

    Klebesadel, R. W.; Fenimore, E. E.; Laros, J.

    1983-01-01

    The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power law distribution, the results are not adequate at this state of observations to differentiate between a -3/2 and a -1 power law slope.

  9. Condition and fate of logged forests in the Brazilian Amazon.

    PubMed

    Asner, Gregory P; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Knapp, David E; Silva, José N M

    2006-08-22

The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km² of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16 ± 1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained.

  10. The Tumor-Log Odds of Positive Lymph Nodes-Metastasis Staging System, a Promising New Staging System for Gastric Cancer after D2 Resection in China

    PubMed Central

    Wang, Zhi-qiang; Ren, Chao; Wang, De-shen; Zhang, Dong-sheng; Luo, Hui-yan; Li, Yu-hong; Xu, Rui-hua

    2012-01-01

    Background In this study, we established a hypothetical tumor-LODDS-metastasis (TLM) and tumor-ratio-metastasis (TRM) staging system. Moreover, we compared them with the 7th edition of the American Joint Committee on Cancer tumor-nodes-metastasis (AJCC TNM) staging system in gastric cancer patients after D2 resection. Methods A total of 1000 gastric carcinoma patients receiving treatment in our center were selected for the analysis. Finally, 730 patients who received D2 resection were retrospectively studied. Patients were staged using the TLM, TRM and the 7th edition AJCC TNM systems. Survival analysis was performed with a Cox regression model. We used two parameters to compare the TNM, TRM and TLM staging systems: the −2 log likelihood and the hazard ratio. Results The cut points of the lymph node ratio (LNR) were set as 0, 0 to 0.3, 0.3 to 0.6, and 0.6 to 1.0. For the log odds of positive lymph nodes (LODDS), the cut points were set as ≤−0.5, −0.5 to 0, 0 to 0.5, and >0.5. There were significant differences in survival among patients in different LODDS classifications within each pN or LNR group. When stratified by the LODDS classifications, the prognosis was highly homologous between those in the corresponding pN or LNR classifications. Multivariate analysis showed that the TLM staging system was better than the TRM or TNM system for prognostic evaluation. Conclusions The TLM system was superior to the TRM or TNM system for prognostic assessment of gastric adenocarcinoma patients after D2 resection. PMID:22348125
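
    A minimal Python sketch of the two quantities named in the abstract, assuming the usual 0.5-corrected definition of LODDS; the node counts and the fitted log-likelihood values used in the −2 log likelihood comparison are hypothetical placeholders.

        # Sketch: log odds of positive lymph nodes (LODDS) and a -2 log likelihood
        # comparison between two fitted staging models. Counts are hypothetical;
        # the 0.5 corrections avoid log(0) when all or no nodes are positive.
        import numpy as np

        def lodds(positive, examined):
            return np.log((positive + 0.5) / (examined - positive + 0.5))

        print(lodds(4, 20))    # one patient: 4 positive of 20 examined nodes

        # Model comparison as used in the abstract: fitted to the same data, the
        # staging system giving the smaller -2 log likelihood fits the survival
        # data better. The values below are hypothetical fitted log-likelihoods.
        loglik_tlm, loglik_tnm = -512.3, -520.8
        print(-2 * loglik_tlm, -2 * loglik_tnm)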

  11. Ulysses log 1992

    NASA Technical Reports Server (NTRS)

    Perez, Raul Garcia

    1993-01-01

    The Ulysses Log tells the story of some intriguing problems that we (the Spacecraft Team) have encountered. Ulysses was launched on 6 Oct. 1990, and it made the fastest trip to Jupiter (8 Feb. 1992). It is presently going out of the ecliptic. This paper presents log entries from the following areas: (1) ingenious maneuvers; (2) telecommunication problems; and (3) surprises.

  12. CORRELATOR 5.2 - A program for interactive lithostratigraphic correlation of wireline logs

    USGS Publications Warehouse

    Olea, R.A.

    2004-01-01

    The limited radius of investigation of petrophysical measurements made in boreholes and the relatively large distances between wells result in an incomplete sensing of the subsurface through well logging. CORRELATOR is a program for estimating geological properties between logged boreholes. An initial and fundamental step is the lithostratigraphic correlation of logs in different wells. The method employed by the program closely emulates the process of visual inspection used by experienced subsurface geologists in manual correlation. Mathematically, the determination of lithostratigraphical equivalence is based on the simultaneous assessment of similarity in shale content, similarity in the patterns of vertical variation in a petrophysical property that is measured with high vertical resolution, and spatial consistency of stratigraphic relationships as determined by an expert system. Multiple additional options for processing log readings allow maximization in the extraction of information from pairs of logs per well and great flexibility in the final display of results in the form of cross sections and dip diagrams. © 2004 Elsevier Ltd. All rights reserved.

  13. Performance of the likelihood ratio difference (G2 Diff) test for detecting unidimensionality in applications of the multidimensional Rasch model.

    PubMed

    Harrell-Williams, Leigh; Wolfe, Edward W

    2014-01-01

    Previous research has investigated the influence of sample size, model misspecification, test length, ability distribution offset, and generating model on the likelihood ratio difference test in applications of item response models. This study extended that research to the evaluation of dimensionality using the multidimensional random coefficients multinomial logit model (MRCMLM). Logistic regression analysis of simulated data reveals that sample size and test length have a large effect on the capacity of the LR difference test to correctly identify unidimensionality, with shorter tests and smaller sample sizes leading to smaller Type I error rates. Higher levels of simulated misfit resulted in fewer incorrect decisions than data with no or little misfit. However, Type I error rates indicate that the likelihood ratio difference test is not suitable under any of the simulated conditions for evaluating dimensionality in applications of the MRCMLM.
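
    For readers unfamiliar with the statistic, the following hedged Python sketch shows the likelihood ratio difference (G2 diff) test in its generic form, comparing a constrained model to a more general one against a chi-square reference. The log-likelihood values and degrees of freedom are placeholders, not MRCMLM output.

        # Sketch: likelihood ratio difference test between a nested (unidimensional)
        # model and a fuller (multidimensional) model. The log-likelihoods and the
        # df difference below are hypothetical placeholders.
        from scipy.stats import chi2

        ll_unidim = -10523.4      # log-likelihood of the constrained model
        ll_multidim = -10496.1    # log-likelihood of the more general model
        df_diff = 2               # extra parameters in the multidimensional model

        g2_diff = -2.0 * (ll_unidim - ll_multidim)
        p_value = chi2.sf(g2_diff, df_diff)
        print(g2_diff, p_value)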

  14. Inductive resistivity logging in steel-cased boreholes. SBIR Phase 2 progress report

    SciTech Connect

    Wilt, M.

    1999-07-01

    SBIR Phase 2 project 40145-97-I calls for the design and construction of a prototype inductive logging device to measure formation resistivity from within a steel-cased borehole. The SCIL (Steel Casing Induction Logger) tool is intended for reservoir characterization and process monitoring in an oil field environment. This report summarizes findings from the initial project period. In this phase, bench model measurements were made to test casing compensation schemes, numerical models were calculated to optimize the tool configuration and associated formation sensitivity, and the preliminary design of the tool was completed. The bench tests constitute fundamental research on determining the characteristics of steel well casing and on developing means of separating the effects of the casing and the formation. This technology is crucial to the success of the project, and significant progress has been made towards the goal of recovering the formation resistivity from inside the casing. Next, a series of sensitivity and tool configuration studies was completed in collaboration with project partner Dr. David Alumbaugh at Sandia National Laboratories. These numerical results help to optimize the tool configuration and allow one to calculate the expected formation sensitivity. These models are preliminary to data interpretation software to be developed in the next project period. The initial hardware design of the tool has been completed, and ordering of parts has begun for later manufacture and assembly. The tool, which is designed for maximum flexibility of deployment, will have a powerful transmitter, an array of three-component sensors, and sufficient dynamic range to operate in standard oil field steel-cased boreholes.

  15. Perceiving cancer-risks and heredity-likelihood in genetic-counseling: how counselees recall and interpret BRCA 1/2-test results.

    PubMed

    Vos, J; Oosterwijk, J C; Gómez-García, E; Menko, F H; Jansen, A M; Stoel, R D; van Asperen, C J; Tibben, A; Stiggelbout, A M

    2011-03-01

    Previous studies on the counsellees' perception of DNA test results did not clarify whether counsellees were asked about their recollections or interpretations, and focused only on patients' own risks and not on the likelihood that cancer is heritable in the family. We tested differences and correlations of four perception aspects: recollections and interpretations of both cancer risks and heredity likelihood. In a retrospective study, women tested for BRCA1/2 on average 5 years earlier completed questionnaires about their perception. Participants had received an unclassified variant (n = 76), uninformative (n = 76) or pathogenic mutation (n = 51) result in BRCA1/2. Analyses included t-tests, correlations and structural equation modelling. The counsellees' perception was shown to consist of four distinct phenomena: recollections and interpretations of cancer risks and of heredity likelihood. This distinctiveness was suggested by significant differences between these perception variables. Moderate to strong correlations were found between these variables, suggesting that the differences between variables were consistent. The relationships between these variables were not influenced by the actually communicated DNA test results, sociodemographics, medical and pedigree information, or framing of cancer risk questions. The largest differences between recollections and interpretations were found in the unclassified variant group and the smallest in the uninformative group. Cancer risks and heredity likelihood correlated least in the pathogenic mutation group. Communication of ambiguous genetic information enlarged the differences. To understand the counsellees' perception of genetic counselling, researchers should study recollections and interpretations of cancer risks and heredity likelihood. Genetic counsellors should explicitly address the counsellees' recollections and interpretations, and be aware of possible inaccuracies.

  16. Electoral Outcomes and Social Log-Likelihood Maxima

    DTIC Science & Technology

    1979-10-01

  17. A new high-precision borehole-temperature logging system used at GISP2, Greenland, and Taylor Dome, Antarctica

    USGS Publications Warehouse

    Clow, G.D.; Saltus, R.W.; Waddington, E.D.

    1996-01-01

    We describe a high-precision (0.1-1.0 mK) borehole-temperature (BT) logging system developed at the United States Geological Survey (USGS) for use in remote polar regions. We discuss calibration, operational and data-processing procedures, and present an analysis of the measurement errors. The system is modular to facilitate calibration procedures and field repairs. By interchanging logging cables and temperature sensors, measurements can be made in either shallow air-filled boreholes or liquid-filled holes up to 7 km deep. Data can be acquired in either incremental or continuous-logging modes. The precision of data collected by the new logging system is high enough to detect and quantify various thermal effects at the milli-Kelvin level. To illustrate this capability, we present sample data from the 3 km deep borehole at GISP2, Greenland, and from a 130 m deep air-filled hole at Taylor Dome, Antarctica. The precision of the processed GISP2 continuous temperature logs is 0.25-0.34 mK, while the accuracy is estimated to be 4.5 mK. The effects of fluid convection and the dissipation of the thermal disturbance caused by drilling the borehole are clearly visible in the data. The precision of the incremental Taylor Dome measurements varies from 0.11 to 0.32 mK, depending on the wind strength during the experiments. With this precision, we found that temperature fluctuations and multi-hour trends in the BT measurements correlate well with atmospheric-pressure changes.

  18. Addendum 2: Logs of monitor wells drilled May 1988 through December 1992

    SciTech Connect

    Stout, J.; Qualheim, B.; McPherrin, R.; Barber, K.; Hedegaard, R.; McConihe, W.; Miller, T.

    1993-11-01

    The logs in this addendum were plotted in a new format by the same software package (LOGGER by Rockware, Denver, CO) that was used in the original publication. The scale remains the same, 1 inch = 15 feet. The header is fully automated, with a subheading indexing the well-construction symbols. Geophysical curves are labeled in their respective channels, and percentage core recovery is plotted in a histogram. Lithologic symbols are plotted to scale in a channel similar to previous logs. The lithologic description also has been automated to assure consistency in terminology. Descriptions are more extensive and are referenced by leader lines to the lithologic symbol. Additional figures included in this addendum are a plot of all the monitoring well locations at the LLNL Main site and a plot detailing the gasoline spill area well locations in the vicinity of Building 403.

  19. Multifactorial Likelihood Assessment of BRCA1 and BRCA2 Missense Variants Confirms That BRCA1:c.122A>G(p.His41Arg) Is a Pathogenic Mutation

    PubMed Central

    Whiley, Phillip J.; Parsons, Michael T.; Leary, Jennifer; Tucker, Kathy; Warwick, Linda; Dopita, Belinda; Thorne, Heather; Lakhani, Sunil R.; Goldgar, David E.; Brown, Melissa A.; Spurdle, Amanda B.

    2014-01-01

    Rare exonic, non-truncating variants in known cancer susceptibility genes such as BRCA1 and BRCA2 are problematic for genetic counseling and clinical management of relevant families. This study used multifactorial likelihood analysis and/or bioinformatically-directed mRNA assays to assess pathogenicity of 19 BRCA1 or BRCA2 variants identified following patient referral to clinical genetic services. Two variants were considered to be pathogenic (Class 5). BRCA1:c.4484G>C(p.Arg1495Thr) was shown to result in aberrant mRNA transcripts predicted to encode truncated proteins. The BRCA1:c.122A>G(p.His41Arg) RING-domain variant was found from multifactorial likelihood analysis to have a posterior probability of pathogenicity of 0.995, a result consistent with existing protein functional assay data indicating lost BARD1 binding and ubiquitin ligase activity. Of the remaining variants, seven were determined to be not clinically significant (Class 1), nine were likely not pathogenic (Class 2), and one was uncertain (Class 3). These results have implications for genetic counseling and medical management of families carrying these specific variants. They also provide additional multifactorial likelihood variant classifications as reference to evaluate the sensitivity and specificity of bioinformatic prediction tools and/or functional assay data in future studies. PMID:24489791
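
    A hedged sketch of the Bayesian combination that underlies a multifactorial likelihood classification: a prior probability of pathogenicity is updated by the product of independent likelihood ratios. The prior and the component likelihood ratios below are hypothetical and are not taken from the study.

        # Sketch: combining a prior probability of pathogenicity with independent
        # likelihood ratios (e.g., segregation, co-occurrence, pathology) to obtain
        # a posterior probability. All numbers below are hypothetical.

        def posterior_pathogenicity(prior, likelihood_ratios):
            lr = 1.0
            for component in likelihood_ratios:
                lr *= component
            prior_odds = prior / (1.0 - prior)
            posterior_odds = prior_odds * lr
            return posterior_odds / (1.0 + posterior_odds)

        print(posterior_pathogenicity(0.1, [12.0, 3.5, 5.0]))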

  20. Maximum Likelihood and Bayesian Parameter Estimation in Item Response Theory.

    ERIC Educational Resources Information Center

    Lord, Frederic M.

    There are currently three main approaches to parameter estimation in item response theory (IRT): (1) joint maximum likelihood, exemplified by LOGIST, yielding maximum likelihood estimates; (2) marginal maximum likelihood, exemplified by BILOG, yielding maximum likelihood estimates of item parameters (ability parameters can be estimated…

  1. A per-pixel Log2ADC for high dynamic range, 1000FPS digital focal plane arrays (DFPA)

    NASA Astrophysics Data System (ADS)

    Petilli, Eugene

    2016-09-01

    Intrinsix has developed a Digital Focal Plane Array (DFPA) architecture based on a novel piecewise-linear Log2 ADC (LADC) with "lossless" analog compression, which enables ultra-high dynamic range ROICs that use less power than other extended dynamic range technologies. The LADC provides a dynamic range of 126 dB with a constant 75 dB SNR over the entire frame. The companding 13-bit mantissa, 3-bit radix per-pixel LADCs compress the 21-bit signals into efficient 16-bit data words. The Read Out IC (ROIC) is compatible with most IR and LWIR detectors, including two-color SLS (photodiode) detectors and microbolometers. The DFPA architecture leverages two (staggered frame prime and redundant) MIPI CSI-3 interfaces to achieve full HD DFPA at 1000 frames/sec, an equivalent uncompressed data rate of 100 Gb/sec. The LADC uses direct injection into a moderately sized integrating capacitor, and several comparators create a stream of multi-bit data values. These values are accumulated in an SRAM-based log2ALU, and the radix of the ALU is combined with the data to generate a feedback current to the integrating capacitor, closing the delta loop. The integration time and a single-pole low-pass IIR filter are configurable using control signals to the log2ALU. The feedback current is at least partially generated using PWM for high linearity.
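
    The following Python sketch shows one plausible piecewise-linear log2 companding of a wide-range count into a 3-bit exponent plus 13-bit mantissa. It illustrates the general idea of constant relative precision; it is not the Intrinsix LADC design described in the abstract.

        # Sketch of piecewise-linear log2 companding: a wide-range integer is stored
        # as a 3-bit exponent ("radix") plus a 13-bit mantissa. This is one plausible
        # encoding for illustration only.

        MANTISSA_BITS = 13
        EXP_BITS = 3

        def encode(value):
            exp = 0
            while value >= (1 << MANTISSA_BITS) and exp < (1 << EXP_BITS) - 1:
                value >>= 1          # drop one LSB per doubling of range
                exp += 1
            return exp, value

        def decode(exp, mantissa):
            return mantissa << exp   # reconstruct to within 2**exp counts

        for x in (100, 9000, 1_000_000):
            e, m = encode(x)
            print(x, (e, m), decode(e, m))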

  2. Correlation of lithologic and sonic logs from the COST No. B-2 well with seismic reflection data

    USGS Publications Warehouse

    King, K.C.

    1979-01-01

    The purpose of this study was to correlate events recorded on seismic records with changes in lithology recorded from sample descriptions from the Continental Offshore Stratigraphic Test (COST) No. B-2 well.  The well is located on the U.S. mid-Atlantic Outer Continental Shelf about 146 km east of Atlantic City, N.J. (see location map).  Lithologic data are summarized from the sample descriptions of Smith and others (1976).  Sonic travel times were read at 0.15 m intervals in the well using a long-space sonic logging tool.  Interval velocities, reflection coefficients and a synthetic seismogram were calculated from the sonic log.
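
    A hedged Python sketch of the chain described here: sonic transit times to interval velocities, reflection coefficients at layer interfaces, and a synthetic seismogram by convolution with a wavelet. The transit-time values, the constant-density simplification, and the choice of a Ricker wavelet are illustrative assumptions, not values from the COST No. B-2 well.

        # Sketch of the sonic-log workflow: interval velocities from transit times,
        # reflection coefficients at each interface, and a synthetic trace by
        # convolution with a wavelet. All numbers are illustrative.
        import numpy as np

        dt_us_per_ft = np.array([140.0, 120.0, 95.0, 90.0, 70.0])   # sonic transit times
        velocity = 1.0e6 / dt_us_per_ft * 0.3048                    # ft/s converted to m/s

        # Acoustic impedance with density held constant (illustrative simplification).
        impedance = velocity
        rc = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

        # Ricker wavelet and convolution to form a simple synthetic seismogram.
        t = np.arange(-0.05, 0.05, 0.002)
        f0 = 30.0
        ricker = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)
        synthetic = np.convolve(rc, ricker, mode="full")
        print(rc, synthetic.shape)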

  3. Grid Logging: Best Practices Guide

    SciTech Connect

    Tierney, Brian L; Tierney, Brian L; Gunter, Dan

    2008-04-01

    The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used; other custom logging APIs, or even printf, could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) consistently structured, typed log events; (2) a standard high-resolution timestamp; (3) use of logging levels and categories to separate logs by detail and purpose; (4) consistent use of global and local identifiers; and (5) use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
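
    A minimal sketch of a log event that follows the listed practices: structured, typed fields; a high-resolution UTC timestamp; a level; a global identifier; and newline-delimited text. The field names and the JSON-lines encoding are illustrative choices, not a schema mandated by the guide.

        # Minimal sketch of a structured, newline-delimited log event. Field names
        # (ts, level, event, job_id) are illustrative, not a mandated schema.
        import json, datetime, uuid

        def log_event(level, event, **fields):
            record = {
                "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                "level": level,
                "event": event,
                "job_id": fields.pop("job_id", str(uuid.uuid4())),
                **fields,
            }
            print(json.dumps(record))       # one event per line, plain text

        log_event("INFO", "transfer.start", src="gridftp://site-a/file", bytes=1048576)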

  4. WELLOG: computer software system for analyzing and plotting well log data (a user's guide to WELLOG. REV2)

    SciTech Connect

    Atwood, J.W.; Killpack, T.J.; Glenn, W.E.; Nutter, C.

    1980-11-01

    WELLOG is a software system that has been developed to plot digitized well log data in a manner suitable for analysis. Multiple logs can be plotted side by side for correlation analysis, and up to three logs can be plotted on a cross plot. Data entry, editing, and modification functions are also provided by the program. Digitizing is accomplished by a TEKTRONIX 4954 (on-line) digitizing tablet, and plotting is done on a TEKTRONIX 4014 graphics terminal, a STATOS 42 electrostatic plotter, or a CALCOMP pen plotter using a device-independent plotting system. This program (WELLOG.REV2) is not as system-dependent as the former version (WELLOG.REV1). The user must supply a program to digitize the data and supply subroutines to interface the program with the file manipulation and plotting routines of their system. One major improvement is the use of an on-line digitizing system whereby the program accesses disk files rather than reading the data from tape. In REV2 the merge file has been automated such that the file is initialized automatically upon creation and is also delete-protected. The randomly spaced data capabilities have been greatly improved, allowing averaging and cross plotting of the data. Routines have been added which allow all of the cross plots, except the Z-plot, to be printed on a line printer. Dresser Atlas' A-K plot has also been added. The program is almost completely self-contained, needing only a few interfacing and system subroutines.

  5. EthoLog 2.2: a tool for the transcription and timing of behavior observation sessions.

    PubMed

    Ottoni, E B

    2000-08-01

    EthoLog is a tool that aids in the transcription and timing of behavior observation sessions, whether experimental or naturalistic, from video/audio tapes or registered in real time. It was created with Visual Basic and runs on Windows (3.x/9x). The user types the key codes for the predefined behavioral categories, and EthoLog registers their sequence and timing and saves the resulting data in ASCII output files. A sequential analysis matrix can be generated from the sequential data. The output files may be edited, converted to plain text files for printing, or exported to a spreadsheet program, such as MS Excel, for further analyses.

  6. Detailed evaluation of gas hydrate reservoir properties using JAPEX/JNOC/GSC Mallik 2L-38 gas hydrate research well downhole well-log displays

    USGS Publications Warehouse

    Collett, T.S.

    1999-01-01

    The JAPEX/JNOC/GSC Mallik 2L-38 gas hydrate research well project was designed to investigate the occurrence of in situ natural gas hydrate in the Mallik area of the Mackenzie Delta of Canada. Because gas hydrate is unstable at surface pressure and temperature conditions, a major emphasis was placed on the downhole logging program to determine the in situ physical properties of the gas-hydrate-bearing sediments. Downhole logging tool strings deployed in the Mallik 2L-38 well included the Schlumberger Platform Express with a high resolution laterolog, Array Induction Imager Tool, Dipole Shear Sonic Imager, and a Fullbore Formation MicroImager. The downhole log data obtained from the log- and core-inferred gas-hydrate-bearing sedimentary interval (897.25-1109.5 m log depth) in the Mallik 2L-38 well are depicted in a series of well displays. Also shown are numerous reservoir parameters, including gas hydrate saturation and sediment porosity log traces, calculated from available downhole well-log and core data. The gas hydrate accumulation delineated by the Mallik 2L-38 well has been determined to contain as much as 4.15 × 10^9 m3 of gas in the 1 km2 area surrounding the drill site.

  7. Maximum Likelihood Fusion Model

    DTIC Science & Technology

    2014-08-09

    Keywords: data fusion, hypothesis testing, maximum likelihood estimation, mobile robot navigation. Figure 1.1: Illustration of mobile robotic agents; land rovers such as Pioneer robots and Segways.

  8. List-mode likelihood

    PubMed Central

    Barrett, Harrison H.; White, Timothy; Parra, Lucas C.

    2010-01-01

    As photon-counting imaging systems become more complex, there is a trend toward measuring more attributes of each individual event. In various imaging systems the attributes can include several position variables, time variables, and energies. If more than about four attributes are measured for each event, it is not practical to record the data in an image matrix. Instead it is more efficient to use a simple list where every attribute is stored for every event. It is the purpose of this paper to discuss the concept of likelihood for such list-mode data. We present expressions for list-mode likelihood with an arbitrary number of attributes per photon and for both preset counts and preset time. Maximization of this likelihood can lead to a practical reconstruction algorithm with list-mode data, but that aspect is covered in a separate paper [IEEE Trans. Med. Imaging (to be published)]. An expression for lesion detectability for list-mode data is also derived and compared with the corresponding expression for conventional binned data. PMID:9379247
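
    A hedged Python sketch of a list-mode log-likelihood for preset observation time under an inhomogeneous Poisson model: the sum of log rates over the recorded event attributes minus the total expected number of events. The one-dimensional Gaussian rate model and its parameters are stand-ins, not the paper's imaging model.

        # Sketch of a list-mode log-likelihood for preset time T under a Poisson
        # model: sum over recorded events of log rate(attributes) minus the total
        # expected number of events. The Gaussian rate model is a stand-in.
        import numpy as np

        def list_mode_loglik(events, theta, T):
            mu, sigma, amplitude = theta
            rate = amplitude * np.exp(-0.5 * ((events - mu) / sigma) ** 2)
            expected_total = amplitude * sigma * np.sqrt(2 * np.pi) * T
            return np.sum(np.log(rate)) - expected_total

        events = np.array([0.1, -0.3, 0.2, 0.05, -0.1])   # one attribute per event
        print(list_mode_loglik(events, theta=(0.0, 0.5, 4.0), T=1.0))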

  9. An O(log sup 2 N) parallel algorithm for computing the eigenvalues of a symmetric tridiagonal matrix

    NASA Technical Reports Server (NTRS)

    Swarztrauber, Paul N.

    1989-01-01

    An O(log sup 2 N) parallel algorithm is presented for computing the eigenvalues of a symmetric tridiagonal matrix using a parallel algorithm for computing the zeros of the characteristic polynomial. The method is based on a quadratic recurrence in which the characteristic polynomial is constructed on a binary tree from polynomials whose degree doubles at each level. Intervals that contain exactly one zero are determined by the zeros of polynomials at the previous level which ensures that different processors compute different zeros. The exact behavior of the polynomials at the interval endpoints is used to eliminate the usual problems induced by finite precision arithmetic.
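
    The parallel polynomial-tree construction is more than a short sketch can show. As background, the following hedged Python sketch solves the same task serially, locating eigenvalues of a symmetric tridiagonal matrix by a Sturm-type count on the characteristic-polynomial recurrence combined with bisection; the small test matrix is an illustrative assumption, and this is not the O(log^2 N) algorithm itself.

        # Serial sketch: isolate and refine eigenvalues of a symmetric tridiagonal
        # matrix (diagonal d, off-diagonal e) by counting eigenvalues below a trial
        # value and bisecting. Not the parallel tree construction of the paper.
        import numpy as np

        def count_eigs_below(d, e, x):
            # Number of eigenvalues less than x, via the Sturm-sequence recurrence.
            count, q = 0, 1.0
            for i in range(len(d)):
                q = d[i] - x - (e[i - 1] ** 2 / q if i > 0 else 0.0)
                if q == 0.0:
                    q = 1e-300
                if q < 0:
                    count += 1
            return count

        def kth_eigenvalue(d, e, k, lo, hi, tol=1e-10):
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if count_eigs_below(d, e, mid) > k:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        d = np.array([2.0, 2.0, 2.0, 2.0])
        e = np.array([-1.0, -1.0, -1.0])
        print([kth_eigenvalue(d, e, k, 0.0, 4.0) for k in range(4)])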

  10. SPT Lensing Likelihood: South Pole Telescope CMB lensing likelihood code

    NASA Astrophysics Data System (ADS)

    Feeney, Stephen M.; Peiris, Hiranya V.; Verde, Licia

    2014-11-01

    The SPT lensing likelihood code, written in Fortran90, evaluates a Gaussian likelihood of the lensing potential power spectrum, using a file from CAMB (ascl:1102.026) that contains the normalization required to produce the power spectrum the likelihood call expects.
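
    A minimal sketch of the kind of Gaussian band-power likelihood described: -2 ln L = (d - m)^T C^-1 (d - m) up to a constant. The band powers, model vector, and covariance below are hypothetical placeholders rather than SPT products.

        # Minimal sketch of a Gaussian likelihood over binned lensing band powers.
        # All vectors and the covariance are hypothetical placeholders.
        import numpy as np

        d = np.array([1.02, 0.95, 1.10])          # "observed" band powers
        m = np.array([1.00, 1.00, 1.00])          # model prediction at these bins
        C = np.diag([0.02, 0.02, 0.03]) ** 2      # band-power covariance

        resid = d - m
        chi2 = resid @ np.linalg.solve(C, resid)
        log_like = -0.5 * chi2
        print(chi2, log_like)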

  11. DALI: Derivative Approximation for LIkelihoods

    NASA Astrophysics Data System (ADS)

    Sellentin, Elena

    2015-07-01

    DALI (Derivative Approximation for LIkelihoods) is a fast approximation of non-Gaussian likelihoods. It extends the Fisher Matrix in a straightforward way and allows for a wider range of posterior shapes. The code is written in C/C++.

  12. Turbo Equalization Scheme between Partial Response Maximum Likelihood Detector and Viterbi Decoder for 2/4 Modulation Code in Holographic Data Storage Systems

    NASA Astrophysics Data System (ADS)

    Kong, Gyuyeol; Choi, Sooyong

    2012-08-01

    A turbo equalization scheme for holographic data storage (HDS) systems is proposed. The proposed turbo equalization procedure is conducted between a one-dimensional (1D) partial response maximum likelihood (PRML) detector and a joint Viterbi decoder by exchanging a priori and extrinsic information. In the joint Viterbi decoder, modulation and convolutional decoding are performed simultaneously by mapping a 2/4 modulation symbol onto the trellis of the convolutional code, which reduces the complexity of the decoding procedure and improves the decoding capability of the iterative equalization and decoding. In addition, since the channel model in HDS systems is described as a two-dimensional convolution, the 1D PRML detector operates in the vertical direction and the joint Viterbi decoder in the horizontal direction to maximize the performance gains. The simulation results show that the proposed turbo equalization scheme achieves better bit-error-rate performance as the number of iterations increases.

  13. Real Time Mud Gas Logging During Drilling of DFDP-2B

    NASA Astrophysics Data System (ADS)

    Mathewson, L. A.; Toy, V.; Menzies, C. D.; Zimmer, M.; Erzinger, J.; Niedermann, S.; Cox, S.

    2015-12-01

    The Deep Fault Drilling Project (DFDP) aims to improve our understanding of the Alpine Fault Zone, a tectonically active mature fault system in New Zealand known to rupture in large events, by deep scientific drilling. The borehole DFDP-2B approached the Alpine Fault at depth, reaching a final depth of 892 m (820 m true vertical depth). Online gas analysis (OLGA) while drilling tracked changes in the composition of gases extracted from the circulating drill mud. The composition of fluids from fault zones can provide information about their origins, flow rates and paths, fluid-rock interactions along these paths, and the permeability structure of the faulted rock mass. Apart from an atmospheric input, the gases in drilling mud derive from the pore space of rock crushed at the drill bit and from permeable layers intersected by the borehole. The rapid formation of mud wall cake seals the borehole from further fluid inflow, hence formation-derived gases enter mostly at the depth of the drill bit. OLGA analyses N2, O2, Ar, CO2, CH4, He, and H2 on a mass spectrometer; hydrocarbons CH4, C2H6, C3H8, i-C4H10, and n-C4H10 on a gas chromatograph; and Rn using a Lucas-cell detector. Gas was sampled for offline analyses of noble gas and stable isotopes to complement the OLGA dataset. The principal formation-derived gases found in drilling mud during drilling of DFDP-2 were CO2 and CH4, with smaller components of H2 and He. High radon activity is interpreted to reflect intervals of active fluid flow through highly fractured and faulted rock. The 3He/4He values in many samples were extremely air-contaminated, i.e. there was almost no excess of non-atmospheric He. The 3He/4He values measured at 236 m and 610 m, which are the only analyses with uncertainties <100%, are very similar to those measured in hot springs along the Alpine Fault, e.g. Fox River (0.64 Ra), Copland (0.42 Ra), Lower Wanganui (0.81 Ra). We will compare these data to those gathered using OLGA and discuss the

  14. Global market integration increases likelihood that a future African Green Revolution could increase crop land use and CO2 emissions

    PubMed Central

    Hertel, Thomas W.; Ramankutty, Navin; Baldos, Uris Lantz C.

    2014-01-01

    There has been a resurgence of interest in the impacts of agricultural productivity on land use and the environment. At the center of this debate is the assertion that agricultural innovation is land sparing. However, numerous case studies and global empirical studies have found little evidence of higher yields being accompanied by reduced area. We find that these studies overlook two crucial factors: estimation of a true counterfactual scenario and a tendency to adopt a regional, rather than a global, perspective. This paper introduces a general framework for analyzing the impacts of regional and global innovation on long run crop output, prices, land rents, land use, and associated CO2 emissions. In so doing, it facilitates a reconciliation of the apparently conflicting views of the impacts of agricultural productivity growth on global land use and environmental quality. Our historical analysis demonstrates that the Green Revolution in Asia, Latin America, and the Middle East was unambiguously land and emissions sparing, compared with a counterfactual world without these innovations. In contrast, we find that the environmental impacts of a prospective African Green Revolution are potentially ambiguous. We trace these divergent outcomes to relative differences between the innovating region and the rest of the world in yields, emissions efficiencies, cropland supply response, and intensification potential. Globalization of agriculture raises the potential for adverse environmental consequences. However, if sustained for several decades, an African Green Revolution will eventually become land sparing. PMID:25201962

  15. Global market integration increases likelihood that a future African Green Revolution could increase crop land use and CO2 emissions.

    PubMed

    Hertel, Thomas W; Ramankutty, Navin; Baldos, Uris Lantz C

    2014-09-23

    There has been a resurgence of interest in the impacts of agricultural productivity on land use and the environment. At the center of this debate is the assertion that agricultural innovation is land sparing. However, numerous case studies and global empirical studies have found little evidence of higher yields being accompanied by reduced area. We find that these studies overlook two crucial factors: estimation of a true counterfactual scenario and a tendency to adopt a regional, rather than a global, perspective. This paper introduces a general framework for analyzing the impacts of regional and global innovation on long run crop output, prices, land rents, land use, and associated CO2 emissions. In so doing, it facilitates a reconciliation of the apparently conflicting views of the impacts of agricultural productivity growth on global land use and environmental quality. Our historical analysis demonstrates that the Green Revolution in Asia, Latin America, and the Middle East was unambiguously land and emissions sparing, compared with a counterfactual world without these innovations. In contrast, we find that the environmental impacts of a prospective African Green Revolution are potentially ambiguous. We trace these divergent outcomes to relative differences between the innovating region and the rest of the world in yields, emissions efficiencies, cropland supply response, and intensification potential. Globalization of agriculture raises the potential for adverse environmental consequences. However, if sustained for several decades, an African Green Revolution will eventually become land sparing.

  16. Log response of ultrasonic imaging and its significance for deep mineral prospecting of scientific drilling borehole-2 in Nanling district, China

    NASA Astrophysics Data System (ADS)

    Xiao, Kun; Zou, Changchun; Xiang, Biao; Yue, Xuyuan; Zhou, Xinpeng; Li, Jianguo; Zhao, Bin

    2014-10-01

    The hole NLSD-2, one of the deepest scientific drilling projects in the metallic ore districts of China, is the second deep scientific drilling hole in the Nanling district. Its ultimate depth is 2012.12 m. The hole was continuously cored, and a variety of geophysical well logs were measured during drilling. This paper analyzes the characteristic responses of fractures and fractured zones in ultrasonic imaging log data, characterizes how fracture parameters vary with drilling depth, and then discusses what the log results indicate for the polymetallic mineralization layers. Formation fractures develop most readily at depths of 100-200 m, 600-850 m and 1450-1550 m in hole NLSD-2, and high-angle fractures are the most prominent. The strike direction of the fractures is mainly NW-SE, reflecting the orientation of the maximum horizontal principal stress. For the polymetallic mineralization layer that occurs in the fractured zone, the characteristic response of the ultrasonic imaging log is a wide dark zone, and the characteristic responses of conventional logs are high polarizability, high density, high acoustic velocity and low resistivity. All the main polymetallic mineralization layers are developed in fractures or fractured zones, and the fractures and fractured zones can be identified by the ultrasonic imaging log; thus the log results indirectly indicate the occurrence of polymetallic mineralization layers. Additionally, the relationship between the dip direction of fractures and the well deviation provides guidance for straightening the drilling hole.

  17. SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping

    NASA Technical Reports Server (NTRS)

    Cowart, Hugh S.; Scott, David W.

    2014-01-01

    A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real-time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - a log keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access, comment, and notification features similar to those found in Social Networking Systems (SNS); b) Cross-Log Communication via Social Techniques - a concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees; and c) Communications Dashboard (CommDash) - an MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012; the Cross-Log Communications idea is currently waiting for a champion to carry it forward; and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses the potential for using Cross-Log Communications in the MCC-H and/or POIC environments, and considers other ways to synergize console applications.

  18. Discriminative likelihood score weighting based on acoustic-phonetic classification for speaker identification

    NASA Astrophysics Data System (ADS)

    Suh, Youngjoo; Kim, Hoirin

    2014-12-01

    In this paper, a new discriminative likelihood score weighting technique is proposed for speaker identification. The proposed method employs a discriminative weighting of frame-level log-likelihood scores with acoustic-phonetic classification in the Gaussian mixture model (GMM)-based speaker identification. Experiments performed on the Aurora noise-corrupted TIMIT database showed that the proposed approach provides meaningful performance improvement with an overall relative error reduction of 15.8% over the maximum likelihood-based baseline GMM approach.
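
    A hedged Python sketch of weighted frame-level log-likelihood scoring against a GMM, the quantity the proposed method reweights. The per-frame weights here are uniform stand-ins for the acoustic-phonetic classification, and the features and model parameters are synthetic.

        # Sketch: GMM speaker scoring from weighted frame-level log-likelihoods.
        # Features, model parameters, and the uniform weights are synthetic.
        import numpy as np
        from scipy.stats import multivariate_normal

        def gmm_frame_loglik(frames, weights, means, covs):
            dens = sum(w * multivariate_normal.pdf(frames, mean=mu, cov=cov)
                       for w, mu, cov in zip(weights, means, covs))
            return np.log(dens)                      # one log-likelihood per frame

        rng = np.random.default_rng(1)
        frames = rng.normal(size=(50, 2))            # 50 feature frames, 2-dimensional
        model = dict(weights=[0.6, 0.4],
                     means=[np.zeros(2), np.ones(2)],
                     covs=[np.eye(2), np.eye(2)])

        frame_ll = gmm_frame_loglik(frames, **model)
        frame_weights = np.ones(50)                  # discriminative weights would go here
        score = np.sum(frame_weights * frame_ll)     # weighted utterance-level score
        print(score)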

  19. Revision to CDC's Zika Travel Notices: Minimal Likelihood for Mosquito-Borne Zika Virus Transmission at Elevations Above 2,000 Meters.

    PubMed

    Cetron, Martin

    2016-03-18

    Since May 2015, when Zika virus, a flavivirus transmitted primarily by Aedes aegypti mosquitoes, was reported in Brazil, the virus has rapidly spread across the Region of the Americas and the Caribbean. The association between maternal Zika virus infection and adverse fetal and reproductive outcomes, including microcephaly, prompted CDC to issue a Level 2 alert travel notice* for the 37 countries and U.S. territories (at the national and territorial level) that have reported recent Zika virus transmission as of March 11, 2016. In addition to mosquito bite precautions for all travelers, CDC advises that pregnant women postpone travel to affected countries and U.S. territories. Within a nation's borders, ecologic characteristics, which determine the distribution of mosquito vectors, can vary considerably. CDC conducted a spatial analysis, focusing on the probability of occurrence of Ae. aegypti, to support the demarcation for subnational travel alerts. Based on results of this analysis, travel that is limited to elevations higher than 2,000 m (6,562 ft) above sea level is considered to have minimal (approximately 1%) likelihood for mosquito-borne Zika virus transmission, even within countries reporting active transmission. Women who are pregnant should avoid travel to elevations <2,000 m in countries with active Zika virus transmission.

  20. Identification of electrofacies on the basis of well logging to determine sedimentation environment of horizon JK2 in Em-Egovskoe field (Western Siberia)

    NASA Astrophysics Data System (ADS)

    Kudryashova, L.; Belozerov, V.; Pushtakov, E.

    2015-11-01

    Well logging results are one of the ways to study the genesis of buried terrigenous rocks. To characterize the rock objectively and identify electrofacies, an integrated suite of geological and geophysical surveys is necessary. Comprehensive investigations of depositional environments based on well logging have been performed for horizon JK2 of the Tumenskoe formation in the Em-Egovskoe area, Krasnoleninskoe field (Western Siberia). The defined electrofacies were compared with the results of earlier granulometric and mineralogical analyses. Taken together, the results indicate that the investigated sediments of horizon JK2 were formed within a destructive tidal delta. Thus, objective facies prediction can only be ensured by analyzing core and well logging data comprehensively.

  1. Profile Likelihood and Incomplete Data.

    PubMed

    Zhang, Zhiwei

    2010-04-01

    According to the law of likelihood, statistical evidence is represented by likelihood functions and its strength measured by likelihood ratios. This point of view has led to a likelihood paradigm for interpreting statistical evidence, which carefully distinguishes evidence about a parameter from error probabilities and personal belief. Like other paradigms of statistics, the likelihood paradigm faces challenges when data are observed incompletely, due to non-response or censoring, for instance. Standard methods to generate likelihood functions in such circumstances generally require assumptions about the mechanism that governs the incomplete observation of data, assumptions that usually rely on external information and cannot be validated with the observed data. Without reliable external information, the use of untestable assumptions driven by convenience could potentially compromise the interpretability of the resulting likelihood as an objective representation of the observed evidence. This paper proposes a profile likelihood approach for representing and interpreting statistical evidence with incomplete data without imposing untestable assumptions. The proposed approach is based on partial identification and is illustrated with several statistical problems involving missing data or censored data. Numerical examples based on real data are presented to demonstrate the feasibility of the approach.

  2. Two-step intermediate fine mapping with likelihood ratio test statistics: applications to Problems 2 and 3 data of GAW15.

    PubMed

    Sinha, Ritwik; Luo, Yuqun

    2007-01-01

    Construction of precise confidence sets of disease gene locations after initial identification of linked regions can improve the efficiency of the ensuing fine mapping effort. We took the confidence set inference (CSI) framework, previously proposed and implemented using the Mean test statistic (CSI-Mean), and improved its efficiency substantially by using a likelihood ratio test statistic (CSI-MLS). The CSI framework requires knowledge of some disease-model-related parameters. In the absence of prior knowledge of these parameters, a two-step procedure may be employed: 1) the parameters are estimated using a coarse map of markers; 2) CSI-Mean or CSI-MLS is applied to construct the confidence sets of the disease gene locations using a finer map of markers, assuming the estimates from Step 1 for the required parameters. In this article we show that the advantages of CSI-MLS over CSI-Mean, previously demonstrated when the required parameters are known, are preserved in this two-step procedure, using both the simulated and real data contributed to Problems 2 and 3 of Genetic Analysis Workshop 15. In addition, our results suggest that microsatellite data, when available, should be used in Step 1. Also explored in detail is the effect of the absence of parental genotypes on the performance of CSI-MLS.

  3. Morphological and quantitative changes in mitochondria, plastids, and peroxisomes during the log-to-stationary transition of the growth phase in cultured tobacco BY-2 cells

    PubMed Central

    Toyooka, Kiminori; Sato, Mayuko; Wakazaki, Mayumi; Matsuoka, Ken

    2016-01-01

    We developed a wide-range and high-resolution transmission electron microscope acquisition system and obtained giga-pixel images of tobacco BY-2 cells during the log and stationary phases of cell growth. We demonstrated that the distribution and ultrastructure of compartments involved in membrane traffic (i.e., Golgi apparatus, multivesicular body, and vesicle cluster) change during the log-to-stationary transition. Mitochondria, peroxisomes, and plastids were also enumerated. Electron densities of mitochondria and peroxisomes were altered during the growth-phase shift, while their numbers were reduced by nearly half. Plastid structure dramatically changed from atypical to spherical with starch granules. Nearly the same number of plastids was observed in both log and stationary phases. These results indicate that mechanisms regulating organelle populations differ from organelle to organelle. PMID:26855065

  4. Geological well log analysis. Third ed

    SciTech Connect

    Pirson, S.J.

    1983-01-01

    Until recently, well logs have mainly been used for correlation, structural mapping, and quantitative evaluation of hydrocarbon-bearing formations. This third edition of Geologic Well Log Analysis, however, describes how well logs can be used for geological studies and mineral exploration. This is done by analyzing well logs for numerous parameters and indices of significant mineral accumulation, primarily in sediments. Contents are: SP and Eh curves as redoxomorphic logs; sedimentological studies by log curve shapes; exploration for stratigraphic traps; continuous dipmeter as a structural tool; continuous dipmeter as a sedimentation tool; paleo-facies logging and mapping; hydrogeology 1--hydrodynamics of compaction; hydrogeology 2--geostatic equilibrium; and hydrogeology 3--hydrodynamics of infiltration. Appendixes cover: a computer program for calculating the dip magnitude, azimuth, and the degree and orientation of the resistivity anisotropy; a lithology computer program for calculating the curvature of a structure; and a basic log analysis package for the HP-41CV programmable calculator.

  5. Binary Logistic Regression Analysis for Detecting Differential Item Functioning: Effectiveness of R[superscript 2] and Delta Log Odds Ratio Effect Size Measures

    ERIC Educational Resources Information Center

    Hidalgo, Mª Dolores; Gómez-Benito, Juana; Zumbo, Bruno D.

    2014-01-01

    The authors analyze the effectiveness of the R[superscript 2] and delta log odds ratio effect size measures when using logistic regression analysis to detect differential item functioning (DIF) in dichotomous items. A simulation study was carried out, and the Type I error rate and power estimates under conditions in which only statistical testing…
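
    A hedged Python sketch of the generic logistic-regression DIF procedure the study evaluates: fit nested models with and without a group term, read off the group coefficient as a log odds ratio effect, and compute the accompanying likelihood ratio statistic. The simulated data, variable names, and use of statsmodels are illustrative assumptions, not the authors' simulation design.

        # Sketch of logistic-regression DIF detection for one dichotomous item:
        # compare a model with total score only to one adding group membership
        # (uniform DIF). Data are simulated for illustration.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 1000
        group = rng.integers(0, 2, n)                  # 0 = reference, 1 = focal
        theta = rng.normal(size=n)                     # ability
        total = theta + rng.normal(scale=0.5, size=n)  # proxy: total test score
        logit = 1.2 * theta - 0.3 * group              # item response with uniform DIF
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

        X0 = sm.add_constant(np.column_stack([total]))
        X1 = sm.add_constant(np.column_stack([total, group]))
        m0 = sm.Logit(y, X0).fit(disp=0)
        m1 = sm.Logit(y, X1).fit(disp=0)

        delta_log_or = m1.params[-1]                   # group effect on the logit scale
        lr_stat = 2 * (m1.llf - m0.llf)                # likelihood ratio chi-square, 1 df
        print(delta_log_or, lr_stat)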

  6. Log processing systems

    SciTech Connect

    Bowlin, W.P.; Kneer, M.P.; Ballance, J.D.

    1989-11-07

    This patent describes an improvement in a computer-controlled processing system for lumber production. It comprises a computer and a sequence of processing stations for processing a log segment, including: an excess-material-removing station for generating opposed flat side surfaces on the log segment, the flat side surfaces being determined by the computer to become sides of boards to be severed from the log segment; a profiling station for forming profiled edges above and below the flat side surfaces to become the side edges of the boards to be severed from the log segment; and a severing station for severing the boards from the log segment. A conveyance means establishes a path of conveyance and maintains continuous control of the log segment while conveying it along the path and through the above-defined sequence of processing stations.

  7. Porosity Log Prediction Using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Dwi Saputro, Oki; Lazuardi Maulana, Zulfikar; Dzar Eljabbar Latief, Fourier

    2016-08-01

    Well logging is important in oil and gas exploration. Many physical parameters of a reservoir are derived from well logging measurements. Geophysicists often use well logging to obtain reservoir properties such as porosity, water saturation and permeability. These measurements are often considered expensive. One method to substitute for a missing measurement is to predict it using an artificial neural network. In this paper, an artificial neural network is used to predict porosity log data from other log data. Three wells from the ‘yy’ field are used in the prediction experiment. The log data are sonic, gamma ray, and porosity logs. One of the three wells is used as training data for the artificial neural network, which employs the Levenberg-Marquardt backpropagation algorithm. Through several trials, we find that the optimal training inputs are sonic log data and gamma ray log data with 10 hidden neurons. The prediction in well 1 has a correlation of 0.92 and a mean squared error of 5.67 × 10^-4. The trained network was then applied to the other wells. The results show that the correlations in well 2 and well 3 are 0.872 and 0.9077, respectively, with mean squared errors of 11 × 10^-4 and 9.539 × 10^-4. From these results we conclude that sonic and gamma ray logs can be a good combination for predicting porosity with a neural network.
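
    A hedged Python sketch of porosity prediction from sonic and gamma ray logs with a small neural network. scikit-learn's MLPRegressor is used here for brevity and does not implement the Levenberg-Marquardt training named in the abstract; all log values are synthetic stand-ins for real well data.

        # Sketch: predict a porosity log from sonic and gamma-ray logs with a small
        # neural network. All log data below are synthetic.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(3)
        n = 500
        sonic = rng.uniform(60, 140, n)                  # us/ft, synthetic
        gamma = rng.uniform(20, 150, n)                  # API units, synthetic
        porosity = 0.002 * sonic - 0.0005 * gamma + rng.normal(scale=0.01, size=n)

        X = np.column_stack([sonic, gamma])
        model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
        model.fit(X[:400], porosity[:400])               # "training well"

        pred = model.predict(X[400:])                    # "other well"
        corr = np.corrcoef(pred, porosity[400:])[0, 1]
        print(corr)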

  8. Multiple log potash assay

    NASA Astrophysics Data System (ADS)

    Hill, D. G.

    1993-10-01

    A five-mineral multiple-log potash assay technique has been successfully applied to evaluate potash-rich intervals in evaporite sequences. The technique is able to distinguish economic potash minerals from non-economic potash minerals and from other non-potash radioactive minerals. It can be applied on location, using a programmable calculator or microcomputer, providing near real-time logs of potash mineral concentrations. Log assay values show good agreement with core wet chemistry analyses.

  9. Selective Logging, Fire, and Biomass in Amazonia

    NASA Technical Reports Server (NTRS)

    Houghton, R. A.

    1999-01-01

    Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the carbon emissions from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.

  10. Bayesian computation via empirical likelihood

    PubMed Central

    Mengersen, Kerrie L.; Pudlo, Pierre; Robert, Christian P.

    2013-01-01

    Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models. PMID:23297233

  11. On the likelihood of forests

    NASA Astrophysics Data System (ADS)

    Shang, Yilun

    2016-08-01

    How complex a network is crucially impacts its function and performance. In many modern applications, the networks involved have a growth property and sparse structures, which pose challenges to physicists and applied mathematicians. In this paper, we introduce the forest likelihood as a plausible measure to gauge how difficult it is to construct a forest in a non-preferential attachment way. Based on the notions of admittable labeling and path construction, we propose algorithms for computing the forest likelihood of a given forest. Concrete examples as well as the distributions of forest likelihoods for all forests with some fixed numbers of nodes are presented. Moreover, we illustrate the ideas on real-life networks, including a benzenoid tree, a mathematical family tree, and a peer-to-peer network.

  12. First native silica findings in bismuth from garnet skarns of Ribny Log - 2 gold ore target in Topolninsk ore deposit (Gorny Altai)

    NASA Astrophysics Data System (ADS)

    Cherkasova, T.; Timkin, T.; Savinova, O.

    2015-02-01

    Nanomineralogical investigation of ore minerals in metasomatites (garnet skarns) of the Ribny Log-2 gold ore target in the Topolninsk ore deposit (Gorny Altai) revealed native silica impurities (Si) of 1-5 nm within grains of native bismuth (Bi). Polished sections were examined using a Tescan Vega 3 scanning electron microscope (SEM) with an Oxford energy-dispersive spectrometer at the Department of Geology and Mineral Exploration, Institute of Natural Resources, Tomsk Polytechnic University.

  13. Log Truck-Weighing System

    NASA Technical Reports Server (NTRS)

    1977-01-01

    ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner. ELDEC then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000 but it pays for itself in a single year. It allows operators to use a truck's hauling capacity more efficiently since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.

  14. Well log and 2D seismic data character of the Wilcox Group in south-central Louisiana

    USGS Publications Warehouse

    Enomoto, Catherine B.

    2014-01-01

    The Wilcox Group is productive in updip areas of Texas and Louisiana from fluvial, deltaic, and near-shore marine shelf sandstones. The reported presence of porous sandstones at 29,000 feet within the Wilcox Group containing about 200 feet of gas in the Davy Jones 1 discovery well in the offshore Louisiana South Marsh Island area illustrates a sand-rich system developed during the Paleocene and early Eocene. This study describes some of the well log and reflection seismic data characteristics of the slope and basin-floor reservoirs with gas-discovery potential that may be in the area between the producing trend onshore Louisiana and the offshore discovery.

  15. Measurement of the top quark mass with the dynamical likelihood method using lepton plus jets events with b-tags in p anti-p collisions at s**(1/2) = 1.96-TeV

    SciTech Connect

    Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, Anthony A.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; et al.

    2005-12-01

    This report describes a measurement of the top quark mass, M_top, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top (ttbar) pairs in p-pbar collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb^-1. They use the ttbar candidates in the "lepton+jets" decay channel, requiring at least one jet identified as coming from a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M_top per unit phase space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the likelihood is multiplied event by event to derive the top quark mass by the maximum likelihood method. Using 63 ttbar candidates observed in the data, with 9.2 events expected from background, they measure the top quark mass to be 173.2 +2.6/-2.4 (stat.) ± 3.2 (syst.) GeV/c^2, or 173.2 +4.1/-4.0 GeV/c^2.

  16. The Dud-Alternative Effect in Likelihood Judgment

    ERIC Educational Resources Information Center

    Windschitl, Paul D.; Chambers, John R.

    2004-01-01

    The judged likelihood of a focal outcome should generally decrease as the list of alternative possibilities increases. For example, the likelihood that a runner will win a race goes down when 2 new entries are added to the field. However, 6 experiments demonstrate that the presence of implausible alternatives (duds) often increases the judged…

  17. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions for the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding special records existing in the data base; and (4) ARCHIVE - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering orders.

  18. Assessing the quality of proton PBS treatment delivery using machine log files: comprehensive analysis of clinical treatments delivered at PSI Gantry 2

    NASA Astrophysics Data System (ADS)

    Scandurra, D.; Albertini, F.; van der Meer, R.; Meier, G.; Weber, D. C.; Bolsi, A.; Lomax, A.

    2016-02-01

    Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam and at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within ±1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field.
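
    A minimal sketch of the pass-rate metric described above, assuming the ±1% tolerance is taken relative to the local nominal dose and that very low-dose voxels are excluded (the abstract does not specify the evaluation volume); the dose arrays, threshold and all numbers are illustrative.

```python
import numpy as np

def pass_rate(log_dose, nominal_dose, tolerance=0.01, threshold_frac=0.1):
    """Percentage of voxels whose log-file-reconstructed dose lies within
    +/- `tolerance` (relative) of the nominal planned dose.

    `threshold_frac` excludes very low-dose voxels (here, below 10% of the
    maximum nominal dose) -- an assumption, since the abstract does not spell
    out its evaluation volume.
    """
    log_dose = np.asarray(log_dose, dtype=float)
    nominal_dose = np.asarray(nominal_dose, dtype=float)
    mask = nominal_dose > threshold_frac * nominal_dose.max()
    rel_diff = np.abs(log_dose[mask] - nominal_dose[mask]) / nominal_dose[mask]
    return 100.0 * np.mean(rel_diff <= tolerance)

# Example with synthetic dose grids (Gy), just to show the call signature.
rng = np.random.default_rng(1)
nominal = rng.uniform(0.0, 2.0, size=(50, 50, 50))
delivered = nominal * (1.0 + rng.normal(0.0, 0.005, size=nominal.shape))
print(f"pass rate: {pass_rate(delivered, nominal):.1f}%")
```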

  19. The Alpine Fault Hanging Wall Viewed from Within: Structural and Lithological Analysis of Acoustic Televiewer Logs in the DFDP-2B Borehole, New Zealand

    NASA Astrophysics Data System (ADS)

    Massiot, C.; Doan, M. L.; Celerier, B. P.; Mcnamara, D. D.; Little, T. A.; Schmitt, D. R.; Pezard, P. A.; Remaud, L.; Townend, J.; Sutherland, R.; Toy, V.

    2015-12-01

    The Alpine Fault is the major dextral transpressive fault forming the Australia-Pacific plate boundary in the South Island of New Zealand. The DFDP-2B borehole, drilled in late 2014, reached a depth of 893 m (measured depth) and provides an opportunity to describe the protolith and structure of ductilely deformed schists in the hangingwall of this active fault. A total of 19 km of wireline logging data were acquired in the borehole, including 4.8 km of borehole televiewer (BHTV) acoustic image logs. The high resolution of the BHTV logs and the borehole's inclination, which reached a maximum of 46° from vertical near the base of the hole, provide good conditions in which to intersect the predominant foliation, which dips SE at a high angle to the hole as well as other feature sets. Here, we present a detailed structural analysis and lithological interpretation of the BHTV logs from the 264-886 m interval. The borehole's inclination and the attenuation caused by the drilling mud reduces the image quality in places. However, some zones of poor image quality also concur with indications of permeable fractures and are related to fractured zones with localized borehole wall damage. Foliation can be regularly identified to a depth of 808 m (n=406). The mean orientation of foliation planes (dip of 57° towards 145) remains quite consistent throughout the drilled interval and is similar to that inferred for the structurally underlying Alpine Fault. Larger-scale changes in image amplitude suggest lithological or petrophysical variations not yet identified on cuttings. The majority of the 1650 planar features identified as fractures are subparallel to the foliation and may represent deformed veins or other mineral segregations that contribute to the foliation's internal structure. Other planar features have dominantly NW-SE strikes, approximately orthogonal to the foliation and are inferred to be fractures. Truncations or offsets are observed on ~20 features suggesting

  20. Assessing the quality of proton PBS treatment delivery using machine log files: comprehensive analysis of clinical treatments delivered at PSI Gantry 2.

    PubMed

    Scandurra, D; Albertini, F; van der Meer, R; Meier, G; Weber, D C; Bolsi, A; Lomax, A

    2016-02-07

    Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam and at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within ±1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field.

  1. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  2. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  3. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  4. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  5. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Retention of logs. 73.1840 Section 73.1840... Rules Applicable to All Broadcast Stations § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  6. CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS

    EPA Science Inventory

    Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...

  7. Digital Photography as an Educational Food Logging Tool in Obese Patients with Type 2 Diabetes: Lessons Learned from A Randomized, Crossover Pilot Trial

    PubMed Central

    Ehrmann, Brett J.; Anderson, Robert M.; Piatt, Gretchen A.; Funnell, Martha M.; Rashid, Hira; Shedden, Kerby; Douyon, Liselle

    2014-01-01

    Purpose The purpose of this pilot study is to investigate the utility of, and areas of refinement for, digital photography as an educational tool for food logging in obese patients with type 2 diabetes (T2DM). Methods Thirty-three patients aged 18-70 with T2DM, BMI at least 30 kg/m2, and A1C 7.5-9% were recruited from an endocrinology clinic and randomized to a week of food logging using a digital camera (DC) or paper diary (PD), crossing over for week two. Patients then viewed a presentation about dietary effects on blood glucose, using patient DC and blood glucose entries. Outcomes of adherence (based on number of weekly entries), changes in mean blood glucose and frequency of blood glucose checks, and patient satisfaction were compared between methods. Patient feedback on the DC intervention and presentation was also analyzed. Results Thirty patients completed the study. Adherence was identical across methods. The mean difference in number of entries was not significant between methods. This difference increased and neared statistical significance (favoring DC) among patients who were adherent for at least one week (21 entries, with 2 entries per day for 5 of 7 days, n=25). Mean blood glucose did not significantly decrease in either method. Patient satisfaction was similar between interventions. Feedback indicated concerns over photograph accuracy, forgetting to use the cameras, and embarrassment using them in public. Conclusion Though comparable to PD in adherence, blood glucose changes, and patient satisfaction in this pilot trial, patient feedback suggested specific areas of refinement to maximize utility of DC-based food logging as an educational tool in T2DM. PMID:24168836

  8. Acoustic borehole logging

    SciTech Connect

    Medlin, W.L.; Manzi, S.J.

    1990-10-09

    This patent describes an acoustic borehole logging method. It comprises traversing a borehole with a borehole logging tool containing a transmitter of acoustic energy having a free-field frequency spectrum with at least one characteristic resonant frequency of vibration and a spaced-apart receiver, repeatedly exciting the transmitter with a swept frequency tone burst of a duration sufficiently greater than the travel time of acoustic energy between the transmitter and the receiver to allow borehole cavity resonances to be established within the borehole cavity formed between the borehole logging tool and the borehole wall, detecting acoustic energy amplitude modulated by the borehole cavity resonances with the spaced-apart receiver, and recording an amplitude versus frequency output of the receiver in correlation with depth as a log of the borehole frequency spectrum representative of the subsurface formation comprising the borehole wall.

  9. EE-3A Logging Report

    SciTech Connect

    Anderson, David W.

    1993-12-15

    Two logs of EE-3A were performed during the last couple of weeks. The first was a Temperature/Casing-Collar Locator (CCL) log, which took place on Friday, December 10th, 1993. The second was a Caliper log, done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13th, 1993.

  10. 6. Log calving barn. Interior view showing log post-and-beam support ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

  11. Oracle Log Buffer Queueing

    SciTech Connect

    Rivenes, A S

    2004-12-08

    The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked: can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document will detail the maximum observed throughputs using the same test suite that was used for the Sun tests. A detailed description of the testing performed along with an analysis of bottlenecks encountered will be made. Issues related to Oracle and Linux will also be detailed and some recommendations made based on the findings.

  12. Factors Associated with Young Adults’ Pregnancy Likelihood

    PubMed Central

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  13. Experimental study of main rotor tip geometry and tail rotor interactions in hover. Volume 2: Run log and tabulated data

    NASA Technical Reports Server (NTRS)

    Balch, D. T.; Lombardi, J.

    1985-01-01

    A model scale hover test was conducted in the Sikorsky Aircraft Model Rotor hover Facility to identify and quantify the impact of the tail rotor on the demonstrated advantages of advanced geometry tip configurations. The existence of mutual interference between a hovering main rotor and a tail rotor was acknowledged in the test. The test was conducted using the Basic Model Test Rig and two scaled main rotor systems, one representing a 1/5.727 scale UH-60A BLACK HAWK and the other a 1/4.71 scale S-76. Eight alternate rotor tip configurations were tested, 3 on the BLACK HAWK rotor and 6 on the S-76 rotor. Four of these tips were then selected for testing in close proximity to an operating tail rotor (operating in both tractor and pusher modes) to determine if the performance advantages that could be obtained from the use of advanced geometry tips in a main rotor only environment would still exist in the more complex flow field involving a tail rotor. This volume contains the test run log and tabulated data.

  14. Acoustic paramagnetic logging tool

    DOEpatents

    Vail, III, William B.

    1988-01-01

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.

  15. The Sherpa Maximum Likelihood Estimator

    NASA Astrophysics Data System (ADS)

    Nguyen, D.; Doe, S.; Evans, I.; Hain, R.; Primini, F.

    2011-07-01

    A primary goal for the second release of the Chandra Source Catalog (CSC) is to include X-ray sources with as few as 5 photon counts detected in stacked observations of the same field, while maintaining acceptable detection efficiency and false source rates. Aggressive source detection methods will result in detection of many false positive source candidates. Candidate detections will then be sent to a new tool, the Maximum Likelihood Estimator (MLE), to evaluate the likelihood that a detection is a real source. MLE uses the Sherpa modeling and fitting engine to fit a model of a background and source to multiple overlapping candidate source regions. A background model is calculated by simultaneously fitting the observed photon flux in multiple background regions. This model is used to determine the quality of the fit statistic for a background-only hypothesis in the potential source region. The statistic for a background-plus-source hypothesis is calculated by adding a Gaussian source model convolved with the appropriate Chandra point spread function (PSF) and simultaneously fitting the observed photon flux in each observation in the stack. Since a candidate source may be located anywhere in the field of view of each stacked observation, a different PSF must be used for each observation because of the strong spatial dependence of the Chandra PSF. The likelihood of a valid source being detected is a function of the two statistics (for background alone, and for background-plus-source). The MLE tool is an extensible Python module with potential for use by the general Chandra user.
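
    The following toy sketch illustrates the background-only versus background-plus-source comparison in a Poisson counting setting; it fits a single flat source amplitude rather than a PSF-convolved Gaussian source model, so it is a simplified stand-in for the MLE tool, and the counts and background level are invented.

```python
import numpy as np

def poisson_loglike(counts, expected):
    # log of the Poisson pmf up to the constant -log(counts!) which cancels in ratios
    return np.sum(counts * np.log(expected) - expected)

def source_significance(counts, background):
    """Toy version of the background-only vs background-plus-source comparison:
    fit a single source amplitude by maximum likelihood on top of a fixed
    background model and return the log-likelihood ratio statistic."""
    counts = np.asarray(counts, dtype=float)
    background = np.asarray(background, dtype=float)
    # ML estimate of a flat source amplitude spread over every pixel of the region
    amplitudes = np.linspace(0.0, counts.sum(), 200)
    ll = [poisson_loglike(counts, background + a / counts.size) for a in amplitudes]
    ll_source = max(ll)
    ll_bkg = poisson_loglike(counts, background)
    return 2.0 * (ll_source - ll_bkg)   # asymptotically ~ chi^2 with 1 dof

# Hypothetical 5x5-pixel candidate-source region with ~0.2 counts/pixel background.
counts = np.array([[0, 0, 1, 0, 0],
                   [0, 1, 2, 1, 0],
                   [1, 2, 3, 2, 0],
                   [0, 1, 1, 0, 0],
                   [0, 0, 0, 0, 1]])
background = np.full(counts.shape, 0.2)
print(f"2*Delta(logL) = {source_significance(counts, background):.1f}")
```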

  16. NMR logging apparatus

    DOEpatents

    Walsh, David O; Turner, Peter

    2014-05-27

    Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.

  17. 4. Log chicken house (far left foreground), log bunkhouse (far ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Log chicken house (far left foreground), log bunkhouse (far left background), one-room log cabin (left of center background), log root cellar (center), post-and-beam center in foreground, and blacksmith shop (far right foreground). View to southeast. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  18. Section 9: Ground Water - Likelihood of Release

    EPA Pesticide Factsheets

    HRS training. The ground water pathway likelihood of release factor category reflects the likelihood that there has been, or will be, a release of hazardous substances in any of the aquifers underlying the site.

  19. Structural basis for cytokinin production by LOG from Corynebacterium glutamicum

    PubMed Central

    Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin

    2016-01-01

    “Lonely guy” (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a “PGGXGTXXE” motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms. PMID:27507425

  20. Structural basis for cytokinin production by LOG from Corynebacterium glutamicum.

    PubMed

    Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin

    2016-08-10

    "Lonely guy" (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a "PGGXGTXXE" motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms.

  1. Convolutional codes. II - Maximum-likelihood decoding. III - Sequential decoding

    NASA Technical Reports Server (NTRS)

    Forney, G. D., Jr.

    1974-01-01

    Maximum-likelihood decoding is characterized as the determination of the shortest path through a topological structure called a trellis. Aspects of code structure are discussed along with questions regarding maximum-likelihood decoding on memoryless channels. A general bounding technique is introduced. The technique is used to obtain asymptotic bounds on the probability of error for maximum-likelihood decoding and list-of-2 decoding. The basic features of sequential algorithms are discussed along with a stack algorithm, questions of computational distribution, and the martingale approach to computational bounds.
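
    A minimal hard-decision Viterbi decoder makes the shortest-path-through-a-trellis picture concrete; the rate-1/2, constraint-length-3 code with generators (7, 5) used here is a textbook example rather than any code analyzed in the paper, and the Hamming branch metric assumes a binary symmetric channel.

```python
import itertools

# Minimal hard-decision Viterbi decoder for the rate-1/2, constraint-length-3
# convolutional code with octal generators (7, 5) -- a textbook example.
G = [0b111, 0b101]

def encode(bits, state=0):
    out = []
    for b in bits:
        reg = (b << 2) | state                                   # [current, prev1, prev2]
        out.extend([bin(reg & g).count("1") & 1 for g in G])     # parity of tapped bits
        state = reg >> 1
    return out

def viterbi(received):
    n_states = 4
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)      # start in the all-zero state
    paths = [[] for _ in range(n_states)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for state, bit in itertools.product(range(n_states), (0, 1)):
            if metric[state] == INF:
                continue
            reg = (bit << 2) | state
            expected = [bin(reg & g).count("1") & 1 for g in G]
            branch = sum(a != b for a, b in zip(r, expected))    # Hamming distance
            nxt = reg >> 1
            if metric[state] + branch < new_metric[nxt]:         # keep the shortest path
                new_metric[nxt] = metric[state] + branch
                new_paths[nxt] = paths[state] + [bit]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best]

message = [1, 0, 1, 1, 0, 0]            # includes two tail bits to flush the encoder
received = encode(message)
received[3] ^= 1                        # inject a single channel error
print("decoded:", viterbi(received))    # recovers the original message
```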

  2. CosmoSlik: Cosmology sampler of likelihoods

    NASA Astrophysics Data System (ADS)

    Millea, Marius

    2017-01-01

    CosmoSlik quickly puts together, runs, and analyzes an MCMC chain for analysis of cosmological data. It is highly modular and comes with plugins for CAMB (ascl:1102.026), CLASS (ascl:1106.020), the Planck likelihood, the South Pole Telescope likelihood, other cosmological likelihoods, emcee (ascl:1303.002), and more. It offers ease-of-use, flexibility, and modularity.

  3. Logs Perl Module

    SciTech Connect

    Owen, R. K.

    2007-04-04

    A perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.

  4. Log of Apollo 11.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

  5. Alaska's Logging Camp School.

    ERIC Educational Resources Information Center

    Millward, Robert E.

    1999-01-01

    A visit to Ketchikan, Alaska, reveals a floating, one-teacher logging-camp school that uses multiage grouping and interdisciplinary teaching. There are 10 students. The school gym and playground, bunkhouse, fuel tanks, mess hall, and students' homes bob up and down and are often moved to other sites. (MLH)

  6. Logging on to Learn

    ERIC Educational Resources Information Center

    Butler, Kevin

    2010-01-01

    A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…

  7. Interactive Reflective Logs

    ERIC Educational Resources Information Center

    Deaton, Cynthia Minchew; Deaton, Benjamin E.; Leland, Katina

    2010-01-01

    The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…

  8. Petrographic image logging system

    SciTech Connect

    Payne, C.J.; Ulrich, M.R.; Maxwell, G.B. ); Adams, J.P. )

    1991-03-01

    The Petrographic Image Logging System (PILS) is a logging system data base for Macintosh computers that allows the merging of traditional wire-line, core, and mud log data with petrographic images. The system is flexible; it allows the user to record, manipulate, and display almost any type of character, graphic, and image information. Character and graphic data are linked and entry in either mode automatically generates the alternate mode. Character/graphic data may include such items as ROP, wire-line log data, interpreted lithologies, ditch cutting lith-percentages, porosity grade and type, grain size, core/DST information, and sample descriptions. Image data may include petrographic and SEM images of cuttings, core, and thin sections. All data are tied to depth. Data are entered quickly and easily in an interactive manner with a mouse, keyboard, and digitizing tablet or may be imported and immediately autoplotted from a variety of environments via modem, network, or removable disk. Color log displays, including petrographic images, are easily available on CRT or as hardcopy. The system consists of a petrographic microscope, video camera, Macintosh computer, video framegrabber and digitizing tablet. Hardcopy is scaleable and can be generated by a variety of color printing devices. The software is written in Supertalk, a color superset of the standard Apple Hypercard programming language, hypertalk. This system is being tested by Mobil in the lab and at the well site. Implementation has provided near 'real-time' core and cuttings images from drilling wells to the geologist back at the office.

  9. The fluid-compensated cement bond log

    SciTech Connect

    Nayfeh, T.H.; Leslie, H.D.; Wheelis, W.B.

    1984-09-01

    An experimental and numerical wave mechanics study of cement bond logs demonstrated that wellsite computer processing can now segregate wellbore fluid effects from the sonic signal response to changing cement strength. Traditionally, cement logs have been interpreted as if water were in the wellbore, without consideration of wellbore fluid effects. These effects were assumed to be negligible. However, with the increasing number of logs being run in completion fluids such as CaCl2, ZnBr2, and CaBr2, large variations in cement bond logs became apparent. A Schlumberger internal paper showing that bond log amplitude is related to the acoustic impedance of the fluid in which the tool is run led to a comprehensive study of wellbore fluid effects. Numerical and experimental models were developed simulating wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings by varying the wellbore fluid densities, viscosities, and fluid types (acoustic impedance). Parallel numerical modeling was undertaken using similar parameters. The results showed that the bond log amplitude varied dramatically with the wellbore fluid's acoustic impedance; for example, there was a 70 percent increase in the signal amplitude for 11.5-lb/gal CaCl2 over the signal amplitude in water. This led to the development of a Fluid-Compensated Bond log that corrects the amplitude for acoustic impedance of varying wellbore fluids, thereby making the measurements more directly related to the cement quality.

  10. Comparison between Pre-log and Post-log Statistical Models in Ultra-Low-Dose CT Reconstruction.

    PubMed

    Fu, Lin; Lee, Tzu-Cheng; Kim, Soo Mee; Alessio, Adam; Kinahan, Paul; Chang, Zhiqian; Sauer, Ken; Kalra, Mannudeep; De Man, Bruno

    2016-11-09

    X-ray detectors in clinical computed tomography (CT) usually operate in current-integrating mode. Their complicated signal statistics often lead to intractable likelihood functions for practical use in model-based image reconstruction (MBIR). It is therefore desirable to design simplified statistical models without losing the essential factors. Depending on whether the CT transmission data are logarithmically transformed, pre-log and post-log models are two major categories of choices in CT MBIR. Both being approximations, it remains an open question whether one model can notably improve image quality over the other on real scanners. In this study, we develop and compare several pre-log and post-log MBIR algorithms under a unified framework. Their reconstruction accuracy based on simulation and clinical datasets are evaluated. The results show that pre-log MBIR can achieve notably better quantitative accuracy than post-log MBIR in ultra-low-dose CT, although in less extreme cases, post-log MBIR with handcrafted pre-processing remains a competitive alternative. Pre-log MBIR could play a growing role in emerging ultra-low-dose CT applications.
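
    A schematic of the two model families, under simplifying assumptions (monoenergetic beam, ideal Poisson counting statistics, no electronic noise, no regularization); the system matrix, counts and weights below are illustrative, not the models implemented in the paper.

```python
import numpy as np

# Simplified CT MBIR objective functions: `A` is the system matrix, `x` the
# attenuation image, `b` the blank-scan counts and `y` the measured counts.

def prelog_neg_loglike(x, A, y, b):
    """Pre-log model: y_i ~ Poisson(b_i * exp(-[Ax]_i)), fit to the raw counts."""
    ybar = b * np.exp(-A @ x)
    return np.sum(ybar - y * np.log(ybar))           # constants dropped

def postlog_neg_loglike(x, A, y, b):
    """Post-log model: weighted least squares on the log-transformed data,
    with the usual weight w_i ~ y_i (approximate inverse variance)."""
    line_integrals = np.log(b / np.maximum(y, 1.0))   # log transform needs y > 0
    w = np.maximum(y, 1.0)
    return 0.5 * np.sum(w * (A @ x - line_integrals) ** 2)

# Tiny example: 3-pixel image, 4 rays, very low counts where the models differ most.
rng = np.random.default_rng(2)
A = rng.uniform(0.0, 1.0, size=(4, 3))
x_true = np.array([0.2, 0.5, 0.3])
b = np.full(4, 20.0)                                  # ultra-low-dose blank scan
y = rng.poisson(b * np.exp(-A @ x_true)).astype(float)
print(prelog_neg_loglike(x_true, A, y, b), postlog_neg_loglike(x_true, A, y, b))
```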

  11. Log-Concavity and Strong Log-Concavity: a review

    PubMed Central

    Saumard, Adrien; Wellner, Jon A.

    2016-01-01

    We review and formulate results concerning log-concavity and strong-log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
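
    For orientation, the definitions as they are commonly stated in the smooth one-dimensional case (the review's precise formulations may differ in detail):

```latex
% Common smooth-case statements; details of the review's definitions may differ.
\begin{align*}
&\text{$f$ is \emph{log-concave} on } \mathbb{R}
  &&\iff\ \log f \text{ is concave, i.e. } (\log f)''(x) \le 0;\\
&\text{$f$ is \emph{strongly log-concave}}
  &&\iff\ (\log f)''(x) \le -c \ \text{ for some constant } c>0;\\
&\text{preservation under convolution:}
  &&f,\ g \ \text{log-concave} \ \Longrightarrow\
    (f\ast g)(x)=\int_{\mathbb{R}} f(x-y)\,g(y)\,dy \ \text{is log-concave.}
\end{align*}
```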

  12. Log-Concavity and Strong Log-Concavity: a review.

    PubMed

    Saumard, Adrien; Wellner, Jon A

    We review and formulate results concerning log-concavity and strong-log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning.

  13. Maximum likelihood density modification by pattern recognition of structural motifs

    DOEpatents

    Terwilliger, Thomas C.

    2004-04-13

    An electron density for a crystallographic structure having protein regions and solvent regions is improved by maximizing the log likelihood of a set of structure factors {F_h} using a local log-likelihood function ln[ p(ρ(x)|PROT)·p_PROT(x) + p(ρ(x)|SOLV)·p_SOLV(x) + p(ρ(x)|H)·p_H(x) ], where p_PROT(x) is the probability that x is in the protein region, p(ρ(x)|PROT) is the conditional probability for ρ(x) given that x is in the protein region, and p_SOLV(x) and p(ρ(x)|SOLV) are the corresponding quantities for the solvent region; p_H(x) refers to the probability that there is a structural motif at a known location, with a known orientation, in the vicinity of the point x, and p(ρ(x)|H) is the probability distribution for the electron density at this point given that the structural motif actually is present. One appropriate structural motif is a helical structure within the crystallographic structure.
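
    A small sketch of evaluating such a local log-likelihood as the log of a probability-weighted mixture; the Gaussian conditional densities and region probabilities below are invented for illustration and are not the distributions used in the patent.

```python
import numpy as np

def local_log_likelihood(rho, p_prot, p_solv, p_h,
                         prot_pdf, solv_pdf, h_pdf):
    """Log of the probability-weighted mixture at each grid point x:
    ln[ p(rho|PROT)p_PROT + p(rho|SOLV)p_SOLV + p(rho|H)p_H ].
    The three pdf arguments are callables giving p(rho(x) | region)."""
    mix = p_prot * prot_pdf(rho) + p_solv * solv_pdf(rho) + p_h * h_pdf(rho)
    return np.log(np.maximum(mix, 1e-300))

# Illustrative (not the patent's) conditional densities: Gaussians with
# different means/widths for protein, solvent, and a known helical motif.
gauss = lambda mu, sig: (lambda r: np.exp(-0.5 * ((r - mu) / sig) ** 2)
                         / (sig * np.sqrt(2 * np.pi)))

rho = np.linspace(-1.0, 3.0, 5)          # electron density at a few grid points
p_prot, p_solv, p_h = 0.5, 0.4, 0.1      # region probabilities at those points
print(local_log_likelihood(rho, p_prot, p_solv, p_h,
                           gauss(1.0, 0.6), gauss(0.0, 0.3), gauss(1.5, 0.4)))
```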

  14. Feasibility studies of a carbon/oxygen logging tool for evaluating the CO2 content of the medium in nuclear device containment

    SciTech Connect

    West, H.I. Jr.; Glasgow, J.E.

    1983-12-01

    The feasibility of using oil-well bore-hole logging techniques for assaying the carbonate content of a potential shot site has been investigated. The procedure makes use of the detection of the 4439-keV gamma ray of carbon and the 6130-keV gamma ray of oxygen produced by the inelastic scattering of 14-MeV neutrons in the bore-hole medium. For the needs of the containment program, a sensitivity of detection of CO2 to less than or equal to 0.02 weight fraction must be achieved. Laboratory measurements indicate that only with considerable development effort could a tool be made that would come close to achieving this goal.

  15. Maximum Likelihood Analysis in the PEN Experiment

    NASA Astrophysics Data System (ADS)

    Lehman, Martin

    2013-10-01

    The experimental determination of the π+ → e+ ν (γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world average experimental precision of 3.3 × 10^-3 to 5 × 10^-4 using a stopped beam approach. During runs in 2008-10, PEN has acquired over 2 × 10^7 π_e2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π+ → e+ ν, π+ → μ+ ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
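
    As a toy stand-in for this kind of event-by-event analysis, the sketch below fits a single signal fraction by unbinned maximum likelihood from a two-component mixture; the observable, PDFs, event counts and fractions are all hypothetical and do not represent PEN's five-process model.

```python
import numpy as np

# Toy unbinned maximum-likelihood fit of a signal fraction; the real PEN fit
# uses Monte-Carlo-derived PDFs for five processes in several observables,
# not these hypothetical one-dimensional energy distributions.
rng = np.random.default_rng(3)

def pdf_signal(e):      # hypothetical positron-energy PDF for pi -> e nu
    return np.exp(-0.5 * ((e - 70.0) / 3.0) ** 2) / (3.0 * np.sqrt(2 * np.pi))

def pdf_background(e):  # hypothetical PDF for pi -> mu nu followed by mu -> e nu nu
    return np.where((e > 0) & (e < 53.0), 1.0 / 53.0, 0.0)

true_frac = 4e-4
n_events = 50_000
is_sig = rng.random(n_events) < true_frac
energy = np.where(is_sig, rng.normal(70.0, 3.0, n_events), rng.uniform(0.0, 53.0, n_events))

# Scan the mixture fraction and keep the value that maximizes the total log-likelihood.
fracs = np.linspace(0.0, 2e-3, 401)
loglike = [np.sum(np.log(f * pdf_signal(energy) + (1 - f) * pdf_background(energy) + 1e-300))
           for f in fracs]
print(f"fitted signal fraction: {fracs[int(np.argmax(loglike))]:.2e} (true {true_frac:.1e})")
```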

  16. The Arabidopsis RING E3 Ubiquitin Ligase AtAIRP3/LOG2 Participates in Positive Regulation of High-Salt and Drought Stress Responses

    PubMed Central

    Kim, Jong Hum; Kim, Woo Taek

    2013-01-01

    Really Interesting New Gene (RING) E3 ubiquitin ligases have been implicated in cellular responses to the stress hormone abscisic acid (ABA) as well as to environmental stresses in higher plants. Here, an ABA-insensitive RING protein3 (atairp3) loss-of-function mutant line in Arabidopsis (Arabidopsis thaliana) was isolated due to its hyposensitivity to ABA during its germination stage as compared with wild-type plants. AtAIRP3 contains a single C3HC4-type RING motif, a putative myristoylation site, and a domain associated with RING2 (DAR2) domain. Unexpectedly, AtAIRP3 was identified as LOSS OF GDU2 (LOG2), which was recently shown to participate in an amino acid export system via interaction with GLUTAMINE DUMPER1. Thus, AtAIRP3 was renamed as AtAIRP3/LOG2. Transcript levels of AtAIRP3/LOG2 were up-regulated by drought, high salinity, and ABA, suggesting a role for this factor in abiotic stress responses. The atairp3/log2-2 knockout mutant and 35S:AtAIRP3-RNAi knockdown transgenic plants displayed impaired ABA-mediated seed germination and stomata closure. Cosuppression and complementation studies further supported a positive role for AtAIRP3/LOG2 in ABA responses. Suppression of AtAIRP3/LOG2 resulted in marked hypersensitive phenotypes toward high salinity and water deficit relative to wild-type plants. These results suggest that Arabidopsis RING E3 AtAIRP3/LOG2 is a positive regulator of the ABA-mediated drought and salt stress tolerance mechanism. Using yeast (Saccharomyces cerevisiae) two-hybrid, in vitro, and in vivo immunoprecipitation, cell-free protein degradation, and in vitro ubiquitination assays, RESPONSIVE TO DEHYDRATION21 was identified as a substrate protein of AtAIRP3/LOG2. Collectively, our data suggest that AtAIRP3/LOG2 plays dual functions in ABA-mediated drought stress responses and in an amino acid export pathway in Arabidopsis. PMID:23696092

  17. 3D Reservoir Modeling of Semutang Gas Field: A lonely Gas field in Chittagong-Tripura Fold Belt, with Integrated Well Log, 2D Seismic Reflectivity and Attributes.

    NASA Astrophysics Data System (ADS)

    Salehin, Z.; Woobaidullah, A. S. M.; Snigdha, S. S.

    2015-12-01

    The Bengal Basin, with its prolific gas-rich province, provides needed energy to Bangladesh. The present energy situation demands more hydrocarbon exploration. Only 'Semutang' has been discovered in the high-amplitude structures, while the rest lie in the gentle to moderate structures of the western part of the Chittagong-Tripura Fold Belt. The field has some major thrust faults, however, which have strongly breached the reservoir zone. The major objectives of this research are to interpret the gas horizons and faults and then to perform velocity modeling, structural modeling, and property modeling to obtain reservoir properties. Proper identification of the faults and reservoir heterogeneities is therefore needed. 3D modeling is widely used to reveal the subsurface structure in faulted zones where planning and development drilling is a major challenge. Thirteen 2D seismic lines and six well logs have been used to identify six gas-bearing horizons and a network of faults and to map the structure at reservoir level. Variance attributes were used to identify faults. A velocity model is built for domain conversion. Synthetics were prepared from two wells where sonic and density logs are available. The well-to-seismic tie at the reservoir zone shows a good match with the Direct Hydrocarbon Indicator on the seismic section. Vsh, porosity, water saturation and permeability have been calculated, and various cross plots among porosity logs have been shown. Structural modeling is used to define zones and layering in accordance with the minimum sand thickness. The fault model shows the possible fault network, which is likely responsible for several dry wells. The facies model has been constrained with the Sequential Indicator Simulation method to show the facies distribution along the depth surfaces. Petrophysical models have been prepared with Sequential Gaussian Simulation to estimate petrophysical parameters away from the existing wells to other parts of the field and to observe heterogeneities in the reservoir. Average porosity maps for each gas zone were constructed. The outcomes of the research

  18. 78 FR 69541 - Labor Certification Process for Logging Employment and Non-H-2A Agricultural Employment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... Agricultural Employment of H-2A Aliens in the United States, 75 FR 6884 (Feb. 12, 2010). The effect of... rulemaking (NPRM). 74 FR 45906 (Sept. 4, 2009). After considering comments from the public on the...

  19. Field project to obtain pressure core, wireline log, and production test data for evaluation of CO2 flooding potential, Conoco MCA unit well No. 358, Maljamar Field, Lea County, New Mexico

    SciTech Connect

    Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.; Goodrich, J.H.; Kumar, R.M.

    1981-11-01

    This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO2 injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturation and computed log porosities agreed fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log versus core derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations and production data confirmed the validity of oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet, and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from the interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 and the lack of production from 3692 to 3718 feet indicated the zone to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flood.

  20. 12. Upstream view showing the lower log pond log chute in ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Upstream view showing the lower log pond log chute in the main channel of the Hudson River. The log chute in the dam can be seen in the background. Facing southwest. - Glens Falls Dam, 100' to 450' West of U.S. Route 9 Bridge Spanning Hudson River, Glens Falls, Warren County, NY

  1. On the Log-Normality of Historical Magnetic-Storm Intensity Statistics: Implications for Extreme-Event Probabilities

    NASA Astrophysics Data System (ADS)

    Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.

    2015-12-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a -Dst > 880 nT (greater than Carrington) but with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
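
    A compact sketch of the log-normal maximum-likelihood fit and the exceedance-rate extrapolation described above, using synthetic storm maxima in place of the 1957-2012 -Dst record; the event count, observation span and threshold are assumptions for illustration.

```python
import numpy as np
from math import erf, sqrt, log

# Synthetic stand-ins for storm-time -Dst maxima (nT) observed over `years` years.
rng = np.random.default_rng(4)
years = 56.0
storm_maxima = rng.lognormal(mean=np.log(150.0), sigma=0.55, size=180)

# For log-normal data the MLE of (mu, sigma) is the sample mean/std of log(x).
logx = np.log(storm_maxima)
mu_hat, sigma_hat = logx.mean(), logx.std(ddof=0)

def lognormal_sf(x, mu, sigma):
    """Survival function P(X > x) of a log-normal distribution."""
    z = (log(x) - mu) / (sigma * sqrt(2.0))
    return 0.5 * (1.0 - erf(z))

threshold = 850.0   # roughly the Carrington-event -Dst discussed in the abstract
rate_per_century = 100.0 * (len(storm_maxima) / years) * lognormal_sf(threshold, mu_hat, sigma_hat)
print(f"mu={mu_hat:.2f}, sigma={sigma_hat:.2f}, "
      f"expected storms > {threshold:.0f} nT per century: {rate_per_century:.2f}")
```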

  2. On the log-normality of historical magnetic-storm intensity statistics: implications for extreme-event probabilities

    USGS Publications Warehouse

    Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete

    2015-01-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst≥850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42,2.41] times per century; a 100-yr magnetic storm is identified as having a −Dst≥880 nT (greater than Carrington) but with a wide 95% confidence interval of [490,1187] nT.

  3. Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation

    NASA Astrophysics Data System (ADS)

    Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.

    2015-11-01

    We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
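
    The sketch below illustrates the signal-to-noise eigenvector compression idea on invented pixel-space covariances: the generalized eigenproblem S v = λ N v yields modes ranked by signal-to-noise, low-S/N modes are dropped, and the Gaussian likelihood is evaluated in the compressed basis. The matrices, threshold and sizes are placeholders, not the WMAP analysis.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical pixel-space signal (S) and noise (N) covariances.
rng = np.random.default_rng(5)
npix = 200
L = rng.normal(size=(npix, npix))
S = L @ L.T / npix                          # signal covariance (symmetric positive definite)
N = np.diag(rng.uniform(0.5, 1.5, npix))    # diagonal noise covariance

# Generalized eigenproblem S v = lambda N v: eigenvalues are signal-to-noise ratios.
eigval, eigvec = eigh(S, N)
keep = eigval > 0.1                         # compression: drop low-S/N modes (arbitrary threshold)
B = eigvec[:, keep]                         # compression matrix, npix x nmodes

data = rng.multivariate_normal(np.zeros(npix), S + N)

def gaussian_loglike(d, cov):
    sign, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet + len(d) * np.log(2 * np.pi))

full = gaussian_loglike(data, S + N)
compressed = gaussian_loglike(B.T @ data, B.T @ (S + N) @ B)
print(f"kept {keep.sum()} of {npix} modes; logL full={full:.1f}, compressed={compressed:.1f}")
```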

  4. Markov chain Monte Carlo without likelihoods.

    PubMed

    Marjoram, Paul; Molitor, John; Plagnol, Vincent; Tavare, Simon

    2003-12-23

    Many stochastic simulation approaches for generating observations from a posterior distribution depend on knowing a likelihood function. However, for many complex probability models, such likelihoods are either impossible or computationally prohibitive to obtain. Here we present a Markov chain Monte Carlo method for generating observations from a posterior distribution without the use of likelihoods. It can also be used in frequentist applications, in particular for maximum-likelihood estimation. The approach is illustrated by an example of ancestral inference in population genetics. A number of open problems are highlighted in the discussion.
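
    A minimal likelihood-free MCMC sketch in the spirit of the method: a proposal is accepted only if data simulated under it reproduce the observed summary statistic within a tolerance, so the likelihood itself is never computed. The model (a normal mean with known variance), prior, tolerance and tuning constants are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

observed = rng.normal(2.0, 1.0, size=100)      # pretend we only know these data
obs_summary = observed.mean()

def simulate(theta, n=100):
    return rng.normal(theta, 1.0, size=n)

def log_prior(theta):                          # N(0, 10^2) prior on the mean
    return -0.5 * (theta / 10.0) ** 2

theta, chain, eps = obs_summary, [], 0.1       # start at the observed summary for fast burn-in
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, 0.25)   # symmetric random-walk proposal
    sim_summary = simulate(proposal).mean()
    # Metropolis-Hastings ratio reduces to the prior ratio (symmetric proposal),
    # applied only when the simulated summary is close enough to the observed one.
    if abs(sim_summary - obs_summary) <= eps and \
            np.log(rng.random()) < log_prior(proposal) - log_prior(theta):
        theta = proposal
    chain.append(theta)

burned = np.array(chain[5000:])
print(f"ABC-MCMC posterior mean ~ {burned.mean():.2f} (true mean 2.0)")
```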

  5. Integrated well log and 2-D seismic data interpretation to image the subsurface stratigraphy and structure in north-eastern Bornu (Chad) basin

    NASA Astrophysics Data System (ADS)

    Isyaku, Aminu A.; Rust, Derek; Teeuw, Richard; Whitworth, Malcolm

    2016-09-01

    Structural and stratigraphic mapping within the Bornu Basin in north-east Nigeria has commonly been carried out using traditional field geological methods. However, such traditional approaches remain inadequate in this semi-arid region, which is characterised by topographically flat areas and a lack of continuous bedrock outcrops, these being mostly concealed beneath sand cover. Previous studies in the north-eastern part of the basin, carried out using ditch cuttings from a few wells and disconnected seismic data, were largely inadequate, and the resulting stratigraphic analyses were often generalised. This paper presents an integrated structural and stratigraphic study of the basin using combined subsurface geophysical datasets. The Combined Log Pattern (CLP) method is a well log analysis technique that utilises various well log data, including gamma ray, resistivity, bulk density and sonic logs, to identify lithology and stratigraphic boundaries of subsurface formations. This method is applied to constrain the subsurface stratigraphy of the north-eastern part of the Bornu Basin bordering Lake Chad. In addition to the qualitative combined well log analysis, the time-depth relationship of the sonic log and seismic data was quantitatively determined by tying a well to an intersecting seismic section to validate the stratigraphic facies horizons identified. Four well log facies and their environments of deposition were characterised from the combined well log analysis of the different log types. The Cretaceous basement structural features are found to have controlled the deposition of the overlying formations in the basin. Without intact core data, the shallower wells were found to have bottomed over subsurface horst features, while deeper wells penetrated into the basal facies contained mainly within the grabens. The main subsurface structural lineaments in the area include NW-SE, NE-SW and NNW-SSE trending faults, which mainly formed the horst and graben features. Some stratigraphic formations
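
    As an illustration of the quantitative time-depth step mentioned above, the sketch below integrates sonic slowness over depth to build a two-way-time versus depth curve of the kind used in a well-to-seismic tie; the slowness values, depth range and datum shift are hypothetical, not data from the Bornu Basin wells.

```python
import numpy as np

# Sketch of a sonic-log time-depth relationship: integrating slowness over depth
# gives one-way time, doubled for two-way time (TWT).
depth = np.arange(1000.0, 3000.0, 0.5)                               # depth samples, m
slowness_us_per_m = np.interp(depth, [1000, 3000], [400.0, 250.0])   # sonic DT, µs/m

dt_one_way_s = slowness_us_per_m * 1e-6 * 0.5        # µs/m * 0.5 m sample interval -> s
one_way_time = np.cumsum(dt_one_way_s)
two_way_time = 2.0 * one_way_time + 0.8              # 0.8 s: assumed TWT to top of the logged interval

# Depth of a seismic reflector picked at, say, 1.95 s two-way time:
twt_pick = 1.95
print(f"reflector at {twt_pick} s TWT ~ {np.interp(twt_pick, two_way_time, depth):.0f} m depth")
```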

  6. Vestige: Maximum likelihood phylogenetic footprinting

    PubMed Central

    Wakefield, Matthew J; Maxwell, Peter; Huttley, Gavin A

    2005-01-01

    Background Phylogenetic footprinting is the identification of functional regions of DNA by their evolutionary conservation. This is achieved by comparing orthologous regions from multiple species and identifying the DNA regions that have diverged less than neutral DNA. Vestige is a phylogenetic footprinting package built on the PyEvolve toolkit that uses probabilistic molecular evolutionary modelling to represent aspects of sequence evolution, including the conventional divergence measure employed by other footprinting approaches. In addition to measuring the divergence, Vestige allows the expansion of the definition of a phylogenetic footprint to include variation in the distribution of any molecular evolutionary processes. This is achieved by displaying the distribution of model parameters that represent partitions of molecular evolutionary substitutions. Examination of the spatial incidence of these effects across regions of the genome can identify DNA segments that differ in the nature of the evolutionary process. Results Vestige was applied to a reference dataset of the SCL locus from four species and provided clear identification of the known conserved regions in this dataset. To demonstrate the flexibility to use diverse models of molecular evolution and dissect the nature of the evolutionary process Vestige was used to footprint the Ka/Ks ratio in primate BRCA1 with a codon model of evolution. Two regions of putative adaptive evolution were identified illustrating the ability of Vestige to represent the spatial distribution of distinct molecular evolutionary processes. Conclusion Vestige provides a flexible, open platform for phylogenetic footprinting. Underpinned by the PyEvolve toolkit, Vestige provides a framework for visualising the signatures of evolutionary processes across the genome of numerous organisms simultaneously. By exploiting the maximum-likelihood statistical framework, the complex interplay between mutational processes, DNA repair and

  7. My Journey with Learning Logs

    ERIC Educational Resources Information Center

    Hurst, Beth

    2005-01-01

    Learning logs, or reading response logs, have long been established as an effective reading strategy that helps students learn from text (Atwell, 1987; Blough & Berman, 1991; Calkins, 1986; Commander & Smith, 1996; Kuhrt & Farris, 1990; Reed, 1988; Sanders, 1985). In this paper, the author describes her experiences using learning logs as a…

  8. Log-Linear Models for Gene Association

    PubMed Central

    Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.

    2009-01-01

    We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032

  9. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... issued a demand letter for protective action. (b) Since the time required to drill and produce a... evaluate the characteristics and performance of the draining well; (2) Rig availability; (3) Well depth; (4... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...

  10. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... issued a demand letter for protective action. (b) Since the time required to drill and produce a... evaluate the characteristics and performance of the draining well; (2) Rig availability; (3) Well depth; (4... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...

  11. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... issued a demand letter for protective action. (b) Since the time required to drill and produce a... evaluate the characteristics and performance of the draining well; (2) Rig availability; (3) Well depth; (4... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...

  12. Likelihood of obtaining Structured Interview of Reported Symptoms (SIRS) and SIRS-2 elevations among forensic psychiatric inpatients with screening elevations on the Miller Forensic Assessment of Symptoms Test.

    PubMed

    Glassmire, David M; Tarescavage, Anthony M; Gottfried, Emily D

    2016-12-01

    The Miller Forensic Assessment of Symptoms Test (M-FAST) was designed as a screening measure for feigned psychiatric symptoms. When M-FAST Total Scores are elevated (raw score ≥6), the test manual recommends follow-up with a more comprehensive measure of feigning, such as the widely used and researched Structured Interview of Reported Symptoms (SIRS) or the revised version of the test (SIRS-2). The purpose of the current study was to evaluate how often M-FAST screening elevations are associated with subsequent elevations on the SIRS or SIRS-2. The sample included archival data from 100 forensic psychiatric inpatients who obtained M-FAST Total Score elevations ≥6 during screening and were subsequently administered the SIRS (which was also rescored using SIRS-2 criteria). Among examinees who elevated the M-FAST over the recommended cutoff, 66.0% met standard SIRS feigning criteria, 42% met SIRS-2 criteria for feigning, and 81.0% obtained at least 1 SIRS/SIRS-2 elevation in the Probable Feigning range or higher. These results are consistent with the M-FAST manual guidelines, which support the use of the ≥6 M-FAST cutoff score to screen for potential feigning (but not as an independent marker of feigning). A higher M-FAST cutoff score of ≥16 was associated with subsequently meeting full SIRS criteria for feigning in 100.0% of protocols. Because the SIRS criteria were designed to have very low false positive rates, these findings indicate that more confident assertions about feigning can be made when elevations reach this level on the M-FAST.

  13. Abundance estimation from multiple photo surveys: confidence distributions and reduced likelihoods for bowhead whales off Alaska.

    PubMed

    Schweder, Tore

    2003-12-01

    Maximum likelihood estimates of abundance are obtained from repeated photographic surveys of a closed stratified population with naturally marked and unmarked individuals. Capture intensities are assumed log-linear in stratum, year, and season. In the chosen model, an approximate confidence distribution for total abundance of bowhead whales, with an accompanying likelihood reduced of nuisance parameters, is found from a parametric bootstrap experiment. The confidence distribution depends on the assumed study protocol. A confidence distribution that is exact (except for the effect of discreteness) is found by conditioning in the unstratified case without unmarked individuals.

  14. 3D GPR Imaging of Wooden Logs

    NASA Astrophysics Data System (ADS)

    Halabe, Udaya B.; Pyakurel, Sandeep

    2007-03-01

    There has been a lack of an effective NDE technique to locate internal defects within wooden logs. The few available elastic wave propagation based techniques are limited to predicting E values. Other techniques such as X-rays have not been very successful in detecting internal defects in logs. If defects such as embedded metals could be identified before the sawing process, the saw mills could significantly increase their production by reducing the probability of damage to the saw blade and the associated downtime and the repair cost. Also, if the internal defects such as knots and decayed areas could be identified in logs, the sawing blade can be oriented to exclude the defective portion and optimize the volume of high valued lumber that can be obtained from the logs. In this research, GPR has been successfully used to locate internal defects (knots, decays and embedded metals) within the logs. This paper discusses GPR imaging and mapping of the internal defects using both 2D and 3D interpretation methodology. Metal pieces were inserted in a log and the reflection patterns from these metals were interpreted from the radargrams acquired using 900 MHz antenna. Also, GPR was able to accurately identify the location of knots and decays. Scans from several orientations of the log were collected to generate 3D cylindrical volume. The actual location of the defects showed good correlation with the interpreted defects in the 3D volume. The time/depth slices from 3D cylindrical volume data were useful in understanding the extent of defects inside the log.

  15. Log(s) physics results from CDF

    SciTech Connect

    Not Available

    1989-11-08

    The Collider Detector at Fermilab (CDF) is a large, azimuthally symmetric detector designed to study p̄p interactions at the Fermilab Tevatron Collider. Results are presented from data taken with a minimum bias trigger at √s = 630 and 1800 GeV during the 1987 run. The topics include the current analysis of dn/dη and some very preliminary results on short range pseudorapidity correlations and Bose-Einstein correlations. 7 refs., 5 figs., 2 tabs.

  16. Evaluation of the likelihood of a selective CHK1 inhibitor (LY2603618) to inhibit CYP2D6 with desipramine as a probe substrate in cancer patients.

    PubMed

    Hynes, Scott M; Wickremsinhe, Enaksha; Zhang, Wei; Decker, Rodney; Ott, Jennifer; Chandler, Jason; Mitchell, Malcolm

    2015-01-01

    LY2603618 is a selective inhibitor of deoxyribonucleic acid damage checkpoint kinase 1 (CHK1) and has been in development for the enhancement of chemotherapeutic agents. The study described here was designed to assess the potential interaction between LY2603618 and the cytochrome P450 isoform 2D6 (CYP2D6) substrate desipramine in patients with cancer. Before clinical investigation, in silico simulations (using Simcyp®) were conducted. An open-label, two-period, fixed-sequence study was planned in 30 patients with advanced or metastatic cancers, in which a 50 mg oral dose of desipramine was administered alone and in combination with 275 mg of LY2603618 (i.v. infusion). An interim analysis was planned after 15 patients completed both periods. Ratios of geometric least squares means (LSMs) of primary pharmacokinetic (PK) parameters and 90% repeated confidence intervals (RCIs) between desipramine plus LY2603618 and desipramine alone were calculated. Lack of an interaction was declared if the 90% RCI fell between 0.8 and 1.25. The LSM ratios (90% RCI) for the areas under the plasma concentration-time curve from time zero to tlast (AUC[0-tlast]) and to infinity (AUC[0-∞]) and the maximum plasma concentration (Cmax) were 1.14 (1.04, 1.25), 1.09 (0.99, 1.21) and 1.16 (1.05, 1.29), respectively. In silico simulations were predictive of clinical results. Single doses of 275 mg LY2603618 administered with 50 mg desipramine were generally well tolerated. In conclusion, no clinically significant interaction was observed between LY2603618 and desipramine in patients with cancer. In silico predictions of clinical results demonstrated that mechanistic and physiologically based PK approaches may inform clinical study design in cancer patients.
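
    The no-interaction criterion used above (the 90% interval for the ratio of geometric least squares means falling within 0.8-1.25) can be illustrated with a simplified paired calculation on log-transformed exposure data. This is only a sketch with made-up numbers and an ordinary 90% confidence interval, not the repeated confidence interval procedure of the study; scipy is assumed to be available.

      import numpy as np
      from scipy import stats

      def gmr_90ci(test, reference):
          """Geometric mean ratio (test/reference) with a 90% CI from paired, log-scale data."""
          d = np.log(np.asarray(test, float)) - np.log(np.asarray(reference, float))
          mean, se = d.mean(), d.std(ddof=1) / np.sqrt(len(d))
          tcrit = stats.t.ppf(0.95, df=len(d) - 1)
          return np.exp(mean), np.exp(mean - tcrit * se), np.exp(mean + tcrit * se)

      # hypothetical AUC(0-inf) values: desipramine + inhibitor vs. desipramine alone
      ratio, lo, hi = gmr_90ci([410, 385, 520, 440], [375, 360, 470, 415])
      no_interaction = (lo >= 0.8) and (hi <= 1.25)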

  17. Coal log pipeline: Development status of the first commercial system

    SciTech Connect

    Marrero, T.R.

    1996-12-31

    The coal log pipeline (CLP) is an innovative means for long-distance transportation of coal. In the CLP concept, coal is pressed into the form of cylinders--coal logs--that are propelled by water flowing through underground pipe. A coal log pipeline has many advantages when compared to coal transport by unit train, slurry pipeline and long-distance trucking: low cost, low energy consumption, low water consumption, simple dewatering at the pipeline exit, safety, and environmental friendliness. The coal logs travel butted together, as trains. Between the coal log "trains," some space is allowed for valve switching. The optimum diameter of a coal log is approximately 90 to 95% of the inside diameter of the pipe. The coal-to-water ratio is about 4 to 1. A 200 mm diameter CLP can transport about 2 million tonnes of coal per year. At their destination the coal logs come out of the pipeline onto a moving conveyor, which transports the logs to a crusher or stock pile. Coal logs are crushed to match the size of existing fuel. The water effluent is treated and reused at the power plant; there is no need for its discharge. Coal logs can be manufactured with or without the use of binder. By using less than 2 percent emulsified asphalt as binder, no heat is required to compact coal logs. Binderless coal logs can be compacted at less than 90°C. Compaction pressures, for coal logs made with or without binder, are about 70 MPa. The coal particle size distribution and moisture content must be controlled. The economics of the coal log pipeline system have been studied. Results indicate that a new coal log pipeline is cost-competitive with existing railroads for distances greater than approximately 80 km. CLP is much more economical than a coal slurry pipeline of the same diameter. This paper describes the current R&D and commercialization plan for CLP. 4 refs.

  18. Petrophysical, Structural, and Hydrogeological Characteristics of the Alpine Fault Hanging Wall Based on DFDP-2 Wireline Logging, Temperature, and Hydraulic Measurements

    NASA Astrophysics Data System (ADS)

    Townend, J.; Sutherland, R.; Toy, V.; Doan, M. L.; Celerier, B. P.; Massiot, C.; Coussens, J.; Capova, L.; Jeppson, T.

    2015-12-01

    Fault rock assemblages reflect the combined effects of interaction between deformation, stress, temperature, fluid, and chemical regimes on distinct spatial and temporal scales at various positions in the crust. Here we summarize the petrophysical, structural, and hydrogeological characteristics of the Alpine Fault's hanging wall based on measurements made during the second stage of the Deep Fault Drilling Project (DFDP-2) in late 2014. In total, more than 19 km of wireline logging data including 4.8 km of borehole televiewer imagery were acquired during the three-month drilling operation, which reached a maximum depth of 893 m (817 m true vertical depth). In conjunction with manual and automatic measurements of hydraulic parameters, on-site analysis of rock cuttings and fluid geochemistry, and post-drilling temperature measurements, the wireline data permit detailed analysis of bulk rock properties, fracturing, and fault zone structure. We focus in particular on understanding how the hydraulic conductivity of individual fractures relates to the high bulk conductivity of the hanging-wall, and thus to the transport of fluids and heat through the shallow crust.

  19. Field project to obtain pressure core, wireline log, and production test data for evaluation of CO2 flooding potential. Texas Pacific Bennett Ranch Unit well No. 310, Wasson (San Andres) Field, Yoakum County, Texas

    SciTech Connect

    Swift, T.E.; Goodrich, J.H.; Kumar, R.M.; McCoy, R.L.; Wilhelm, M.H.; Glascock, M.R.

    1982-01-01

    The coring, logging and testing of Bennett Ranch Unit well No. 310 was a cooperative effort between Texas Pacific, owner of the well, and Gruy Federal, Inc. The requirements of the contract, which are summarized in Enclosure 1, Appendix A, include drilling and coring activities. The pressure-coring and associated logging and testing programs in selected wells are intended to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO2 injection might have a high probability of success. This report presents detailed information on the first such project. This project demonstrates the usefulness of integrating pressure core, log and production data to realistically evaluate a reservoir for carbon dioxide flood. The engineering of tests and analysis of such experimental data requires original thinking, but the reliability of the results is higher than data derived from conventional tests.

  20. 3. Log bunkhouse (far left), log chicken house (left of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Log bunkhouse (far left), log chicken house (left of center), equipment shed (center), and workshop (far right). View to northwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  1. Censored Median Regression and Profile Empirical Likelihood

    PubMed Central

    Subramanian, Sundarraman

    2007-01-01

    We implement profile empirical likelihood based inference for censored median regression models. Inference for any specified sub-vector is carried out by profiling out the nuisance parameters from the “plug-in” empirical likelihood ratio function proposed by Qin and Tsao. To obtain the critical value of the profile empirical likelihood ratio statistic, we first investigate its asymptotic distribution. The limiting distribution is a sum of weighted chi square distributions. Unlike for the full empirical likelihood, however, the derived asymptotic distribution has intractable covariance structure. Therefore, we employ the bootstrap to obtain the critical value, and compare the resulting confidence intervals with the ones obtained through Basawa and Koul’s minimum dispersion statistic. Furthermore, we obtain confidence intervals for the age and treatment effects in a lung cancer data set. PMID:19112527

  2. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    PubMed Central

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-01-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US–Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552

  3. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    NASA Astrophysics Data System (ADS)

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-04-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  4. Maximum-Likelihood Detection Of Noncoherent CPM

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depends only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures, complexity of which would depend on N.

  5. Quasi-likelihood for Spatial Point Processes

    PubMed Central

    Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus

    2014-01-01

    Summary Fitting regression models for intensity functions of spatial point processes is of great interest in ecological and epidemiological studies of association between spatially referenced events and geographical or environmental covariates. When Cox or cluster process models are used to accommodate clustering not accounted for by the available covariates, likelihood based inference becomes computationally cumbersome due to the complicated nature of the likelihood function and the associated score function. It is therefore of interest to consider alternative, more easily computable, estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation which in practice is solved numerically. The derivation of the optimal estimating function has close similarities to the derivation of quasi-likelihood for standard data sets. The approximate solution is further equivalent to a quasi-likelihood score for binary spatial data. We therefore use the term quasi-likelihood for our optimal estimating function approach. We demonstrate in a simulation study and a data example that our quasi-likelihood method for spatial point processes is both statistically and computationally efficient. PMID:26041970

  6. Prioritizing Rare Variants with Conditional Likelihood Ratios

    PubMed Central

    Li, Weili; Dobbins, Sara; Tomlinson, Ian; Houlston, Richard; Pal, Deb K.; Strug, Lisa J.

    2016-01-01

    Background Prioritizing individual rare variants within associated genes or regions often consists of an ad hoc combination of statistical and biological considerations. From the statistical perspective, rare variants are often ranked using Fisher’s exact p values, which can lead to different rankings of the same set of variants depending on whether 1- or 2-sided p values are used. Results We propose a likelihood ratio-based measure, maxLRc, for the statistical component of ranking rare variants under a case-control study design that avoids the hypothesis-testing paradigm. We prove analytically that the maxLRc is always well-defined, even when the data has zero cell counts in the 2×2 disease-variant table. Via simulation, we show that the maxLRc outperforms Fisher’s exact p values in most practical scenarios considered. Using next-generation sequence data from 27 rolandic epilepsy cases and 200 controls in a region previously shown to be linked to and associated with rolandic epilepsy, we demonstrate that rankings assigned by the maxLRc and exact p values can differ substantially. Conclusion The maxLRc provides reliable statistical prioritization of rare variants using only the observed data, avoiding the need to specify parameters associated with hypothesis testing that can result in ranking discrepancies across p value procedures; and it is applicable to common variant prioritization. PMID:25659987
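
    As a rough, generic illustration of likelihood-ratio ranking for a 2x2 disease-variant table (not the authors' maxLRc statistic, whose exact definition is given in the paper), the sketch below compares the binomial likelihood under separate carrier frequencies with the likelihood under a common frequency; zero cells cause no difficulty because 0*log(0) is taken as 0.

      import math

      def binom_loglik(k, n, p):
          """Binomial log-likelihood up to a constant, with the convention 0*log(0) = 0.
          In lr_2x2 below, p is always an MLE consistent with the counts, so log(0) is never hit."""
          t1 = k * math.log(p) if k > 0 else 0.0
          t2 = (n - k) * math.log(1.0 - p) if n - k > 0 else 0.0
          return t1 + t2

      def lr_2x2(k_case, n_case, k_ctrl, n_ctrl):
          """Likelihood ratio: separate case/control carrier frequencies vs. a pooled one."""
          p_case, p_ctrl = k_case / n_case, k_ctrl / n_ctrl
          p_pool = (k_case + k_ctrl) / (n_case + n_ctrl)
          l_alt = binom_loglik(k_case, n_case, p_case) + binom_loglik(k_ctrl, n_ctrl, p_ctrl)
          l_null = binom_loglik(k_case, n_case, p_pool) + binom_loglik(k_ctrl, n_ctrl, p_pool)
          return math.exp(l_alt - l_null)

      # e.g. a variant carried by 3 of 27 cases and 0 of 200 controls
      print(lr_2x2(3, 27, 0, 200))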

  7. Likelihood analysis of the Local Group acceleration

    NASA Astrophysics Data System (ADS)

    Schmoldt, I.; Branchini, E.; Teodoro, L.; Efstathiou, G.; Frenk, C. S.; Keeble, O.; McMahon, R.; Maddox, S.; Oliver, S.; Rowan-Robinson, M.; Saunders, W.; Sutherland, W.; Tadros, H.; White, S. D. M.

    1999-04-01

    We compute the acceleration of the Local Group using 11 206 IRAS galaxies from the recently completed all-sky PSCz redshift survey. Measuring the acceleration vector in redshift space generates systematic uncertainties caused by the redshift-space distortions in the density field. We therefore assign galaxies to their real-space positions by adopting a non-parametric model for the velocity field that relies solely on the linear gravitational instability (GI) and linear biasing hypotheses. Remaining systematic contributions to the measured acceleration vector are corrected for by using PSCz mock catalogues from N-body experiments. The resulting acceleration vector points ~15° away from the CMB dipole apex, with a remarkable alignment between small- and large-scale contributions. A considerable fraction (~65 per cent) of the measured acceleration is generated within 40 h^-1 Mpc, with a non-negligible contribution from scales between 90 and 140 h^-1 Mpc, after which the acceleration amplitude seems to have converged. The Local Group acceleration from PSCz appears to be consistent with the one determined from the IRAS 1.2-Jy galaxy catalogue once the different contributions from shot noise have been taken into account. The results are consistent with the gravitational instability hypothesis and do not indicate any strong deviations from the linear biasing relation on large scales. A maximum-likelihood analysis of the cumulative PSCz dipole is performed within a radius of 150 h^-1 Mpc, in which we account for non-linear effects, shot noise and finite sample size. The aim is to constrain the β = Ω^{0.6}/b parameter and the power spectrum of density fluctuations. We obtain β = 0.70^{+0.35}_{-0.2} at the 1σ confidence level. The likelihood analysis is not very sensitive to the shape of the power spectrum, because of the rise in the amplitude of the dipole beyond 40 h^-1 Mpc and the increase in shot noise on large scales. There is, however, a weak indication that within the

  8. Well Logging with Californium-252

    SciTech Connect

    Boulogne, A.R.

    2003-01-06

    Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.

  9. Logs Wanted - Dead or Alive

    NASA Astrophysics Data System (ADS)

    Schuchardt, A.; Morche, D.

    2015-12-01

    Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel-blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design intended to decipher the short- to long-term (c. 10^-2 to 10^2 years) sediment (dis)connectivity effects of large wood. The study areas are two basins in mountain ranges in Germany and Austria. In Austria the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are: • to identify important triggers for LW transport from slopes into the channels • to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements • to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts • to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and measuring bed load up- and downstream of log jams using a portable Helley

  10. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, Timothy S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters; one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The "standard" and "quick look" Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.
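
    The resistivity-log part of the workflow described above can be sketched with the standard Archie relation. This is a generic illustration: the constants a, m and n below are textbook defaults, not the calibrated values used in the study.

      def archie_hydrate_saturation(rt, rw, phi, a=1.0, m=2.0, n=2.0):
          """Water saturation from the Archie equation, with gas-hydrate saturation taken
          as the remaining pore volume, Sh = 1 - Sw.

          rt  -- deep resistivity from the log (ohm-m)
          rw  -- formation water resistivity (ohm-m)
          phi -- porosity (fraction), e.g. from the density log
          """
          sw = (a * rw / (phi ** m * rt)) ** (1.0 / n)
          return 1.0 - min(sw, 1.0)

      # e.g. rt = 50 ohm-m, rw = 0.4 ohm-m, phi = 0.35  ->  Sh of roughly 0.74
      print(archie_hydrate_saturation(50.0, 0.4, 0.35))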

  11. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, T.S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters; one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The "standard" and "quick look" Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in all of the gas hydrate accumulations assessed in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.

  12. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C.; Zimmermann, A.; Korup, O.; Iroume, A.; Francke, T.; Bronstert, A.

    2013-12-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3-min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km²) forested Chilean catchments that were logged during either the rainy or the dry season. The QRF method builds on the Random Forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors, which in turn provides model uncertainties. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. Dry-season logging in particular dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during moderate events. We conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
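
    A minimal sketch of the quantile-regression-forest idea (leaf-weighted conditional quantiles on top of an ordinary random forest) is given below, with synthetic stand-in data; scikit-learn is assumed, and the hydrological predictors and 3-min sampling of the study are not reproduced.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def qrf_predict(rf, X_train, y_train, X_new, quantiles=(0.1, 0.5, 0.9)):
          """Conditional quantiles from a fitted forest by weighting the training responses
          that share a leaf with each new observation (Meinshausen-style weighting)."""
          leaves_train = rf.apply(X_train)              # (n_train, n_trees) leaf indices
          leaves_new = rf.apply(X_new)                  # (n_new, n_trees)
          y_train = np.asarray(y_train, float)
          out = np.empty((len(X_new), len(quantiles)))
          order = np.argsort(y_train)
          for i, row in enumerate(leaves_new):
              w = np.zeros(len(y_train))
              for t in range(leaves_train.shape[1]):
                  in_leaf = leaves_train[:, t] == row[t]
                  w[in_leaf] += 1.0 / in_leaf.sum()     # equal weight within each shared leaf
              cdf = np.cumsum(w[order] / leaves_train.shape[1])
              out[i] = [y_train[order][np.searchsorted(cdf, q)] for q in quantiles]
          return out

      rng = np.random.default_rng(0)
      X = rng.uniform(size=(500, 3))                    # stand-in for discharge/rainfall predictors
      y = 10 * X[:, 0] + rng.gamma(2.0, 1.0, 500)       # stand-in for sediment concentration
      rf = RandomForestRegressor(n_estimators=100, min_samples_leaf=5, random_state=0).fit(X, y)
      print(qrf_predict(rf, X, y, X[:2]))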

  13. A complete X-ray sample of the high-latitude (|b| > 20°) sky from HEAO 1 A-2 - Log N-log S and luminosity functions

    NASA Technical Reports Server (NTRS)

    Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.

    1982-01-01

    An all-sky survey of X-ray sources was performed, complete to a limiting sensitivity of 3.1 × 10^-11 ergs/sq cm/s in the 2-10 keV band. The complete sample has allowed construction of luminosity functions based on a flux-limited sample for clusters of galaxies and active galactic nuclei. Integration of the best-fit luminosity functions indicates that clusters of galaxies contribute about 4% of the 2-10 keV DXRB, and active galactic nuclei about 20%. It is predicted that many of the objects seen in the deep survey should be local, relatively low luminosity active galactic nuclei and clusters of galaxies.

  14. [Using log-binomial model for estimating the prevalence ratio].

    PubMed

    Ye, Rong; Gao, Yan-hui; Yang, Yi; Chen, Yue

    2010-05-01

    To estimate prevalence ratios using a log-binomial model with or without continuous covariates. Prevalence ratios for individuals' attitude towards smoking-ban legislation associated with smoking status, estimated using a log-binomial model, were compared with odds ratios estimated by a logistic regression model. In the log-binomial modeling, the maximum likelihood method was used when there were no continuous covariates, and the COPY approach was used if the model did not converge, for example due to the existence of continuous covariates. We examined the association between individuals' attitude towards smoking-ban legislation and smoking status in men and women. Prevalence ratio and odds ratio estimation provided similar results for the association in women, since smoking was not common. In men, however, the odds ratio estimates were markedly larger than the prevalence ratios due to a higher prevalence of the outcome. The log-binomial model did not converge when age was included as a continuous covariate, and the COPY method was used to deal with this situation. All analyses were performed in SAS. The prevalence ratio seemed to better measure the association than the odds ratio when prevalence is high. SAS programs were provided to calculate prevalence ratios with or without continuous covariates in log-binomial regression analysis.
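
    The contrast described above can be sketched with a small simulation, assuming statsmodels is available; the COPY approach used when the log-binomial fit fails to converge (e.g. with a continuous age covariate) is not reproduced here.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      df = pd.DataFrame({"smoker": rng.integers(0, 2, 2000)})
      df["favors_ban"] = rng.binomial(1, 0.45 + 0.15 * df["smoker"])   # common outcome

      # Log-binomial model: exp(coefficient) is a prevalence ratio
      pr_fit = smf.glm("favors_ban ~ smoker", data=df,
                       family=sm.families.Binomial(link=sm.families.links.Log())).fit()

      # Ordinary logistic model: exp(coefficient) is an odds ratio, which overstates
      # the association when the outcome is common
      or_fit = smf.glm("favors_ban ~ smoker", data=df,
                       family=sm.families.Binomial()).fit()

      print("PR:", np.exp(pr_fit.params["smoker"]), "OR:", np.exp(or_fit.params["smoker"]))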

  15. Likelihood Principle and Maximum Likelihood Estimator of Location Parameter for Cauchy Distribution.

    DTIC Science & Technology

    1986-05-01

    consistency (or strong consistency) of maximum likelihood estimator has been studied by many researchers, for example, Wald (1949), Wolfowitz (1953, 1965...20, 595-601. [25] Wolfowitz, J. (1953). The method of maximum likelihood and Wald theory of decision functions. Indag. Math., Vol. 15, 114-119. [26...Probability Letters Vol. 1, No. 3, 197-202. [24] Wald, A. (1949). Note on the consistency of maximum likelihood estimates. Ann. Math. Statist., Vol

  16. A Strategy for Interpretation of Microearthquake Tomography Results in the Salton Sea Geothermal Field Based upon Rock Physics Interpretations of State 2-14 Borehole Logs

    SciTech Connect

    Bonner, B; Hutchings, L; Kasameyer, P

    2006-06-14

    We devise a strategy for analysis of Vp and Vs microearthquake tomography results in the Salton Sea geothermal field to identify important features of the geothermal reservoir. We first interpret rock properties in the State 2-14 borehole based upon logged core through the reservoir. Then, we interpret seismic recordings in the well (Daley et al., 1988) to develop the strategy. We hypothesize that mapping Poisson's ratio has two applications for the Salton Sea geothermal reservoir: (1) to map the top of the reservoir, and (2) as a diagnostic for permeable zones. Poisson's ratio can be obtained from Vp and Vs. In the State 2-14 borehole, Poisson's ratio calculated from large-scale averages (~150 m) shows a monotonic decrease with depth to about 1300 m, at which point it increases with depth. Our model is that the monotonic decrease is due to compaction, and the increase below 1300 m is due to the rocks being hydrothermally altered. We hypothesize that we can map the depth to alteration by identifying the transition from decreasing to increasing values, and thus map the top of the reservoir, which is associated with a known increase in sulfite, chlorite, and epidote alteration that may be indicative of hydrothermal activity. We also observe (from the plots of Daley et al.) an anomalous drop in Poisson's ratio at a depth of about 900 m, within a sandstone formation. The sandstone has a P-wave velocity significantly higher than the siltstone above it but a lower velocity in the lower half of the formation relative to the upper half. We interpret the relative decrease in velocity to be due to fracturing and chemical alteration caused by permeability. We conclude that using Vp and Vs tomography results to obtain images of Poisson's ratio has the potential to identify significant features in the geothermal reservoir in this geologic setting. Seismic attenuation tomography results (mapped as Qp and Qs) should also be useful for evaluating geothermal reservoirs, but that is not
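
    The Vp/Vs-to-Poisson's-ratio mapping that the strategy relies on is the standard isotropic elasticity relation; the sketch below is generic, and the example velocities are illustrative rather than State 2-14 values.

      import numpy as np

      def poissons_ratio(vp, vs):
          """Poisson's ratio from P- and S-wave velocities (any consistent units)."""
          vp, vs = np.asarray(vp, float), np.asarray(vs, float)
          return (vp ** 2 - 2.0 * vs ** 2) / (2.0 * (vp ** 2 - vs ** 2))

      # e.g. Vp = 3.2 km/s, Vs = 1.7 km/s  ->  about 0.30
      print(poissons_ratio(3.2, 1.7))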

  17. MAIL LOG, program summary and specifications

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The summary and specifications needed to obtain the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.

  18. Comparison of Ethanedinitrile (C2N2) and Metam Sodium for Control of Bursaphelenchus xylophilus (Nematoda: Aphelenchidae) and Monochamus alternatus (Coleoptera: Cerambycidae) in Naturally Infested Logs at Low Temperatures.

    PubMed

    Park, Chung Gyoo; Son, Joon-Ki; Lee, Byung-Ho; Cho, Jin Hoon; Ren, Yonglin

    2014-12-01

    Bursaphelenchus xylophilus, commonly known as the pinewood nematode in Japan, is a quarantine pest and is most often associated with beetles of the genus Monochamus, the pine sawyers, particularly Monochamus alternatus. Long-distance dispersal of the nematode and its vectors has led to widespread losses in pine forests. Two fumigation trials were conducted for treatment of logs naturally infested with both M. alternatus and B. xylophilus. The logs were treated with ethanedinitrile or metam sodium at low temperature (-7 to 25.7°C and -3.7 to 23.1°C) for a 3-d exposure in winter and early spring. Fumigation with ethanedinitrile at concentrations of 48, 68, 97 and 158 g/m3 resulted in 34.6-58.3, 91.5-97.2, 100, and 100% mortality for M. alternatus and 88.4, 77.9, 96.4, and 98.0% mortality for B. xylophilus, respectively. With metam sodium fumigation at a dose rate of 1162 g/m3, 100% of M. alternatus and 97.4% of B. xylophilus were killed. These results suggest that 97 g/m3 of ethanedinitrile is adequate for complete control of M. alternatus in pine wood, that >158 g/m3 is required for eradication of B. xylophilus in low-temperature fumigation, and that a dosage rate of 158 g/m3 gives complete control of M. alternatus and control of >98% of B. xylophilus in winter or spring fumigation. Therefore, ethanedinitrile has great potential for treatment of fresh pine logs to manage the nematodes and the vector insects at low temperature.

  19. Likelihood Analysis of Supersymmetric SU(5) GUTs

    SciTech Connect

    Bagnaschi, E.; Costa, J. C.; Sakurai, K.; Borsato, M.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; De Roeck, A.; Dolan, M. J.; Ellis, J. R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Martínez Santos, D.; Olive, K. A.; Richards, A.; de Vries, K. J.; Weiglein, G.

    2016-10-31

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel ${\tilde u_R}/{\tilde c_R} - \tilde{\chi}^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ${\tilde

  20. Likelihood analysis of supersymmetric SU(5) GUTs

    NASA Astrophysics Data System (ADS)

    Bagnaschi, E.; Costa, J. C.; Sakurai, K.; Borsato, M.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; De Roeck, A.; Dolan, M. J.; Ellis, J. R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Martínez Santos, D.; Olive, K. A.; Richards, A.; de Vries, K. J.; Weiglein, G.

    2017-02-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass m_{1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m_5 and m_{10}, and for the 5 and bar{5} Higgs representations m_{H_u} and m_{H_d}, a universal trilinear soft SUSY-breaking parameter A_0, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel tilde{u}_R/tilde{c}_R - tilde{χ}^0_1 coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of tilde{ν}_τ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

  1. Maintained Individual Data Distributed Likelihood Estimation (MIDDLE)

    PubMed Central

    Boker, Steven M.; Brick, Timothy R.; Pritikin, Joshua N.; Wang, Yang; von Oertzen, Timo; Brown, Donald; Lach, John; Estabrook, Ryne; Hunter, Michael D.; Maes, Hermine H.; Neale, Michael C.

    2015-01-01

    Maintained Individual Data Distributed Likelihood Estimation (MIDDLE) is a novel paradigm for research in the behavioral, social, and health sciences. The MIDDLE approach is based on the seemingly-impossible idea that data can be privately maintained by participants and never revealed to researchers, while still enabling statistical models to be fit and scientific hypotheses tested. MIDDLE rests on the assumption that participant data should belong to, be controlled by, and remain in the possession of the participants themselves. Distributed likelihood estimation refers to fitting statistical models by sending an objective function and vector of parameters to each participants’ personal device (e.g., smartphone, tablet, computer), where the likelihood of that individual’s data is calculated locally. Only the likelihood value is returned to the central optimizer. The optimizer aggregates likelihood values from responding participants and chooses new vectors of parameters until the model converges. A MIDDLE study provides significantly greater privacy for participants, automatic management of opt-in and opt-out consent, lower cost for the researcher and funding institute, and faster determination of results. Furthermore, if a participant opts into several studies simultaneously and opts into data sharing, these studies automatically have access to individual-level longitudinal data linked across all studies. PMID:26717128
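
    A toy sketch of the distributed-likelihood loop described above, assuming a simple normal model and in-process "devices"; the real MIDDLE framework additionally handles consent management, device communication and optimization far more carefully.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      # Each participant's data stay on their own "device"; only a likelihood value leaves it.
      devices = [np.array([4.9, 5.3, 5.1]), np.array([6.2, 5.8]), np.array([5.0, 5.5, 5.9, 6.1])]

      def device_loglik(data, params):
          mu, log_sigma = params
          return norm.logpdf(data, mu, np.exp(log_sigma)).sum()

      def aggregate_negloglik(params):
          # the central optimizer sees only the per-device likelihood values, never the raw data
          return -sum(device_loglik(d, params) for d in devices)

      fit = minimize(aggregate_negloglik, x0=np.array([5.0, 0.0]))
      print("mu =", fit.x[0], "sigma =", np.exp(fit.x[1]))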

  2. Maintained Individual Data Distributed Likelihood Estimation (MIDDLE).

    PubMed

    Boker, Steven M; Brick, Timothy R; Pritikin, Joshua N; Wang, Yang; von Oertzen, Timo; Brown, Donald; Lach, John; Estabrook, Ryne; Hunter, Michael D; Maes, Hermine H; Neale, Michael C

    2015-01-01

    Maintained Individual Data Distributed Likelihood Estimation (MIDDLE) is a novel paradigm for research in the behavioral, social, and health sciences. The MIDDLE approach is based on the seemingly impossible idea that data can be privately maintained by participants and never revealed to researchers, while still enabling statistical models to be fit and scientific hypotheses tested. MIDDLE rests on the assumption that participant data should belong to, be controlled by, and remain in the possession of the participants themselves. Distributed likelihood estimation refers to fitting statistical models by sending an objective function and vector of parameters to each participant's personal device (e.g., smartphone, tablet, computer), where the likelihood of that individual's data is calculated locally. Only the likelihood value is returned to the central optimizer. The optimizer aggregates likelihood values from responding participants and chooses new vectors of parameters until the model converges. A MIDDLE study provides significantly greater privacy for participants, automatic management of opt-in and opt-out consent, lower cost for the researcher and funding institute, and faster determination of results. Furthermore, if a participant opts into several studies simultaneously and opts into data sharing, these studies automatically have access to individual-level longitudinal data linked across all studies.

  3. Log Polar Quantization.

    DTIC Science & Technology

    1979-10-01


  4. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...

  5. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...

  6. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...

  7. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...

  8. The fluid-compensated cement bond log

    SciTech Connect

    Nayfeh, T.H.; Wheelis, W.B. Jr.; Leslie, H.D.

    1986-08-01

    Simulations of cement bond logging (CBL) have shown that wellbore fluid effects can be segregated from sonic-signal response to changing cement strengths. Traditionally, the effects have been considered negligible and CBLs have been interpreted as if water were in the wellbore. However, large variations in CBLs have become apparent with the increasing number of logs run in completion fluids such as CaCl2, ZnBr2, and CaBr2. To study wellbore fluid effects, physical and numerical models were developed that simulated the wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings for a range of wellbore fluid types and for both densities and viscosities. Parallel numerical modeling used similar parameters. Results show that bond-log amplitudes varied dramatically with the wellbore fluid acoustic impedance; i.e., there was a 70% increase in signal amplitude for 11.5 lbm/gal (1370 kg/m3) CaCl2 over the signal amplitude in water. This led to the development of a fluid-compensated bond log that corrects the amplitude for the acoustic impedance of various wellbore fluids, thereby making the measurements more directly related to cement quality.

  9. Likelihood analysis of earthquake focal mechanism distributions

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2015-06-01

    In our paper published earlier we discussed forecasts of earthquake focal mechanism and ways to test the forecast efficiency. Several verification methods were proposed, but they were based on ad hoc, empirical assumptions, thus their performance is questionable. We apply a conventional likelihood method to measure the skill of earthquake focal mechanism orientation forecasts. The advantage of such an approach is that earthquake rate prediction can be adequately combined with focal mechanism forecast, if both are based on the likelihood scores, resulting in a general forecast optimization. We measure the difference between two double-couple sources as the minimum rotation angle that transforms one into the other. We measure the uncertainty of a focal mechanism forecast (the variability), and the difference between observed and forecasted orientations (the prediction error), in terms of these minimum rotation angles. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random (or equally probable). For 3-D rotation the random rotation angle distribution is not uniform. To better understand the resulting complexities, we calculate the information (likelihood) score for two theoretical rotational distributions (Cauchy and von Mises-Fisher), which are used to approximate earthquake source orientation pattern. We then calculate the likelihood score for earthquake source forecasts and for their validation by future seismicity data. Several issues need to be explored when analyzing observational results: their dependence on forecast and data resolution, internal dependence of scores on forecasted angle and random variability of likelihood scores. Here, we propose a simple tentative solution but extensive theoretical and statistical analysis is needed.
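
    For context on the non-uniform null mentioned above: if rotations are drawn uniformly from SO(3), the rotation angle Φ has the density below (a standard result); the additional symmetries of a double-couple source further restrict the minimum rotation angle between two mechanisms to at most 120°. The display is included as a reference point, not as the forecast model itself.

      f_{\Phi}(\phi) \;=\; \frac{1-\cos\phi}{\pi} \;=\; \frac{2}{\pi}\,\sin^{2}\!\left(\frac{\phi}{2}\right), \qquad 0 \le \phi \le \pi .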

  10. Factors Influencing Likelihood of Voice Therapy Attendance.

    PubMed

    Misono, Stephanie; Marmor, Schelomo; Roy, Nelson; Mau, Ted; Cohen, Seth M

    2017-03-01

    Objective To identify factors associated with the likelihood of attending voice therapy among patients referred for it in the CHEER (Creating Healthcare Excellence through Education and Research) practice-based research network infrastructure. Study Design Prospectively enrolled cross-sectional study. Setting CHEER network of community and academic sites. Methods Data were collected on patient-reported demographics, voice-related diagnoses, voice-related handicap (Voice Handicap Index-10), likelihood of attending voice therapy (VT), and opinions on factors influencing likelihood of attending VT. The relationships between patient characteristics/opinions and likelihood of attending VT were investigated. Results A total of 170 patients with various voice-related diagnoses reported receiving a recommendation for VT. Of those, 85% indicated that they were likely to attend it, regardless of voice-related handicap severity. The most common factors influencing likelihood of VT attendance were insurance/copay, relief that it was not cancer, and travel. Those who were not likely to attend VT identified, as important factors, unclear potential improvement, not understanding the purpose of therapy, and concern that it would be too hard. In multivariate analysis, factors associated with greater likelihood of attending VT included shorter travel distance, age (40-59 years), and being seen in an academic practice. Conclusions Most patients reported plans to attend VT as recommended. Patients who intended to attend VT reported different considerations in their decision making from those who did not plan to attend. These findings may inform patient counseling and efforts to increase access to voice care.

  11. New materials for fireplace logs

    NASA Technical Reports Server (NTRS)

    Kieselback, D. J.; Smock, A. W.

    1971-01-01

    Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.

  12. Numerical likelihood analysis of cosmic ray anisotropies

    SciTech Connect

    Carlos Hojvat et al.

    2003-07-02

    A numerical likelihood approach to the determination of cosmic ray anisotropies is presented which offers many advantages over other approaches. It allows a wide range of statistically meaningful hypotheses to be compared even when full sky coverage is unavailable, can be readily extended in order to include measurement errors, and makes maximum unbiased use of all available information.

  13. Growing local likelihood network: Emergence of communities

    NASA Astrophysics Data System (ADS)

    Chen, S.; Small, M.

    2015-10-01

    In many real situations, networks grow only via local interactions. New nodes are added to the growing network with information only pertaining to a small subset of existing nodes. Multilevel marketing, social networks, and disease models can all be depicted as growing networks based on local (network path-length) distance information. In these examples, all nodes whose distance from a chosen center is less than d form a subgraph. Hence, we grow networks with information only from these subgraphs. Moreover, we use a likelihood-based method, where at each step we modify the networks by changing their likelihood to be closer to the expected degree distribution. Combining the local information and the likelihood method, we grow networks that exhibit novel features. We discover that the likelihood method, over certain parameter ranges, can generate networks with highly modulated communities, even when global information is not available. Communities and clusters are abundant in real-life networks, and the method proposed here provides a natural mechanism for the emergence of communities in scale-free networks. In addition, the algorithmic implementation of network growth via local information is substantially faster than global methods and allows for the exploration of much larger networks.

  14. Synthesizing Regression Results: A Factored Likelihood Method

    ERIC Educational Resources Information Center

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-01-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…

  15. Assessing the Likelihood of Hand-to-Hand Cross-Transmission of Bacteria: An Experimental Study.

    PubMed

    Bellissimo-Rodrigues, Fernando; Pires, Daniela; Soule, Hervé; Gayet-Ageron, Angèle; Pittet, Didier

    2017-02-22

    BACKGROUND Although the hands of healthcare workers (HCWs) are implicated in most episodes of healthcare-associated infections, the correlation between hand contamination and the likelihood of cross-transmission remains unknown. METHODS We conducted a laboratory-based study involving pairs of HCWs. The hands of a HCW (transmitter) were contaminated with Escherichia coli ATCC 10536 before holding hands with another HCW (host) for 1 minute. Meanwhile, the unheld hand of the transmitter was sampled. Afterward, the host's held hand was also sampled. Each experiment consisted of 4 trials with increasing concentrations of E. coli (10^3-10^6 colony-forming units [cfu]/mL). The primary outcome was the likelihood of transmission of at least 1 cfu from transmitter to host. We used a mixed logistic regression model with a random effect on the subject to assess the association between transmission and bacterial count on the transmitter's hands. RESULTS In total, 6 HCWs performed 30 experiments and 120 trials. The bacterial counts recovered from host hands were directly associated with the bacterial counts on transmitter hands, and transmission was more likely when transmitter contamination exceeded 1 log10 cfu (>1 and ≤3 log10 cfu) than when it was ≤1 log10 cfu. When transmitter contamination was <1 log10 cfu, no cross-transmission was detected. CONCLUSION There is a direct relationship between the bacterial burden on HCWs' hands and the likelihood of cross-transmission. Under the described conditions, at least 1 log10 cfu must be present on HCW hands to be potentially transmitted. Further studies are needed at the low contamination range. Infect Control Hosp Epidemiol 2017;1-6.
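
    A minimal sketch of the kind of dose-response model the abstract describes, assuming invented counts and outcomes: an ordinary logistic regression of transmission on the transmitter's log10 bacterial count. The study itself used a mixed model with a random subject effect, which is omitted here for brevity.

      # Ordinary logistic regression of cross-transmission (0/1) on the log10
      # bacterial count of the transmitter's hand. The published analysis also
      # included a random effect per subject, which this sketch leaves out.
      import numpy as np
      import statsmodels.api as sm

      # hypothetical example data: log10 cfu on transmitter hands, transmission outcome
      log10_cfu = np.array([0.5, 1.2, 1.8, 2.4, 3.1, 3.6, 4.2, 4.8, 5.5, 6.0])
      transmitted = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

      X = sm.add_constant(log10_cfu)          # intercept + slope on log10 count
      fit = sm.Logit(transmitted, X).fit(disp=0)
      print(fit.params)                        # positive slope: higher counts, higher odds
      print(fit.predict(sm.add_constant(np.array([1.0, 3.0, 5.0]))))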

  16. Predicting reservoir wettability via well logs

    NASA Astrophysics Data System (ADS)

    Feng, Cheng; Fu, Jinhua; Shi, Yujiang; Li, Gaoren; Mao, Zhiqiang

    2016-06-01

    Wettability is an important factor in controlling the distribution of oil and water. However, its evaluation has so far been a difficult problem because no log data can directly indicate it. In this paper, a new method is proposed for quantitatively predicting reservoir wettability via well log analysis. Specifically, based on the J function, diagenetic facies classification and the piecewise power functions, capillary pressure curves are constructed from conventional logs and a nuclear magnetic resonance (NMR) log respectively. Under the influence of wettability, the latter is distorted while the former remains unaffected. Therefore, the ratio of the median radius obtained from the two kinds of capillary pressure curve is calculated to reflect wettability, a quantitative relationship between the ratio and reservoir wettability is then established. According to the low-permeability core sample capillary pressure curve, NMR {{T}2} spectrum and contact angle experimental data from the bottom of the Upper Triassic reservoirs in western Ordos Basin, China, two kinds of constructing capillary pressure curve models and a predictive wettability model are calibrated. The wettability model is verified through the Amott wettability index and saturation exponent from resistivity measurement and their determined wettability levels are comparable, indicating that the proposed model is quite reliable. In addition, the model’s good application effect is exhibited in the field study. Thus, the quantitatively predicting reservoir wettability model proposed in this paper provides an effective tool for formation evaluation, field development and the improvement of oil recovery.

  17. Vacuum-soaking of wood chip shiitake (Lentinula edodes) logs to reduce soak time and log weight variability and to stimulate mushroom yield.

    PubMed

    Royse, D J; Rhodes, T W; Sanchez, J E

    2002-01-01

    Synthetic logs were vacuum-soaked or regular-soaked to determine the effects of soaking on yield and mushroom size, log weight variability and water distribution within the log. Yields (g/log) from substrates vacuum-soaked were higher by 26.7%, 18.6% and 35.8% (mean = 27.2%) for crops I, II and III, respectively, when compared with regular-soaked. However, mushroom size averaged only 11.2 g for vacuum-soaked logs vs 17 g for regular-soaked logs (51.8% larger for regular-soaked). The time required for vacuum-soaking logs was generally less than 3 min, compared with regular-soaking times ranging over 3-15 h. Water tended to accumulate more in the outside zone in the vacuum-soaked logs, compared with regular-soaked logs. Mean moisture contents for crops I and II for outside, middle and interior zones of vacuum-soaked logs were 66%, 47.5% and 42.2%, respectively, while regular-soaked logs for the same zones were 62.4%, 52.1% and 50.9%, respectively. Vacuum-soaked log weights had lower standard deviations than weights for regular-soaked logs in four out of six soaks, indicating a more uniform soaking process.

  18. Well Logging and Logging Analysis of UHP metamorphic Rocks in CCSD Main Hole (0-2000m)

    NASA Astrophysics Data System (ADS)

    Pan, H.; Niu, Y.; Wang, W.; Zhu, L.; Xu, D.; Wu, H.; Li, S.; Luo, M.

    2004-12-01

    CCSD logging engineering brings together many modern technologies and employs various advanced logging tools to survey the borehole sidewall continuously, yielding physical, chemical, geometrical, and other in-situ information along the borehole profile. Well logging is therefore one of the most important and pivotal technologies in the CCSD project. The main logging methods in CCSD-MH (0-2000 m) are laterolog (Rd, Rs), gamma ray (GR), natural gamma spectrometry (U, Th, K), density (DEN), photoelectric factor (Pe), compensated neutron (CNL), multipole array acoustic (Vp, Vs, Vst), simultaneous acoustic-resistivity imaging (Star-II), temperature (T), magnetic susceptibility (MS), three-component borehole magnetic, and redox potential logs. The various metamorphic rocks can be classified from the logging curves, and their physical parameters can be obtained by analyzing and statistically summarizing the log responses of the different metamorphic rocks. Logging cross plots allow the clustering of the metamorphic rocks' physical properties to be studied. Five lithologic segments can be distinguished from the logging curves. The GR, Th, U, and K values of segment 1 are lower than those of the third, fourth, and fifth segments but higher than those of segment 2; the DEN and Pe values of segment 1 are higher than those of the third, fourth, and fifth segments. The main rocks in segments 1-5 are eclogites, serpentinites, paragneiss, orthogneiss, and eclogites (containing silicon and muscovite), respectively. Eclogite generally contains rutile, silicon, muscovite, and other minerals, which produce clear responses on the log curves. Mineralized layers containing rutile, ilmenite, or pyrite can be well delineated using the DEN, Pe, and susceptibility log values; the rutile-mineralized layer, for example, shows distinctly high density and Pe on the logging curves. The key quantity for the synthetic seismic record is wave impedance, which in this paper is calculated from the AC and DEN curves.

  19. Asymptotic variance of flood quantile in log Pearson Type III distribution with historical information

    NASA Astrophysics Data System (ADS)

    Pilon, Paul J.; Adamowski, Kaz

    1993-03-01

    Maximum likelihood and censored sample theory are applied for flood frequency analysis purposes to the log Pearson Type III (LP3) distribution. The logarithmic likelihood functions are developed and solved in terms of fully specified floods, historical information, and parameters to be estimated. The asymptotic standard error of estimate of the T-year flood is obtained using the general equation for the variance of estimate of a function. The variances and covariances of the parameters are obtained through inversion of Fisher's information matrix. Monte Carlo studies to verify the accuracy of the derived asymptotic expression for the standard errors of the 10-, 50-, 100-, and 500-year floods indicate that these are accurate for both Type I and Type II censored samples, while the bias is less than 2.5%. Subsequently, the Type II censored data were subjected to a random, multiplicative error. Results indicate that historical information contributes greatly to the accuracy of estimation of the quantiles even when the error of its measurement becomes excessive.
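
    The censored-likelihood idea can be sketched as follows, under simplifying assumptions that are not the paper's exact formulation: systematic floods contribute log-density terms, and the historical period contributes a binomial term for the number of years exceeding a perception threshold. All flows, thresholds, and record lengths below are invented.

      # Sketch of a censored log-likelihood for the LP3 distribution: fully
      # specified floods enter through the log-density of log10(Q), while the
      # historical period only says that k of h years exceeded a threshold.
      import numpy as np
      from scipy import stats, optimize

      sys_floods = np.array([820., 1150., 640., 2300., 980., 1560., 750., 1890.])  # m^3/s
      h_years, k_exceed, threshold = 120, 3, 2500.          # censored historical record

      def neg_log_lik(params):
          skew, loc, scale = params
          if scale <= 0:
              return np.inf
          dist = stats.pearson3(skew, loc=loc, scale=scale)   # fitted to log10(Q)
          ll = dist.logpdf(np.log10(sys_floods)).sum()
          p_exceed = dist.sf(np.log10(threshold))             # P(log10 Q > log10 threshold)
          ll += stats.binom.logpmf(k_exceed, h_years, p_exceed)
          return -ll

      x0 = (0.1, np.log10(sys_floods).mean(), np.log10(sys_floods).std())
      res = optimize.minimize(neg_log_lik, x0, method="Nelder-Mead")
      print(res.x)   # ML estimates of (skew, loc, scale) on the log10 scale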

  20. Log of Trench 04A Across the Hayward Fault at Tyson's Lagoon (Tule Pond), Fremont, Alameda County, California

    USGS Publications Warehouse

    Lienkaemper, James J.; Williams, Patrick L.; Sickler, Robert R.; Fumal, Thomas E.

    2005-01-01

    This publication makes available a detailed trench log (sheets 1 and 2) of a 110-m trench we excavated in 2004 across a tectonic sag pond in the Hayward fault zone. Also included are revised stratigraphic unit descriptions from this fifth field season of subsurface investigation of the Hayward fault at Tyson's Lagoon (Tule Pond). Preliminary findings based on fieldwork done in 2000 have been published (Lienkaemper and others: data archive, 2002a; report, 2002b), as were the logs and data for 2001-2003 (Lienkaemper and others, 2003, L03). A continuous exposure of the geologic section across the entire pond made in 2004 (Fig. 1, 04A) has revealed some critical miscorrelations of units made in the original on-line version of L03, hence users of these earlier trench data should only use the 2005 revised version 2.0 of L03 for correlation purposes. Lienkaemper, Williams, and Sickler interpreted the geology and logged the trenches. Fumal did most of the trench photography. The Hayward fault is recognized to be among the most hazardous in the United States (Working Group on California Earthquake Probabilities, 2003). Establishing a chronology of prehistoric or paleoearthquakes is of immediate use in resolving the likelihood of future large earthquakes on the Hayward fault. This document makes available geologic evidence for historical and prehistoric surface-rupturing earthquakes preserved at the site. A second, formal report on our conclusions based on these data is in preparation.

  1. Maximum Likelihood DOA Estimation of Multiple Wideband Sources in the Presence of Nonuniform Sensor Noise

    NASA Astrophysics Data System (ADS)

    Chen, C. E.; Lorenzelli, F.; Hudson, R. E.; Yao, K.

    2007-12-01

    We investigate the maximum likelihood (ML) direction-of-arrival (DOA) estimation of multiple wideband sources in the presence of unknown nonuniform sensor noise. A new closed-form expression for the direction-estimation Cramér-Rao bound (CRB) has been derived. The performance of the conventional wideband uniform ML estimator under nonuniform noise has been studied. In order to mitigate the performance degradation caused by the nonuniformity of the noise, a new deterministic wideband nonuniform ML DOA estimator is derived and two associated processing algorithms are proposed. The first algorithm is based on an iterative procedure which stepwise concentrates the log-likelihood function with respect to the DOAs and the noise nuisance parameters, while the second is a noniterative algorithm that maximizes the derived approximately concentrated log-likelihood function. The performance of the proposed algorithms is tested through extensive computer simulations. Simulation results show that the stepwise-concentrated ML algorithm (SC-ML) requires only a few iterations to converge and that both the SC-ML and the approximately-concentrated ML algorithm (AC-ML) attain a solution close to the derived CRB at high signal-to-noise ratio.
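
    As a rough illustration of concentrated ML DOA estimation, the sketch below handles only the narrowband, single-source, uniform-noise special case, where the concentrated criterion reduces to a grid search over a(θ)^H R a(θ) / ||a(θ)||²; the paper's SC-ML and AC-ML estimators address the much harder wideband, multi-source, nonuniform-noise problem. All array and signal parameters are made up.

      # Narrowband, single-source, uniform-noise simplification of concentrated
      # deterministic ML DOA estimation via a grid search over candidate angles.
      import numpy as np

      rng = np.random.default_rng(0)
      M, N, d = 8, 200, 0.5                    # sensors, snapshots, spacing in wavelengths
      true_theta = np.deg2rad(20.0)

      def steering(theta):
          return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

      s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
      noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
      X = np.outer(steering(true_theta), s) + noise
      R = X @ X.conj().T / N                   # sample covariance

      grid = np.deg2rad(np.linspace(-90, 90, 721))
      crit = [np.real(steering(t).conj() @ R @ steering(t)) / M for t in grid]
      print(np.rad2deg(grid[int(np.argmax(crit))]))   # should be close to 20 degrees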

  2. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.

    2013-09-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~ 0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~ 80% of the total sediment load was transported during rare but high magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRF outperforms traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.

  3. Seasonal logging, process response, and geomorphic work

    NASA Astrophysics Data System (ADS)

    Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.

    2014-03-01

    Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux often focus on annual timescales but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric quantile regression forests (QRF) based on high-frequency (3 min) discharge measurements and sediment concentration data sampled every 30-60 min in similar-sized (˜0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the random forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ˜80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. In particular, dry-season logging dampened the relative role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRFs outperform traditional sediment rating curves (SRCs) in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
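
    A hedged illustration of the quantile-forest idea using scikit-learn: the spread of per-tree predictions is used as a crude surrogate for the leaf-weighted quantiles of a true QRF, and the discharge/sediment data are entirely synthetic.

      # Rough stand-in for quantile regression forests: fit a random forest to a
      # synthetic sediment-concentration series and approximate prediction
      # quantiles from the spread of the individual trees. True QRF instead uses
      # the empirical distribution of training responses in the leaves.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(1)
      discharge = rng.uniform(0.1, 5.0, 500)                       # hypothetical predictor
      ssc = 20 * discharge**1.8 * np.exp(rng.normal(0, 0.4, 500))  # hypothetical sediment conc.

      rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0)
      rf.fit(discharge.reshape(-1, 1), ssc)

      x_new = np.array([[0.5], [2.0], [4.0]])
      per_tree = np.stack([t.predict(x_new) for t in rf.estimators_])   # (n_trees, n_points)
      print(np.quantile(per_tree, [0.1, 0.5, 0.9], axis=0))             # approx. quantiles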

  4. Stepwise Signal Extraction via Marginal Likelihood

    PubMed Central

    Du, Chao; Kao, Chu-Lan Michael

    2015-01-01

    This paper studies the estimation of stepwise signal. To determine the number and locations of change-points of the stepwise signal, we formulate a maximum marginal likelihood estimator, which can be computed with a quadratic cost using dynamic programming. We carry out extensive investigation on the choice of the prior distribution and study the asymptotic properties of the maximum marginal likelihood estimator. We propose to treat each possible set of change-points equally and adopt an empirical Bayes approach to specify the prior distribution of segment parameters. Detailed simulation study is performed to compare the effectiveness of this method with other existing methods. We demonstrate our method on single-molecule enzyme reaction data and on DNA array CGH data. Our study shows that this method is applicable to a wide range of models and offers appealing results in practice. PMID:27212739
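
    The maximum marginal likelihood change-point search can be sketched with a toy Gaussian model: each segment mean gets a fixed normal prior (the paper instead sets the prior by empirical Bayes), the segment marginal likelihood is then closed form, and an O(n²) dynamic program, matching the quadratic cost mentioned above, picks the segmentation. Everything below is illustrative.

      # Toy change-point detection by maximum marginal likelihood with an O(n^2)
      # dynamic program. Each segment has an unknown mean with a N(0, tau^2) prior
      # and known noise sd sigma, so the segment marginal likelihood is closed form.
      import numpy as np

      def seg_log_marg(y, sigma=1.0, tau=3.0):
          n, s1, s2 = len(y), y.sum(), (y**2).sum()
          return (-0.5 * n * np.log(2 * np.pi * sigma**2)
                  - 0.5 * np.log(1 + n * tau**2 / sigma**2)
                  - 0.5 * s2 / sigma**2
                  + 0.5 * tau**2 * s1**2 / (sigma**2 * (sigma**2 + n * tau**2)))

      def best_segmentation(y, penalty=3.0):        # penalty: crude per-segment cost
          n = len(y)
          best = np.full(n + 1, -np.inf); best[0] = 0.0
          back = np.zeros(n + 1, dtype=int)
          for j in range(1, n + 1):
              for i in range(j):
                  score = best[i] + seg_log_marg(y[i:j]) - penalty
                  if score > best[j]:
                      best[j], back[j] = score, i
          cps, j = [], n
          while j > 0:
              cps.append(back[j]); j = back[j]
          return sorted(cps)[1:]                    # interior change-point locations

      y = np.concatenate([np.random.normal(0, 1, 50), np.random.normal(4, 1, 50)])
      print(best_segmentation(y))                   # expect a change point near index 50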

  5. cosmoabc: Likelihood-free inference for cosmology

    NASA Astrophysics Data System (ADS)

    Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.

    2015-05-01

    Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python Approximate Bayesian Computation (ABC) sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy clusters number counts without computing the likelihood function.
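
    For intuition, a bare-bones rejection-ABC loop is sketched below; cosmoabc itself implements a Population Monte Carlo variant with adaptive importance sampling, so this is not its algorithm, only the core idea of replacing the likelihood with forward simulation and a distance. The toy problem infers a Gaussian mean.

      # Bare-bones rejection ABC: draw from the prior, simulate mock data, keep
      # draws whose summary statistic lies close to the observed one.
      import numpy as np

      rng = np.random.default_rng(42)
      observed = rng.normal(2.0, 1.0, 100)          # pretend these are the real data
      obs_summary = observed.mean()

      def simulate(mu):                              # forward model: mock data generator
          return rng.normal(mu, 1.0, 100).mean()

      def distance(sim_summary):
          return abs(sim_summary - obs_summary)

      prior_draws = rng.uniform(-5, 5, 20000)        # flat prior on mu
      accepted = [mu for mu in prior_draws if distance(simulate(mu)) < 0.05]
      print(len(accepted), np.mean(accepted))        # approximate posterior mean of mu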

  6. Spectral likelihood expansions for Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nagel, Joseph B.; Sudret, Bruno

    2016-03-01

    A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
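
    A minimal one-dimensional illustration of the spectral idea, assuming a uniform prior on [-1, 1] and a Legendre basis orthonormalized with respect to that prior: the model evidence is the zeroth expansion coefficient and the posterior mean follows from the first two. The likelihood and data below are invented.

      # 1D spectral likelihood expansion: fit orthonormal Legendre coefficients to
      # the likelihood by least squares on prior samples; the evidence is coef[0]
      # and the posterior mean is coef[1] / (sqrt(3) * coef[0]).
      import numpy as np
      from numpy.polynomial import legendre

      def likelihood(x):                        # e.g. Gaussian likelihood of a datum 0.3
          return np.exp(-0.5 * ((0.3 - x) / 0.2) ** 2) / (0.2 * np.sqrt(2 * np.pi))

      deg = 15
      xs = np.random.uniform(-1, 1, 4000)       # samples from the uniform prior
      V = legendre.legvander(xs, deg)           # P_0..P_deg evaluated at the samples
      V *= np.sqrt(2 * np.arange(deg + 1) + 1)  # orthonormalize w.r.t. the prior
      coef, *_ = np.linalg.lstsq(V, likelihood(xs), rcond=None)

      evidence = coef[0]
      post_mean = coef[1] / (np.sqrt(3) * coef[0])
      print(evidence, post_mean)                # compare with brute-force quadrature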

  7. Method for induced polarization logging

    SciTech Connect

    Vinegar, H.J.; Waxman, M.H.

    1987-04-14

    A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation at measurement points equally spaced in time of the magnitude and phase of the induced voltage and the magnitude and phase of the current supplied by a circuit through a reference resistance R/sub 0/ to a survey current electrode associated with the tool.

  8. Likelihood-Based Climate Model Evaluation

    NASA Technical Reports Server (NTRS)

    Braverman, Amy; Cressie, Noel; Teixeira, Joao

    2012-01-01

    Climate models are deterministic, mathematical descriptions of the physics of climate. Confidence in predictions of future climate is increased if the physics are verifiably correct. A necessary (but not sufficient) condition is that past and present climate be simulated well. We quantify the likelihood that a summary statistic computed from a set of observations arises from a physical system with the characteristics captured by a model-generated time series. Given a prior on models, we can go further and obtain a posterior distribution of models given the observations.
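
    One hedged way to make this concrete, with entirely synthetic numbers: estimate the sampling distribution of the summary statistic from each model's simulated series, evaluate the observed statistic under it, and combine with a model prior.

      # Toy illustration: a kernel density estimate of each model's simulated
      # statistic serves as the likelihood of the observed statistic, and Bayes'
      # rule turns a prior over models into a posterior.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      obs_stat = 0.45                                     # e.g. an observed regional trend

      model_runs = {"modelA": rng.normal(0.40, 0.05, 300),   # statistic from many runs
                    "modelB": rng.normal(0.60, 0.08, 300)}
      prior = {"modelA": 0.5, "modelB": 0.5}

      post = {}
      for name, sims in model_runs.items():
          kde = stats.gaussian_kde(sims)                  # estimated sampling distribution
          post[name] = prior[name] * kde(obs_stat)[0]     # likelihood of the observation

      total = sum(post.values())
      print({k: v / total for k, v in post.items()})      # posterior over models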

  9. Likelihood-based modification of experimental crystal structure electron density maps

    DOEpatents

    Terwilliger, Thomas C.

    2005-04-16

    A maximum-likelihood method improves an electron density map of an experimental crystal structure. A likelihood of a set of structure factors {F.sub.h } is formed for the experimental crystal structure as (1) the likelihood of having obtained an observed set of structure factors {F.sub.h.sup.OBS } if structure factor set {F.sub.h } was correct, and (2) the likelihood that an electron density map resulting from {F.sub.h } is consistent with selected prior knowledge about the experimental crystal structure. The set of structure factors {F.sub.h } is then adjusted to maximize the likelihood of {F.sub.h } for the experimental crystal structure. An improved electron density map is constructed with the maximized structure factors.

  10. Maximum likelihood continuity mapping for fraud detection

    SciTech Connect

    Hogden, J.

    1997-05-01

    The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints, CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction--important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where ``medical history`` is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.
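
    MALCOM itself is not reproduced here, but the underlying use of sequence likelihood for anomaly detection can be sketched with a first-order Markov model over invented procedure codes: histories with unusually low per-step log-likelihood are flagged for review.

      # Fit a smoothed first-order Markov model on typical procedure sequences and
      # score new histories by average per-step log-likelihood; low scores are
      # candidate anomalies. Codes and sequences are made up.
      from collections import defaultdict
      import math

      train = [["visit", "xray", "cast"], ["visit", "lab", "rx"], ["visit", "xray", "cast"],
               ["visit", "lab", "rx"], ["visit", "rx"]]

      counts = defaultdict(lambda: defaultdict(int))
      for seq in train:
          for a, b in zip(["<s>"] + seq, seq):
              counts[a][b] += 1

      def log_lik_per_step(seq, alpha=0.5, vocab_size=10):
          ll, prev = 0.0, "<s>"
          for tok in seq:
              tot = sum(counts[prev].values())
              ll += math.log((counts[prev][tok] + alpha) / (tot + alpha * vocab_size))
              prev = tok
          return ll / len(seq)

      print(log_lik_per_step(["visit", "xray", "cast"]))   # typical history: higher score
      print(log_lik_per_step(["cast", "cast", "lab"]))     # anomalous history: lower score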

  11. Likelihood reinstates Archaeopteryx as a primitive bird.

    PubMed

    Lee, Michael S Y; Worthy, Trevor H

    2012-04-23

    The widespread view that Archaeopteryx was a primitive (basal) bird has been recently challenged by a comprehensive phylogenetic analysis that placed Archaeopteryx with deinonychosaurian theropods. The new phylogeny suggested that typical bird flight (powered by the front limbs only) either evolved at least twice, or was lost/modified in some deinonychosaurs. However, this parsimony-based result was acknowledged to be weakly supported. Maximum-likelihood and related Bayesian methods applied to the same dataset yield a different and more orthodox result: Archaeopteryx is restored as a basal bird with bootstrap frequency of 73 per cent and posterior probability of 1. These results are consistent with a single origin of typical (forelimb-powered) bird flight. The Archaeopteryx-deinonychosaur clade retrieved by parsimony is supported by more characters (which are on average more homoplasious), whereas the Archaeopteryx-bird clade retrieved by likelihood-based methods is supported by fewer characters (but on average less homoplasious). Both positions for Archaeopteryx remain plausible, highlighting the hazy boundary between birds and advanced theropods. These results also suggest that likelihood-based methods (in addition to parsimony) can be useful in morphological phylogenetics.

  12. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  13. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  14. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  15. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 7 2013-07-01 2013-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  16. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  17. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 7 2012-07-01 2012-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  18. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 7 2014-07-01 2014-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  19. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  20. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  1. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  2. Using Web Logs in the Science Classroom

    ERIC Educational Resources Information Center

    Duplichan, Staycle C.

    2009-01-01

    As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…

  3. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J. )

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  4. Downhole memory-logging tools

    SciTech Connect

    Lysne, P.

    1992-01-01

    Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications, even though the data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (i) there is a general lack of vetted, high-temperature instrumentation, and (ii) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at a low cost. These tools feature on-board computers that process and store data, and newer systems may be programmed to make "decisions." Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimum training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium and thorium will be available in about one year, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A second tool designed to sample fluids at conditions exceeding 400 °C is in the proposal stage. Partnerships are being formed between the geothermal industry, scientific drilling programs, and the national laboratories to define and develop inversion algorithms relating raw tool data to more pertinent information. 8 refs.

  5. A New Approach to Logging.

    ERIC Educational Resources Information Center

    Miles, Donna

    2001-01-01

    In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

  6. Maximal likelihood correspondence estimation for face recognition across pose.

    PubMed

    Li, Shaoxin; Liu, Xin; Chai, Xiujuan; Zhang, Haihong; Lao, Shihong; Shan, Shiguang

    2014-10-01

    Due to the misalignment of image features, the performance of many conventional face recognition methods degrades considerably in across pose scenario. To address this problem, many image matching-based methods are proposed to estimate semantic correspondence between faces in different poses. In this paper, we aim to solve two critical problems in previous image matching-based correspondence learning methods: 1) fail to fully exploit face specific structure information in correspondence estimation and 2) fail to learn personalized correspondence for each probe image. To this end, we first build a model, termed as morphable displacement field (MDF), to encode face specific structure information of semantic correspondence from a set of real samples of correspondences calculated from 3D face models. Then, we propose a maximal likelihood correspondence estimation (MLCE) method to learn personalized correspondence based on maximal likelihood frontal face assumption. After obtaining the semantic correspondence encoded in the learned displacement, we can synthesize virtual frontal images of the profile faces for subsequent recognition. Using linear discriminant analysis method with pixel-intensity features, state-of-the-art performance is achieved on three multipose benchmarks, i.e., CMU-PIE, FERET, and MultiPIE databases. Owing to the rational MDF regularization and the usage of the novel maximal likelihood objective, the proposed MLCE method can reliably learn correspondence between faces in different poses even in complex wild environments, i.e., the labeled face in the wild database.

  7. Likelihood analysis of spatial capture-recapture models for stratified or class structured populations

    USGS Publications Warehouse

    Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.

    2015-01-01

    We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
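
    Setting aside the spatial detection model, the missing-class-membership device can be sketched with a plain binomial encounter model: animals of unknown sex contribute a likelihood term marginalized over sex with mixing weight psi. The capture counts below are invented, and this is not the authors' SCR likelihood.

      # Class-structured likelihood with a missing covariate: known-sex animals use
      # their sex-specific detection probability, unknown-sex animals are
      # marginalized over sex. Parameters are fit by maximum likelihood.
      import numpy as np
      from scipy import stats, optimize

      K = 10                                    # capture occasions
      caps_male = np.array([3, 5, 2])           # capture counts, known males
      caps_fem = np.array([1, 2, 1, 1])         # known females
      caps_unk = np.array([2, 4, 1, 3])         # sex not identified

      def neg_log_lik(x):
          p_m, p_f, psi = 1/(1+np.exp(-x[0])), 1/(1+np.exp(-x[1])), 1/(1+np.exp(-x[2]))
          ll = stats.binom.logpmf(caps_male, K, p_m).sum()
          ll += stats.binom.logpmf(caps_fem, K, p_f).sum()
          mix = (psi * stats.binom.pmf(caps_unk, K, p_m)
                 + (1 - psi) * stats.binom.pmf(caps_unk, K, p_f))
          ll += np.log(mix).sum()
          return -ll

      res = optimize.minimize(neg_log_lik, np.zeros(3), method="Nelder-Mead")
      print(1 / (1 + np.exp(-res.x)))           # estimated p_male, p_female, sex ratio psi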

  8. Precision pressure/temperature logging tool

    SciTech Connect

    Henfling, J.A.; Normann, R.A.

    1998-01-01

    Past memory logging tools have provided excellent pressure/temperature data when used in a geothermal environment, and they are easier to maintain and deploy than tools requiring an electric wireline connection to the surface. However, they are deficient since the tool operator is unaware of downhole conditions that could require changes in the logging program. Tools that make ``decisions`` based on preprogrammed scenarios can partially overcome this difficulty, and a suite of such memory tools has been developed at Sandia National Laboratories. The first tool, which forms the basis for future instruments, measures pressure and temperature. Design considerations include a minimization of cost while insuring quality data, size compatibility with diamond-cored holes, operation in holes to 425 C (800 F), transportability by ordinary passenger air service, and ease of operation. This report documents the development and construction of the pressure/temperature tool. It includes: (1) description of the major components; (2) calibration; (3) typical logging scenario; (4) tool data examples; and (5) conclusions. The mechanical and electrical drawings, along with the tool`s software, will be furnished upon request.

  9. Chemical logging of geothermal wells

    DOEpatents

    Allen, Charles A.; McAtee, Richard E.

    1981-01-01

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
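
    Read literally, the indicator can be computed as below, with invented concentrations: track the ratio of calcium to combined carbonate and bicarbonate concentrations against depth and flag a sustained increase.

      # Compute the Ca / (CO3 + HCO3) concentration ratio in the return drilling
      # fluid versus depth and flag a continuous increase over recent samples.
      import numpy as np

      depth = np.array([500, 550, 600, 650, 700, 750, 800])     # m
      ca = np.array([40., 42., 47., 55., 66., 80., 97.])        # mg/L, invented
      co3 = np.array([12., 12., 11., 11., 10., 10., 9.])
      hco3 = np.array([180., 178., 170., 160., 150., 138., 125.])

      ratio = ca / (co3 + hco3)
      rising = np.all(np.diff(ratio[-4:]) > 0)   # continuous increase over last 4 samples
      print(np.round(ratio, 3), "possible geothermal aquifer ahead:", rising)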

  10. Chemical logging of geothermal wells

    DOEpatents

    Allen, C.A.; McAtee, R.E.

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

  11. Audit Log for Forensic Photography

    NASA Astrophysics Data System (ADS)

    Neville, Timothy; Sorell, Matthew

    We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.

  12. LIKEDM: Likelihood calculator of dark matter detection

    NASA Astrophysics Data System (ADS)

    Huang, Xiaoyuan; Tsai, Yue-Lin Sming; Yuan, Qiang

    2017-04-01

    With the large progress in searches for dark matter (DM) particles with indirect and direct methods, we develop a numerical tool that enables fast calculations of the likelihoods of specified DM particle models given a number of observational data, such as charged cosmic rays from space-borne experiments (e.g., PAMELA, AMS-02), γ-rays from the Fermi space telescope, and underground direct detection experiments. The purpose of this tool - LIKEDM, likelihood calculator for dark matter detection - is to bridge the gap between a particle model of DM and the observational data. The intermediate steps between these two, including the astrophysical backgrounds, the propagation of charged particles, the analysis of Fermi γ-ray data, as well as the DM velocity distribution and the nuclear form factor, have been dealt with in the code. We release the first version (v1.0) focusing on the constraints from indirect detection of DM with charged cosmic and gamma rays. Direct detection will be implemented in the next version. This manual describes the framework, usage, and related physics of the code.

  13. Maximum likelihood decoding of Reed Solomon Codes

    SciTech Connect

    Sudan, M.

    1996-12-31

    We present a randomized algorithm which takes as input n distinct points (x_i, y_i), i = 1, ..., n, from F x F (where F is a field) and integer parameters t and d and returns a list of all univariate polynomials f over F in the variable x of degree at most d which agree with the given set of points in at least t places (i.e., y_i = f(x_i) for at least t values of i), provided t = Ω(√(nd)). The running time is bounded by a polynomial in n. This immediately provides a maximum likelihood decoding algorithm for Reed Solomon Codes, which works in a setting with a larger number of errors than any previously known algorithm. To the best of our knowledge, this is the first efficient (i.e., polynomial time bounded) algorithm which provides some maximum likelihood decoding for any efficient (i.e., constant or even polynomial rate) code.
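
    Sudan's algorithm is not reproduced here; the sketch below only illustrates the decoding problem itself by brute force over a tiny prime field, listing every low-degree polynomial that agrees with the received points in at least t places. The field size, degree, and points are made up.

      # Brute-force list decoding over GF(5): enumerate all polynomials of degree
      # <= d and keep those agreeing with the received points in >= t places.
      from itertools import product

      p, d, t = 5, 1, 4
      points = [(0, 1), (1, 3), (2, 1), (3, 2), (4, 4)]   # (x_i, y_i), one corrupted

      def evaluate(coeffs, x):
          return sum(c * pow(x, k, p) for k, c in enumerate(coeffs)) % p

      candidates = [c for c in product(range(p), repeat=d + 1)
                    if sum(evaluate(c, x) == y for x, y in points) >= t]
      print(candidates)      # coefficient tuples (a0, a1) with enough agreements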

  14. Multiplicative earthquake likelihood models incorporating strain rates

    NASA Astrophysics Data System (ADS)

    Rhoades, D. A.; Christophersen, A.; Gerstenberger, M. C.

    2017-01-01

    SUMMARYWe examine the potential for strain-rate variables to improve long-term earthquake <span class="hlt">likelihood</span> models. We derive a set of multiplicative hybrid earthquake <span class="hlt">likelihood</span> models in which cell rates in a spatially uniform baseline model are scaled using combinations of covariates derived from earthquake catalogue data, fault data, and strain-rates for the New Zealand region. Three components of the strain rate estimated from GPS data over the period 1991-2011 are considered: the shear, rotational and dilatational strain rates. The hybrid model parameters are optimised for earthquakes of M 5 and greater over the period 1987-2006 and tested on earthquakes from the period 2012-2015, which is independent of the strain rate estimates. The shear strain rate is overall the most informative individual covariate, as indicated by Molchan error diagrams as well as multiplicative modelling. Most models including strain rates are significantly more informative than the best models excluding strain rates in both the fitting and testing period. A hybrid that combines the shear and dilatational strain rates with a smoothed seismicity covariate is the most informative model in the fitting period, and a simpler model without the dilatational strain rate is the most informative in the testing period. These results have implications for probabilistic seismic hazard analysis and can be used to improve the background model component of medium-term and short-term earthquake forecasting models.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=uncertainty&pg=7&id=EJ948897','ERIC'); return false;" href="http://eric.ed.gov/?q=uncertainty&pg=7&id=EJ948897"><span>Developmental Changes in Children's Understanding of Future <span class="hlt">Likelihood</span> and Uncertainty</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Lagattuta, Kristin Hansen; Sayfan, Liat</p> <p>2011-01-01</p> <p>Two measures assessed 4-10-year-olds' and adults' (N = 201) understanding of future <span class="hlt">likelihood</span> and uncertainty. In one task, participants sequenced sets of event pictures varying by one physical dimension according to increasing future <span class="hlt">likelihood</span>. In a separate task, participants rated characters' thoughts about the <span class="hlt">likelihood</span> of future events,…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=partition+AND+coefficient&pg=2&id=EJ363005','ERIC'); return false;" href="http://eric.ed.gov/?q=partition+AND+coefficient&pg=2&id=EJ363005"><span>Maximum <span class="hlt">Likelihood</span> Estimation of Multivariate Polyserial and Polychoric Correlation Coefficients.</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Poon, Wai-Yin; Lee, Sik-Yum</p> <p>1987-01-01</p> <p>Reparameterization is used to find the maximum <span class="hlt">likelihood</span> estimates of parameters in a multivariate model having some component variable observable only in polychotomous form. Maximum <span class="hlt">likelihood</span> estimates are found by a Fletcher Powell algorithm. In addition, the partition maximum <span class="hlt">likelihood</span> method is proposed and illustrated. 
(Author/GDC)</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19960016406&hterms=uros&qs=N%3D0%26Ntk%3DAll%26Ntx%3Dmode%2Bmatchall%26Ntt%3Duros','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19960016406&hterms=uros&qs=N%3D0%26Ntk%3DAll%26Ntx%3Dmode%2Bmatchall%26Ntt%3Duros"><span>Maximum-<span class="hlt">likelihood</span> analysis of the COBE angular correlation function</span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Seljak, Uros; Bertschinger, Edmund</p> <p>1993-01-01</p> <p>We have used maximum-<span class="hlt">likelihood</span> estimation to determine the quadrupole amplitude Q(sub rms-PS) and the spectral index n of the density fluctuation power spectrum at recombination from the COBE DMR data. We find a strong correlation between the two parameters of the form Q(sub rms-PS) = (15.7 +/- <span class="hlt">2</span>.6) exp (0.46(1 - n)) microK for fixed n. Our result is slightly smaller than and has a smaller statistical uncertainty than the 1992 estimate of Smoot et al.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=Newton&pg=4&id=EJ948212','ERIC'); return false;" href="http://eric.ed.gov/?q=Newton&pg=4&id=EJ948212"><span>Multimodal <span class="hlt">Likelihoods</span> in Educational Assessment: Will the Real Maximum <span class="hlt">Likelihood</span> Score Please Stand up?</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Wothke, Werner; Burket, George; Chen, Li-Sue; Gao, Furong; Shu, Lianghua; Chia, Mike</p> <p>2011-01-01</p> <p>It has been known for some time that item response theory (IRT) models may exhibit a <span class="hlt">likelihood</span> function of a respondent's ability which may have multiple modes, flat modes, or both. These conditions, often associated with guessing of multiple-choice (MC) questions, can introduce uncertainty and bias to ability estimation by maximum likelihood…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25994673','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25994673"><span>Avian responses to selective <span class="hlt">logging</span> shaped by species traits and <span class="hlt">logging</span> practices.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S; Koh, Lian Pin</p> <p>2015-06-07</p> <p>Selective <span class="hlt">logging</span> is one of the most common forms of forest use in the tropics. Although the effects of selective <span class="hlt">logging</span> on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to <span class="hlt">logging</span>. In this study, we assessed how species traits and <span class="hlt">logging</span> practices combine to determine species responses to selective <span class="hlt">logging</span>, based on over 4000 observations of the responses of nearly 1000 bird species to selective <span class="hlt">logging</span> across the tropics. 
Our analysis shows that species traits, such as feeding group and body mass, and <span class="hlt">logging</span> practices, such as time since <span class="hlt">logging</span> and <span class="hlt">logging</span> intensity, interact to influence a species' response to <span class="hlt">logging</span>. Frugivores and insectivores were most adversely affected by <span class="hlt">logging</span> and declined further with increasing <span class="hlt">logging</span> intensity. Nectarivores and granivores responded positively to selective <span class="hlt">logging</span> for the first two decades, after which their abundances decrease below pre-<span class="hlt">logging</span> levels. Larger species of omnivores and granivores responded more positively to selective <span class="hlt">logging</span> than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective <span class="hlt">logging</span> had not recovered approximately 40 years after <span class="hlt">logging</span> cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact <span class="hlt">logging</span>.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4455798','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4455798"><span>Avian responses to selective <span class="hlt">logging</span> shaped by species traits and <span class="hlt">logging</span> practices</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S.; Koh, Lian Pin</p> <p>2015-01-01</p> <p>Selective <span class="hlt">logging</span> is one of the most common forms of forest use in the tropics. Although the effects of selective <span class="hlt">logging</span> on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to <span class="hlt">logging</span>. In this study, we assessed how species traits and <span class="hlt">logging</span> practices combine to determine species responses to selective <span class="hlt">logging</span>, based on over 4000 observations of the responses of nearly 1000 bird species to selective <span class="hlt">logging</span> across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and <span class="hlt">logging</span> practices, such as time since <span class="hlt">logging</span> and <span class="hlt">logging</span> intensity, interact to influence a species' response to <span class="hlt">logging</span>. Frugivores and insectivores were most adversely affected by <span class="hlt">logging</span> and declined further with increasing <span class="hlt">logging</span> intensity. Nectarivores and granivores responded positively to selective <span class="hlt">logging</span> for the first two decades, after which their abundances decrease below pre-<span class="hlt">logging</span> levels. 
Larger species of omnivores and granivores responded more positively to selective <span class="hlt">logging</span> than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective <span class="hlt">logging</span> had not recovered approximately 40 years after <span class="hlt">logging</span> cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact <span class="hlt">logging</span>. PMID:25994673</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_10");'>10</a></li> <li><a href="#" onclick='return showDiv("page_11");'>11</a></li> <li class="active"><span>12</span></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_12 --> <div id="page_13" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_11");'>11</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li class="active"><span>13</span></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="241"> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/CFR-2012-title40-vol24/pdf/CFR-2012-title40-vol24-sec146-87.pdf','CFR2012'); return false;" href="https://www.gpo.gov/fdsys/pkg/CFR-2012-title40-vol24/pdf/CFR-2012-title40-vol24-sec146-87.pdf"><span>40 CFR 146.87 - <span class="hlt">Logging</span>, sampling, and testing prior to injection well operation.</span></a></p> <p><a target="_blank" href="http://www.gpo.gov/fdsys/browse/collectionCfr.action?selectedYearFrom=2012&page.go=Go">Code of Federal Regulations, 2012 CFR</a></p> <p></p> <p>2012-07-01</p> <p>... installed; and (ii) A cement bond and variable density <span class="hlt">log</span> to evaluate cement quality radially, and a... cement bond and variable density <span class="hlt">log</span>, and a temperature <span class="hlt">log</span> after the casing is set and cemented. (4) A... 
confining zone(s): (1) Fracture pressure; (<span class="hlt">2</span>) Other physical and chemical characteristics of the...</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/CFR-2013-title40-vol24/pdf/CFR-2013-title40-vol24-sec146-87.pdf','CFR2013'); return false;" href="https://www.gpo.gov/fdsys/pkg/CFR-2013-title40-vol24/pdf/CFR-2013-title40-vol24-sec146-87.pdf"><span>40 CFR 146.87 - <span class="hlt">Logging</span>, sampling, and testing prior to injection well operation.</span></a></p> <p><a target="_blank" href="http://www.gpo.gov/fdsys/browse/collectionCfr.action?selectedYearFrom=2013&page.go=Go">Code of Federal Regulations, 2013 CFR</a></p> <p></p> <p>2013-07-01</p> <p>... installed; and (ii) A cement bond and variable density <span class="hlt">log</span> to evaluate cement quality radially, and a... cement bond and variable density <span class="hlt">log</span>, and a temperature <span class="hlt">log</span> after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (<span class="hlt">2</span>) Other physical and chemical characteristics of the...</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/CFR-2011-title14-vol2/pdf/CFR-2011-title14-vol2-sec91-1443.pdf','CFR2011'); return false;" href="https://www.gpo.gov/fdsys/pkg/CFR-2011-title14-vol2/pdf/CFR-2011-title14-vol2-sec91-1443.pdf"><span>14 CFR 91.1443 - CAMP: Airworthiness release or aircraft maintenance <span class="hlt">log</span> entry.</span></a></p> <p><a target="_blank" href="http://www.gpo.gov/fdsys/browse/collectionCfr.action?selectedYearFrom=2011&page.go=Go">Code of Federal Regulations, 2011 CFR</a></p> <p></p> <p>2011-01-01</p> <p>... maintenance <span class="hlt">log</span> entry. 91.1443 Section 91.1443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... aircraft maintenance <span class="hlt">log</span> entry. (a) No program aircraft maintained under a CAMP may be operated after... release; or (<span class="hlt">2</span>) An appropriate entry in the aircraft maintenance <span class="hlt">log</span>. (b) The airworthiness release or...</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/CFR-2012-title14-vol2/pdf/CFR-2012-title14-vol2-sec91-1443.pdf','CFR2012'); return false;" href="https://www.gpo.gov/fdsys/pkg/CFR-2012-title14-vol2/pdf/CFR-2012-title14-vol2-sec91-1443.pdf"><span>14 CFR 91.1443 - CAMP: Airworthiness release or aircraft maintenance <span class="hlt">log</span> entry.</span></a></p> <p><a target="_blank" href="http://www.gpo.gov/fdsys/browse/collectionCfr.action?selectedYearFrom=2012&page.go=Go">Code of Federal Regulations, 2012 CFR</a></p> <p></p> <p>2012-01-01</p> <p>... maintenance <span class="hlt">log</span> entry. 91.1443 Section 91.1443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... aircraft maintenance <span class="hlt">log</span> entry. (a) No program aircraft maintained under a CAMP may be operated after... release; or (<span class="hlt">2</span>) An appropriate entry in the aircraft maintenance <span class="hlt">log</span>. 
    (b) The airworthiness release or...

  245. 14 CFR 91.1443 - CAMP: Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... maintenance log entry. 91.1443 Section 91.1443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... aircraft maintenance log entry. (a) No program aircraft maintained under a CAMP may be operated after... release; or (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or...

  246. A Maximum-Likelihood Approach to Force-Field Calibration.

    PubMed

    Zaborowski, Bartłomiej; Jagieła, Dawid; Czaplewski, Cezary; Hałabis, Anna; Lewandowska, Agnieszka; Żmudzińska, Wioletta; Ołdziej, Stanisław; Karczyńska, Agnieszka; Omieczynski, Christian; Wirecki, Tomasz; Liwo, Adam

    2015-09-28

    A new approach to the calibration of force fields is proposed, in which the force-field parameters are obtained by maximum-likelihood fitting of the calculated conformational ensembles to the experimental ensembles of training system(s). The maximum-likelihood function is composed of logarithms of the Boltzmann probabilities of the experimental conformations, calculated with the current energy function. Because the theoretical distribution is given in the form of the simulated conformations only, the contributions from all of the simulated conformations, with Gaussian weights in the distances from a given experimental conformation, are added to give the contribution to the target function from this conformation. In contrast to earlier methods for force-field calibration, the approach does not suffer from the arbitrariness of dividing the decoy set into native-like and non-native structures; however, if such a division is made instead of using Gaussian weights, application of the maximum-likelihood method results in the well-known energy-gap maximization. The computational procedure consists of cycles of decoy generation and maximum-likelihood-function optimization, which are iterated until convergence is reached. The method was tested with Gaussian distributions and then applied to the physics-based coarse-grained UNRES force field for proteins. The NMR structures of the tryptophan cage, a small α-helical protein, determined at three temperatures (T = 280, 305, and 313 K) by Hałabis et al. (J. Phys. Chem. B 2012, 116, 6898-6907), were used. Multiplexed replica-exchange molecular dynamics was used to generate the decoys. The iterative procedure exhibited steady convergence. Three variants of optimization were tried: optimization of the energy-term weights alone and use of the experimental ensemble of the folded protein only at T = 280 K (run 1); optimization of the energy-term weights and use of experimental ensembles at all three temperatures (run 2...
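
    The target function described in this abstract can be sketched compactly. The Python fragment below is a minimal illustration of the Gaussian-weighted maximum-likelihood idea only; the energy and distance callables, the inverse temperature beta, and the kernel width sigma are assumptions made for the sketch, not the UNRES implementation used in the paper.

        import numpy as np

        def calibration_log_likelihood(exp_confs, sim_confs, energy, distance, beta, sigma):
            """Gaussian-weighted ML target for force-field calibration (illustrative sketch).

            Each experimental conformation contributes the log of a kernel-weighted
            Boltzmann probability estimated from the simulated (decoy) ensemble.
            """
            sim_energies = np.array([energy(c) for c in sim_confs])
            boltzmann = np.exp(-beta * (sim_energies - sim_energies.min()))
            boltzmann /= boltzmann.sum()                     # normalized Boltzmann weights of decoys

            total = 0.0
            for x in exp_confs:
                d = np.array([distance(x, c) for c in sim_confs])
                kernel = np.exp(-0.5 * (d / sigma) ** 2)     # Gaussian weights in distance
                total += np.log(np.sum(kernel * boltzmann) + 1e-300)
            return total                                     # maximize over force-field parameters

    In a calibration cycle of the kind sketched in the abstract, this value would be recomputed after each round of decoy generation and the force-field parameters updated to increase it.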

  247. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  248. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting

    PubMed Central

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen; Wald, Lawrence L.

    2017-01-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization. PMID:26915119
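
    For context, the conventional reconstruction that this abstract compares against is a dictionary-matching step: each voxel's measured signal evolution is assigned the tissue parameters of the most correlated dictionary atom. The sketch below illustrates that baseline only, assuming a precomputed dictionary and parameter table; it is not the maximum-likelihood algorithm proposed in the paper.

        import numpy as np

        def dictionary_match(voxel_signals, dictionary, params):
            """Toy MR-fingerprinting matching step (the conventional baseline, not the ML method).

            voxel_signals: (n_voxels, n_timepoints) measured signal evolutions
            dictionary:    (n_atoms, n_timepoints) simulated evolutions (assumed given)
            params:        (n_atoms, n_params) tissue parameters of each dictionary atom
            """
            d_norm = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
            s_norm = voxel_signals / np.linalg.norm(voxel_signals, axis=1, keepdims=True)
            corr = np.abs(s_norm @ d_norm.conj().T)     # normalized inner products
            best = np.argmax(corr, axis=1)              # best-matching atom per voxel
            return params[best]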

  249. Recovering Velocity Distributions Via Penalized Likelihood

    NASA Astrophysics Data System (ADS)

    Merritt, David

    1997-07-01

    Line-of-sight velocity distributions are crucial for unravelling the dynamics of hot stellar systems. We present a new formalism based on penalized likelihood for deriving such distributions from kinematical data, and evaluate the performance of two algorithms that extract N(V) from absorption-line spectra and from sets of individual velocities. Both algorithms are superior to existing ones in that the solutions are nearly unbiased even when the data are so poor that a great deal of smoothing is required. In addition, the discrete-velocity algorithm is able to remove a known distribution of measurement errors from the estimate of N(V). The formalism is used to recover the velocity distribution of stars in five fields near the center of the globular cluster omega Centauri.

  250. Approximate maximum likelihood decoding of block codes

    NASA Technical Reports Server (NTRS)

    Greenberger, H. J.

    1979-01-01

    Approximate maximum likelihood decoding algorithms, based upon selecting a small set of candidate code words with the aid of the estimated probability of error of each received symbol, can give performance close to optimum with a reasonable amount of computation. By combining the best features of various algorithms and taking care to perform each step as efficiently as possible, a decoding scheme was developed which can decode codes which have better performance than those presently in use and yet not require an unreasonable amount of computation. The discussion of the details and tradeoffs of presently known efficient optimum and near optimum decoding algorithms leads, naturally, to the one which embodies the best features of all of them.

  251. Groups, information theory, and Einstein's likelihood principle

    NASA Astrophysics Data System (ADS)

    Sicuro, Gabriele; Tempesta, Piergiulio

    2016-04-01

    We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms.
    We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.

  252. Groups, information theory, and Einstein's likelihood principle.

    PubMed

    Sicuro, Gabriele; Tempesta, Piergiulio

    2016-04-01

    We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.

  253. Cramer-Rao Bound, MUSIC, and Maximum Likelihood. Effects of Temporal Phase Difference

    DTIC Science & Technology

    1990-11-01

    Technical Report 1373, November 1990. Cramer-Rao Bound, MUSIC, and Maximum Likelihood: Effects of Temporal Phase Difference, C. V. Tran. Approved... Cramer-Rao bound, MUSIC, and Maximum Likelihood (ML) asymptotic variances corresponding to the two-source direction-of-arrival estimation where sources were modeled as... |p| = 1.00, SNR = 20 dB... 2. MUSIC for two equipowered signals impinging on a 5-element ULA, (a) |p| = 0.50, SNR...

  254. Data Mining of Network Logs

    NASA Technical Reports Server (NTRS)

    Collazo, Carlimar

    2011-01-01

    The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically, gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction will be a computer program that analyzes the URL and distinguishes advertisement links from the actual content links.
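
    The URL breakdown described above maps naturally onto a small script. A minimal sketch using Python's standard urllib.parse module follows; the field names and the example URL are illustrative assumptions, not the program written for that project.

        from urllib.parse import urlparse

        def breakdown(url):
            """Split a URL into the attributes mentioned in the abstract."""
            p = urlparse(url)
            host = p.hostname or ""
            return {
                "protocol": p.scheme,
                "host_name": host,
                "domain_name": ".".join(host.split(".")[-2:]) if "." in host else host,
                "path": p.path,
                "query": p.query,
            }

        print(breakdown("https://ads.example.com/banner/img.gif?campaign=42"))
        # {'protocol': 'https', 'host_name': 'ads.example.com', 'domain_name': 'example.com', ...}

    Distinguishing advertisement links from content links could then be as simple as matching the host_name or path fields against a list of known ad-serving patterns, which is one plausible reading of the data-reduction step described above.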

  255. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system needs to be developed before aerial logging will become effective and accepted in the logging industry. This paper presents such a system designed on simple principles with realistic cost and ecological benefits.

  256. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  257. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs.
    (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  258. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  259. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  260. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs.
    (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  261. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 7 2012-07-01 2012-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  262. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log.
    (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  263. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  264. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  265. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  266. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  267. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  268. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  269. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log.
    (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  270. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  271. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 7 2013-07-01 2013-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  272. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  273. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  274. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 7 2014-07-01 2014-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  275. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  276. Effect of radiance-to-reflectance transformation and atmosphere removal on maximum likelihood classification accuracy of high-dimensional remote sensing data

    NASA Technical Reports Server (NTRS)

    Hoffbeck, Joseph P.; Landgrebe, David A.

    1994-01-01

    Many analysis algorithms for high-dimensional remote sensing data require that the remotely sensed radiance spectra be transformed to approximate reflectance to allow comparison with a library of laboratory reflectance spectra. In maximum likelihood classification, however, the remotely sensed spectra are compared to training samples, thus a transformation to reflectance may or may not be helpful. The effect of several radiance-to-reflectance transformations on maximum likelihood classification accuracy is investigated in this paper. We show that the empirical line approach, LOWTRAN7, flat-field correction, single spectrum method, and internal average reflectance are all non-singular affine transformations, and that non-singular affine transformations have no effect on discriminant analysis feature extraction and maximum likelihood classification accuracy. (An affine transformation is a linear transformation with an optional offset.) Since the Atmosphere Removal Program (ATREM) and the log residue method are not affine transformations, experiments with Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were conducted to determine the effect of these transformations on maximum likelihood classification accuracy. The average classification accuracy of the data transformed by ATREM and the log residue method was slightly less than the accuracy of the original radiance data. Since the radiance-to-reflectance transformations allow direct comparison of remotely sensed spectra with laboratory reflectance spectra, they can be quite useful in labeling the training samples required by maximum likelihood classification, but these transformations have only a slight effect or no effect at all on discriminant analysis and maximum likelihood classification accuracy.
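
    The key claim above, that a non-singular affine transformation leaves Gaussian maximum-likelihood class assignments unchanged, can be checked numerically in a few lines. The toy example below uses synthetic two-class data rather than AVIRIS spectra, and the transform is an arbitrary assumption.

        import numpy as np
        from numpy.random import default_rng

        rng = default_rng(0)
        # toy two-class Gaussian data in 4 "bands"
        X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(2, 1, (50, 4))])
        y = np.repeat([0, 1], 50)

        def ml_assign(X, y, Xtest):
            """Gaussian ML classification with per-class mean and covariance."""
            stats = [(X[y == c].mean(0), np.cov(X[y == c].T)) for c in (0, 1)]
            labels = []
            for x in Xtest:
                scores = [-0.5 * (x - m) @ np.linalg.solve(C, x - m)
                          - 0.5 * np.log(np.linalg.det(C)) for m, C in stats]
                labels.append(int(np.argmax(scores)))
            return np.array(labels)

        A = rng.normal(size=(4, 4)) + 4 * np.eye(4)   # non-singular affine transform
        b = rng.normal(size=4)
        Xt = X @ A.T + b

        print(np.array_equal(ml_assign(X, y, X), ml_assign(Xt, y, Xt)))  # expected: True

    The class-conditional Mahalanobis distances are preserved exactly under the transform, and the log-determinant term shifts by the same constant for every class, so the argmax cannot change.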

  277. Automatic log spectrum restoration of atmospheric seeing

    NASA Astrophysics Data System (ADS)

    Navarro, R.; Gomez, R.; Santamaria, J.

    1987-03-01

    This paper presents an automatic method for (1) digital estimation of the width of the atmospheric seeing in astronomical images of extended objects and (2) image restoration by using the constrained Jansson-Van Cittert deconvolution algorithm. The estimation of the seeing is achieved by computing the radial profile of the averaged log spectrum of the image. The result of this estimation is then applied to compute the Point Spread Function (PSF) used in the deconvolution process. The method is applied to a photographic image of a sunspot. The quality of the restoration assesses the power and usefulness of the method.

  278. Estimating the population density of the Asian tapir (Tapirus indicus) in a selectively logged forest in Peninsular Malaysia.

    PubMed

    Rayan, D Mark; Mohamad, Shariff Wan; Dorward, Leejiah; Aziz, Sheema Abdul; Clements, Gopalasamy Reuben; Christopher, Wong Chai Thiam; Traeholt, Carl; Magintan, David

    2012-12-01

    The endangered Asian tapir (Tapirus indicus) is threatened by large-scale habitat loss, forest fragmentation and increased hunting pressure. Conservation planning for this species, however, is hampered by a severe paucity of information on its ecology and population status.
    We present the first Asian tapir population density estimate from a camera trapping study targeting tigers in a selectively logged forest within Peninsular Malaysia using a spatially explicit capture-recapture maximum-likelihood-based framework. With a trap effort of 2496 nights, 17 individuals were identified, corresponding to a density (standard error) estimate of 9.49 (2.55) adult tapirs/100 km². Although our results include several caveats, we believe that our density estimate still serves as an important baseline to facilitate the monitoring of tapir population trends in Peninsular Malaysia. Our study also highlights the potential of extracting vital ecological and population information for other cryptic, individually identifiable animals from tiger-centric studies, especially with the use of a spatially explicit capture-recapture maximum-likelihood-based framework.

  279. EE-3A Logging Report for Feb. 28th, 1994

    SciTech Connect

    Anderson, David W.

    1994-03-14

    A Temperature/Casing-Collar Locator (CCL) log of EE-3A was performed on Monday, February 28th, 1994. This log was the first of two (the other to be run on March 31st) in an attempt to investigate a temperature anomaly occurring between 10,000 and 11,435 ft. The thermal recovery of this zone is slower than expected, which is believed to be caused by either flow through the cemented zone of the 5 1/2-in. liner or upward flow through the reservoir itself. Now that the first log is over, the annulus of EE-3A will be shut in for a period of approximately one month, at which time the log will be repeated. If the temperature of this zone shows signs of recovery after the shut-in period, the integrity of the cement will be in question. No sign of recovery would point to the upward reservoir flow theory.

  280. The logN-logS relationship of normal X-ray emitting galaxies

    NASA Astrophysics Data System (ADS)

    Tajer, M.; Trinchieri, G.; Wolter, A.; Campana, S.; Moretti, A.; Tagliaferri, G.

    We have selected a flux limited serendipitous sample of galaxies from the cross-correlation of the BMW (Brera Multiscale Wavelet) ROSAT HRI and the LEDA (Lyon-Meudon Extragalactic Database) catalogues. This sample is used to study the X-ray properties of normal galaxies in the local universe. We also find that the logN-logS distribution we derived for a serendipitous subsample, optically and X-ray flux limited, is consistent with the Euclidean slope in the flux range F_X(0.5-2 keV) ≈ 1.1-110 × 10^-14 erg cm^-2 s^-1. We further show that the same law is valid over 4 decades, from the bright sample derived from the RASS data to the very faint detections in deep XMM-Newton fields.
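
    As a worked illustration of the logN-logS slope quoted above, the sketch below draws synthetic fluxes from a Euclidean (slope -3/2) population and recovers the slope from the cumulative counts. The flux values are invented, and no survey corrections (sky coverage, Eddington bias) are applied.

        import numpy as np

        rng = np.random.default_rng(1)
        s_min = 1.1e-14                                        # erg cm^-2 s^-1 (illustrative)
        fluxes = s_min * (1.0 + rng.pareto(1.5, size=2000))    # Pareto sample with Euclidean index

        s = np.sort(fluxes)[::-1]                              # brightest first
        n_cum = np.arange(1, s.size + 1)                       # cumulative counts N(>S)
        slope, intercept = np.polyfit(np.log10(s), np.log10(n_cum), 1)
        print(round(slope, 2))                                 # expect roughly -1.5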

  281. A case study of Membrane Testing, Integrity and Estimations of Log Removal Value per the Long Term 2 Enhanced Surface Water Treatment Rule.

    EPA Science Inventory

    The SWTR-LT2 Rule addresses the use of membranes that may be used to achieve credits for the reduction of Cryptosporidium. The LT2 Rule requires product-specific testing of membranes, considered an alternative filtration technology by the rule. LT2 Rule allows the product speci...
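
    A log removal value (LRV) of the kind credited under LT2 is conventionally the base-10 logarithm of the ratio of feed to filtrate concentration. The short sketch below shows that arithmetic with invented concentrations; it is not data from the case study.

        import math

        def log_removal_value(feed_conc, filtrate_conc):
            """LRV = log10(C_feed / C_filtrate); both concentrations in the same units."""
            return math.log10(feed_conc / filtrate_conc)

        # illustrative numbers only: 1e6 oocysts/L in the feed, 10/L in the filtrate
        print(log_removal_value(1e6, 10))   # 5.0, i.e. a "5-log" reduction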

  282. Robust Multi-Frame Adaptive Optics Image Restoration Algorithm Using Maximum Likelihood Estimation with Poisson Statistics.

    PubMed

    Li, Dongming; Sun, Changming; Yang, Jinhua; Liu, Huan; Peng, Jiaqi; Zhang, Lijuan

    2017-04-06

    An adaptive optics (AO) system provides real-time compensation for atmospheric turbulence. However, an AO image is usually of poor contrast because of the nature of the imaging process, meaning that the image contains information coming from both out-of-focus and in-focus planes of the object, which also brings about a loss in quality. In this paper, we present a robust multi-frame adaptive optics image restoration algorithm via maximum likelihood estimation. Our proposed algorithm uses a maximum likelihood method with image regularization as the basic principle, and constructs the joint log likelihood function for multi-frame AO images based on a Poisson distribution model. To begin with, a frame selection method based on image variance is applied to the observed multi-frame AO images to select images with better quality to improve the convergence of a blind deconvolution algorithm. Then, by combining the imaging conditions and the AO system properties, a point spread function estimation model is built. Finally, we develop our iterative solutions for AO image restoration addressing the joint deconvolution issue. We conduct a number of experiments to evaluate the performances of our proposed algorithm. Experimental results show that our algorithm produces accurate AO image restoration results and outperforms the current state-of-the-art blind deconvolution methods.

  283. The equivalence of information-theoretic and likelihood-based methods for neural dimensionality reduction.

    PubMed

    Williamson, Ross S; Sahani, Maneesh; Pillow, Jonathan W

    2015-04-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron's probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as "single-spike information" to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex.
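
    The log-likelihood underlying the equivalence described above is the Poisson point-process likelihood of an LNP model. A minimal sketch, assuming a single linear filter and an exponential nonlinearity (the normalization against a null model discussed in the abstract is omitted), is:

        import numpy as np

        def lnp_log_likelihood(w, X, spike_counts, dt, nonlinearity=np.exp):
            """Poisson log-likelihood of an LNP model (up to a data-dependent constant).

            w: filter weights, X: (n_bins, n_dims) stimulus, spike_counts: (n_bins,) counts.
            """
            rate = nonlinearity(X @ w)                          # conditional intensity per bin
            return np.sum(spike_counts * np.log(rate * dt) - rate * dt)

    Maximizing this quantity over the filter w is the likelihood-based counterpart of the information-theoretic search that the abstract analyzes.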

  284. Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET

    SciTech Connect

    Gopich, Irina V.

    2015-01-21

    Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a "chevron" shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated.
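
    The likelihood being maximized in studies of this kind is usually a photon-by-photon matrix product over photon colors and interphoton times. The sketch below is a simplified two-state version with illustrative parameter names and a state-independent count rate; it is not the exact formulation analyzed in the paper.

        import numpy as np
        from scipy.linalg import expm

        def two_state_log_likelihood(colors, times, e1, e2, k12, k21):
            """Simplified photon-by-photon log-likelihood of a two-state FRET model.

            colors: 0/1 (donor/acceptor) photon colors; times: photon arrival times.
            e1, e2: apparent FRET efficiencies of the states; k12, k21: transition rates.
            """
            K = np.array([[-k12, k21],
                          [k12, -k21]])                  # rate matrix, columns sum to zero
            p = np.array([k21, k12]) / (k12 + k21)       # equilibrium state populations
            eff = np.array([e1, e2])
            log_l, prev_t = 0.0, times[0]
            for c, t in zip(colors, times):
                p = expm(K * (t - prev_t)) @ p           # propagate between photons
                p = (eff if c == 1 else 1.0 - eff) * p   # photon-color factor per state
                norm = p.sum()                           # rescale to avoid underflow
                log_l += np.log(norm)
                p /= norm
                prev_t = t
            return log_l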

  285. Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions

    NASA Astrophysics Data System (ADS)

    Barrett, Harrison H.; Dainty, Christopher; Lara, David

    2007-02-01

    Maximum-likelihood (ML) estimation in wavefront sensing requires careful attention to all noise sources and all factors that influence the sensor data. We present detailed probability density functions for the output of the image detector in a wavefront sensor, conditional not only on wavefront parameters but also on various nuisance parameters. Practical ways of dealing with nuisance parameters are described, and final expressions for likelihoods and Fisher information matrices are derived. The theory is illustrated by discussing Shack-Hartmann sensors, and computational requirements are discussed. Simulation results show that ML estimation can significantly increase the dynamic range of a Shack-Hartmann sensor with four detectors and that it can reduce the residual wavefront error when compared with traditional methods.

  286. Logs key to solving water production problems

    SciTech Connect

    Wyatt, D.F. Jr.; Crook, R.J.

    1995-11-20

    Water source identification is the first and most important step in controlling unwanted water production that can severely limit the productive life of a well and, thereby, decrease hydrocarbon recovery. Water-control treatments often fail because the source of the water problem is not identified, the wrong treatment is performed, or the correct treatment is performed incorrectly. Table 1 lists typical problems, means of identification and evaluation, and chemical treatments available for correcting the problem. Well logs can help diagnose downhole situations that can lead to unwanted water production, and the effectiveness of water-control treatments can be evaluated with cased and open hole logs. The paper discusses cement bond logs and the pulse echo tool for cement evaluation. Casing evaluation is carried out by mechanical caliper logs and electromagnetic tools. Reservoir monitoring with pulsed neutron logs and pulsed neutron spectrometry is discussed.
    Also discussed are production logging, radioactive tracer logging, and well tests.

  287. Reviews. Equipment: Data logger; Book: Imagined Worlds; Equipment: Mini data loggers; Equipment: PICAXE-18M2 data logger; Books: Engineering: A Very Short Introduction and To Engineer Is Human; Book: Soap, Science, & Flat-Screen TVs; Equipment: uLog and SensorLab; Web Watch

    NASA Astrophysics Data System (ADS)

    2012-07-01

    WE RECOMMEND: Fourier NOVA LINK data logger (data logging and analysis); To Engineer is Human (engineering essays and insights); Soap, Science, & Flat-Screen TVs (people, politics, business and science overlap); uLog sensors and sensor adapter (a new addition to the LogIT range offers simplicity and ease of use). WORTH A LOOK: Imagined Worlds (socio-scientific predictions for the future); mini light and mini temperature data loggers (small-scale equipment for schools); SensorLab Plus (LogIT's supporting software, with extra features). HANDLE WITH CARE: CAXE110P PICAXE-18M2 data logger (data logger 'on view' but disappoints); Engineering: A Very Short Introduction (a broad-brush treatment fails to satisfy). WEB WATCH: two very different websites for students, advanced physics questions answered and a more general BBC science resource.

  288. Two-stage recovery of amphibian assemblages following selective logging of tropical forests.

    PubMed

    Adum, Gilbert Baase; Eichhorn, Markus Peter; Oduro, William; Ofori-Boateng, Caleb; Rödel, Mark-Oliver

    2013-04-01

    There is a lack of quantitative information on the effectiveness of selective-logging practices in ameliorating effects of logging on faunal communities. We conducted a large-scale replicated field study in 3 selectively logged moist semideciduous forests in West Africa at varying times after timber extraction to assess post-logging effects on amphibian assemblages. Specifically, we assessed whether the diversity, abundance, and assemblage composition of amphibians changed over time for forest-dependent species and those tolerant of forest disturbance. In 2009, we sampled amphibians in 3 forests (total of 48 study plots, each 2 ha) in southwestern Ghana. In each forest, we established plots in undisturbed forest, recently logged forest, and forest logged 10 and 20 years previously. Logging intensity was constant across sites with 3 trees/ha removed. Recently logged forests supported substantially more species than unlogged forests. This was due to an influx of disturbance-tolerant species after logging. Simultaneously, Simpson's index decreased, with an increase in the dominance of a few species. As time since logging increased, richness of disturbance-tolerant species decreased, until 10 years after logging their composition was indistinguishable from unlogged forests. Simpson's index increased with time since logging and was indistinguishable from unlogged forest 20 years after logging. Forest specialists decreased after logging and recovered slowly. However, after 20 years amphibian assemblages had returned to a state indistinguishable from that of undisturbed forest in both abundance and composition. These results demonstrate that even with low-intensity logging (≤3 trees/ha) a minimum 20-year rotation of logging is required for effective conservation of amphibian assemblages in moist semideciduous forests. Furthermore, remnant patches of intact forests retained in the landscape and the presence of permanent brooks may aid in the effective recovery of amphibian...
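
    Simpson's index, used above to track dominance, is simple to compute from species counts. One common form (1 minus the sum of squared proportional abundances) is sketched below with made-up counts, not the Ghana survey data.

        import numpy as np

        def simpsons_index(counts):
            """Simpson's diversity index D = 1 - sum(p_i^2); higher means a more even assemblage."""
            p = np.asarray(counts, dtype=float)
            p = p / p.sum()
            return 1.0 - np.sum(p ** 2)

        print(simpsons_index([30, 30, 30]))   # even assemblage, about 0.67
        print(simpsons_index([80, 5, 5]))     # dominated by one species, about 0.2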
This was due to an influx of disturbance-tolerant species after <span class="hlt">logging</span>. Simultaneously Simpson's index decreased, with increased in dominance of a few species. As time since <span class="hlt">logging</span> increased richness of disturbance-tolerant species decreased until 10 years after <span class="hlt">logging</span> when their composition was indistinguishable from unlogged forests. Simpson's index increased with time since <span class="hlt">logging</span> and was indistinguishable from unlogged forest 20 years after <span class="hlt">logging</span>. Forest specialists decreased after <span class="hlt">logging</span> and recovered slowly. However, after 20 years amphibian assemblages had returned to a state indistinguishable from that of undisturbed forest in both abundance and composition. These results demonstrate that even with low-intensity <span class="hlt">logging</span> (≤3 trees/ha) a minimum 20-year rotation of <span class="hlt">logging</span> is required for effective conservation of amphibian assemblages in moist semideciduous forests. Furthermore, remnant patches of intact forests retained in the landscape and the presence of permanent brooks may aid in the effective recovery of amphibian</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.osti.gov/scitech/biblio/22570206','SCIGOV-STC'); return false;" href="https://www.osti.gov/scitech/biblio/22570206"><span>Dimension-independent <span class="hlt">likelihood</span>-informed MCMC</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Cui, Tiangang; Law, Kody J.H.; Marzouk, Youssef M.</p> <p>2016-01-01</p> <p>Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and <span class="hlt">likelihood</span>-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. 
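The "likelihood-informed" flavour of the record above can be illustrated, in much-simplified form, on a linear-Gaussian toy inverse problem: the Metropolis proposal covariance is built from the data-misfit Hessian plus the prior precision, so directions the likelihood constrains strongly get proportionally smaller steps. This is only a sketch of the idea, not the DILI algorithm itself; the forward map, prior, and step size below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Bayesian inverse problem: y = G u + noise, Gaussian prior u ~ N(0, C_prior).
d_param, d_data, noise_std = 50, 20, 0.05
G = rng.normal(size=(d_data, d_param)) / np.sqrt(d_param)
C_prior = np.diag(1.0 / (1.0 + np.arange(d_param)) ** 2)   # decaying prior variances
u_true = rng.multivariate_normal(np.zeros(d_param), C_prior)
y = G @ u_true + noise_std * rng.normal(size=d_data)

def log_post(u):
    misfit = y - G @ u
    return -0.5 * misfit @ misfit / noise_std**2 - 0.5 * u @ np.linalg.solve(C_prior, u)

# Likelihood-informed preconditioner: (data-misfit Hessian + prior precision)^-1,
# i.e. the posterior covariance of this Gaussian problem, used as proposal covariance.
H_misfit = G.T @ G / noise_std**2
C_prop = np.linalg.inv(H_misfit + np.linalg.inv(C_prior))
L_prop = np.linalg.cholesky(C_prop)

# Random-walk Metropolis with the likelihood-informed proposal.
step, n_iter = 0.5, 5000
u, accepted = np.zeros(d_param), 0
for _ in range(n_iter):
    u_new = u + step * L_prop @ rng.normal(size=d_param)
    if np.log(rng.uniform()) < log_post(u_new) - log_post(u):
        u, accepted = u_new, accepted + 1
print("acceptance rate:", accepted / n_iter)
```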
Likelihood analysis of supersymmetric SU(5) GUTs.

PubMed

Bagnaschi, E; Costa, J C; Sakurai, K; Borsato, M; Buchmueller, O; Cavanaugh, R; Chobanova, V; Citron, M; De Roeck, A; Dolan, M J; Ellis, J R; Flächer, H; Heinemeyer, S; Isidori, G; Lucio, M; Martínez Santos, D; Olive, K A; Richards, A; de Vries, K J; Weiglein, G

2017-01-01

We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass [Formula: see text], distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), [Formula: see text] and [Formula: see text], and for the [Formula: see text] and [Formula: see text] Higgs representations [Formula: see text] and [Formula: see text], a universal trilinear soft SUSY-breaking parameter [Formula: see text], and the ratio of Higgs vevs [Formula: see text]. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + [Formula: see text] events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel [Formula: see text] coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of [Formula: see text] coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.

Physically constrained maximum likelihood mode filtering.

PubMed

Papp, Joseph C; Preisig, James C; Morozov, Andrey K

2010-04-01

Mode filtering is most commonly implemented using the sampled mode shapes or pseudoinverse algorithms. Buck et al. [J. Acoust. Soc. Am. 103, 1813-1824 (1998)] placed these techniques in the context of a broader maximum a posteriori (MAP) framework. However, the MAP algorithm requires that the signal and noise statistics be known a priori. Adaptive array processing algorithms are candidates for improving performance without the need for a priori signal and noise statistics. A variant of the physically constrained, maximum likelihood (PCML) algorithm [A. L. Kraay and A. B. Baggeroer, IEEE Trans. Signal Process. 55, 4048-4063 (2007)] is developed for mode filtering that achieves the same performance as the MAP mode filter yet does not need a priori knowledge of the signal and noise statistics. The central innovation of this adaptive mode filter is that the received signal's sample covariance matrix, as estimated by the algorithm, is constrained to be that which can be physically realized given a modal propagation model and an appropriate noise model. Shallow water simulation results are presented showing the benefit of using the PCML method in adaptive mode filtering.

Reducing the likelihood of long tennis matches.

PubMed

Barnett, Tristan; Brown, Alan; Pollard, Graham

2006-01-01

Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed, causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game. This tends to occur more frequently on the slower surfaces such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match. Key points: the cumulant generating function has nice properties for calculating the parameters of distributions in a tennis match; a final tiebreaker set reduces the length of matches, as currently used in the US Open; a new 50-40 game reduces the length of matches whilst maintaining comparable probabilities for the better player to win the match.
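A small calculation in the spirit of the record above. Since the abstract does not define the 50-40 game, the sketch assumes it is won by the server on reaching four points and by the receiver on reaching three, with no deuce; the standard-game expression with deuce is the usual textbook formula. Both probabilities are computed for a server who wins each point with probability p.

```python
from functools import lru_cache

def p_standard_game(p):
    """Probability the server wins a standard game (with deuce), winning each point w.p. p."""
    q = 1 - p
    return p**4 * (1 + 4*q + 10*q**2) + 20 * p**3 * q**3 * p**2 / (1 - 2*p*q)

def p_50_40_game(p, server_target=4, receiver_target=3):
    """Probability the server wins a no-deuce game: first to `server_target` points for
    the server, first to `receiver_target` for the receiver (assumed 50-40 rule)."""
    @lru_cache(maxsize=None)
    def win(s, r):
        if s == server_target:
            return 1.0
        if r == receiver_target:
            return 0.0
        return p * win(s + 1, r) + (1 - p) * win(s, r + 1)
    return win(0, 0)

for p in (0.60, 0.65, 0.70):
    print(f"p={p:.2f}  standard game {p_standard_game(p):.3f}  50-40 game {p_50_40_game(p):.3f}")
```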
Maximum likelihood estimates of polar motion parameters

NASA Technical Reports Server (NTRS)

Wilson, Clark R.; Vicente, R. O.

1990-01-01

Two estimators developed by Jeffreys (1940, 1968) are described and used in conjunction with polar-motion data to determine the frequency (Fc) and quality factor (Qc) of the Chandler wobble. Data are taken from a monthly polar-motion series, satellite laser-ranging results, and optical astrometry and intercompared for use via interpolation techniques. Maximum likelihood arguments were employed to develop the estimators, and the assumption that polar motion relates to a Gaussian random process is assessed in terms of the accuracies of the estimators. The present results agree with those from Jeffreys' earlier study but are inconsistent with the later estimator; a Monte Carlo evaluation of the estimators confirms that the 1968 method is more accurate. The later estimator method shows good performance because the Fourier coefficients derived from the data have signal/noise levels that are superior to those for an individual datum. The method is shown to be valuable for general spectral-analysis problems in which isolated peaks must be analyzed from noisy data.

Dimension-independent likelihood-informed MCMC

SciTech Connect

Cui, Tiangang; Law, Kody J. H.; Marzouk, Youssef M.

2015-10-08

Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. Our work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. There are two distinct lines of research that intersect in the methods we develop here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Finally, we use two nonlinear inverse problems in order to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.

Dimension-independent likelihood-informed MCMC

DOE PAGES

Cui, Tiangang; Law, Kody J. H.; Marzouk, Youssef M.

2015-10-08

Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. Our work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. There are two distinct lines of research that intersect in the methods we develop here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Finally, we use two nonlinear inverse problems in order to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
Disequilibrium mapping: Composite likelihood for pairwise disequilibrium

SciTech Connect

Devlin, B.; Roeder, K.; Risch, N.

1996-08-15

The pattern of linkage disequilibrium between a disease locus and a set of marker loci has been shown to be a useful tool for geneticists searching for disease genes. Several methods have been advanced to utilize the pairwise disequilibrium between the disease locus and each of a set of marker loci. However, none of the methods take into account the information from all pairs simultaneously while also modeling the variability in the disequilibrium values due to the evolutionary dynamics of the population. We propose a Composite Likelihood (CL) model that has these features when the physical distances between the marker loci are known or can be approximated. In this instance, and assuming that there is a single disease mutation, the CL model depends on only three parameters: the recombination fraction between the disease locus and an arbitrary marker locus, θ, the age of the mutation, and a variance parameter. When the CL is maximized over a grid of θ, it provides a graph that can direct the search for the disease locus. We also show how the CL model can be generalized to account for multiple disease mutations. Evolutionary simulations demonstrate the power of the analyses, as well as their potential weaknesses. Finally, we analyze the data from two mapped diseases, cystic fibrosis and diastrophic dysplasia, finding that the CL method performs well in both cases. 28 refs., 6 figs., 4 tabs.

On the Likelihood of Supernova Enrichment of Protoplanetary Disks

NASA Astrophysics Data System (ADS)

Williams, Jonathan P.; Gaidos, Eric

2007-07-01

We estimate the likelihood of direct injection of supernova ejecta into protoplanetary disks using a model in which the number of stars with disks decreases linearly with time, and clusters expand linearly with time such that their surface density is independent of stellar number. The similarity of disk dissipation and main-sequence lifetimes implies that the typical supernova progenitor is very massive, ~75-100 Msolar. Such massive stars are found only in clusters with >~10^4 members. Moreover, there is only a small region around a supernova within which disks can survive the blast yet be enriched to the level observed in the solar system. These two factors limit the overall likelihood of supernova enrichment of a protoplanetary disk to <~1%. If the presence of short-lived radionuclides in meteorites is to be explained in this way, however, the solar system most likely formed in one of the largest clusters in the Galaxy, more than 2 orders of magnitude greater than Orion, where multiple supernovae impacted many disks in a short period of time.

Targeted Maximum Likelihood Estimation for Causal Inference in Observational Studies.

PubMed

Schuler, Megan S; Rose, Sherri

2017-01-01

Estimation of causal effects using observational data continues to grow in popularity in the epidemiologic literature. While many applications of causal effect estimation use propensity score methods or G-computation, targeted maximum likelihood estimation (TMLE) is a well-established alternative method with desirable statistical properties. TMLE is a doubly robust maximum-likelihood-based approach that includes a secondary "targeting" step that optimizes the bias-variance tradeoff for the target parameter. Under standard causal assumptions, estimates can be interpreted as causal effects. Because TMLE has not been as widely implemented in epidemiologic research, we aim to provide an accessible presentation of TMLE for applied researchers. We give step-by-step instructions for using TMLE to estimate the average treatment effect in the context of an observational study. We discuss conceptual similarities and differences between TMLE and 2 common estimation approaches (G-computation and inverse probability weighting) and present findings on their relative performance using simulated data. Our simulation study compares methods under parametric regression misspecification; our results highlight TMLE's property of double robustness. Additionally, we discuss best practices for TMLE implementation, particularly the use of ensembled machine learning algorithms. Our simulation study demonstrates all methods using super learning, highlighting that incorporation of machine learning may outperform parametric regression in observational data settings.
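A bare-bones sketch of the TMLE recipe for the average treatment effect described in the record above: an initial outcome regression, a propensity-score model, and a single logistic "targeting" fluctuation along the clever covariate. The simulated data, the plain logistic fits (instead of the ensembled machine learning the authors recommend), and the clipping constants are assumptions of the sketch.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
expit = lambda x: 1.0 / (1.0 + np.exp(-x))

# Simulated observational study: confounders W, binary treatment A, binary outcome Y.
W = rng.normal(size=(n, 2))
A = rng.binomial(1, expit(0.4 * W[:, 0] - 0.3 * W[:, 1]))
Y = rng.binomial(1, expit(-1.0 + 1.0 * A + 0.8 * W[:, 0] - 0.5 * W[:, 1]))

# Step 1: initial outcome regression Q(A, W) (a simple logistic fit here).
XAW = np.column_stack([A, W])
Q_model = LogisticRegression(C=1e6).fit(XAW, Y)
clip = lambda p: np.clip(p, 1e-6, 1 - 1e-6)
Q_AW = clip(Q_model.predict_proba(XAW)[:, 1])
Q_1W = clip(Q_model.predict_proba(np.column_stack([np.ones(n), W]))[:, 1])
Q_0W = clip(Q_model.predict_proba(np.column_stack([np.zeros(n), W]))[:, 1])

# Step 2: propensity score g(W) = P(A=1 | W).
g_W = np.clip(LogisticRegression(C=1e6).fit(W, A).predict_proba(W)[:, 1], 1e-3, 1 - 1e-3)

# Step 3: targeting step -- fluctuate Q along the "clever covariate" so the
# efficient influence-curve equation for the ATE is solved.
H_A = A / g_W - (1 - A) / (1 - g_W)
eps = sm.GLM(Y, H_A.reshape(-1, 1), offset=np.log(Q_AW / (1 - Q_AW)),
             family=sm.families.Binomial()).fit().params[0]
Q1_star = expit(np.log(Q_1W / (1 - Q_1W)) + eps / g_W)
Q0_star = expit(np.log(Q_0W / (1 - Q_0W)) - eps / (1 - g_W))

print("TMLE ATE estimate:", np.mean(Q1_star - Q0_star))
```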
Planck 2013 results. XV. CMB power spectra and likelihood

NASA Astrophysics Data System (ADS)

Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Gaier, T. C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Laureijs, R. J.; Lawrence, C. R.; Le Jeune, M.; Leach, S.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Orieux, F.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.

2014-11-01

This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best estimate of the CMB angular power spectrum from Planck over three decades in multipole moment, ℓ, covering 2 ≤ ℓ ≤ 2500. The main source of uncertainty at ℓ ≲ 1500 is cosmic variance. Uncertainties in small-scale foreground modelling and instrumental noise dominate the error budget at higher ℓs. For ℓ < 50, our likelihood exploits all Planck frequency channels from 30 to 353 GHz, separating the cosmological CMB signal from diffuse Galactic foregrounds through a physically motivated Bayesian component separation technique. At ℓ ≥ 50, we employ a correlated Gaussian likelihood approximation based on a fine-grained set of angular cross-spectra derived from multiple detector combinations between the 100, 143, and 217 GHz frequency channels, marginalising over power spectrum foreground templates. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on the final cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals below a few μK² at ℓ ≲ 1000, in agreement with estimated calibration uncertainties. We compare our results with foreground-cleaned CMB maps derived from all Planck frequencies, as well as with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. We further show that the best-fit ΛCDM cosmology is in excellent agreement with preliminary Planck EE and TE polarisation spectra. We find that the standard ΛCDM cosmology is well constrained by Planck from the measurements at ℓ ≲ 1500. One specific example is the

Leak checker data logging system

DOEpatents

Gannon, J.C.; Payne, J.J.

1996-09-03

A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph reading can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time. 18 figs.
Leak checker data logging system

DOEpatents

Gannon, Jeffrey C.; Payne, John J.

1996-01-01

A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph reading can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time.

Analysis of Web Proxy Logs

NASA Astrophysics Data System (ADS)

Fei, Bennie; Eloff, Jan; Olivier, Martin; Venter, Hein

Network forensics involves capturing, recording and analysing network audit trails. A crucial part of network forensics is to gather evidence at the server level, proxy level and from other sources. A web proxy relays URL requests from clients to a server. Analysing web proxy logs can give unobtrusive insights to the browsing behavior of computer users and provide an overview of the Internet usage in an organisation. More importantly, in terms of network forensics, it can aid in detecting anomalous browsing behavior. This paper demonstrates the use of a self-organising map (SOM), a powerful data mining technique, in network forensics. In particular, it focuses on how a SOM can be used to analyse data gathered at the web proxy level.
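A miniature, from-scratch self-organising map applied to synthetic proxy-log features, to illustrate the kind of analysis the record above describes. The feature set (bytes transferred, hour of day, two site-category flags) and the injected block of unusual sessions are invented for the sketch and are not the authors' data or SOM configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy proxy-log features per session: [log10(bytes), hour/24, is_social, is_news].
n = 300
normal = np.column_stack([rng.normal(3.5, 0.5, n), rng.uniform(0.3, 0.7, n),
                          rng.binomial(1, 0.1, n), rng.binomial(1, 0.3, n)])
anomal = np.column_stack([rng.normal(6.0, 0.3, 20), rng.uniform(0.0, 0.2, 20),
                          rng.binomial(1, 0.8, 20), rng.binomial(1, 0.1, 20)])
X = np.vstack([normal, anomal]).astype(float)
X = (X - X.mean(0)) / X.std(0)                       # standardise features

# Minimal SOM: rectangular grid of weight vectors, online training.
gx, gy, dim = 8, 8, X.shape[1]
W = rng.normal(size=(gx, gy, dim))
coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)

n_iter = 3000
for t in range(n_iter):
    x = X[rng.integers(len(X))]
    d = np.linalg.norm(W - x, axis=2)                # distance of x to every unit
    bmu = np.unravel_index(np.argmin(d), d.shape)    # best-matching unit
    sigma = 3.0 * np.exp(-t / n_iter)                # shrinking neighbourhood radius
    lr = 0.5 * np.exp(-t / n_iter)                   # decaying learning rate
    dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None] # neighbourhood kernel
    W += lr * h * (x - W)

# Map each session to its winning unit; sparsely hit, outlying units flag unusual browsing.
hits = np.zeros((gx, gy), dtype=int)
for x in X:
    d = np.linalg.norm(W - x, axis=2)
    hits[np.unravel_index(np.argmin(d), d.shape)] += 1
print(hits)
```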
Logged In and Zoned Out.

PubMed

Ravizza, Susan M; Uitvlugt, Mitchell G; Fenn, Kimberly M

2017-02-01

Laptop computers are widely prevalent in university classrooms. Although laptops are a valuable tool, they offer access to a distracting temptation: the Internet. In the study reported here, we assessed the relationship between classroom performance and actual Internet usage for academic and nonacademic purposes. Students who were enrolled in an introductory psychology course logged into a proxy server that monitored their online activity during class. Past research relied on self-report, but the current methodology objectively measured time, frequency, and browsing history of participants' Internet usage. In addition, we assessed whether intelligence, motivation, and interest in course material could account for the relationship between Internet use and performance. Our results showed that nonacademic Internet use was common among students who brought laptops to class and was inversely related to class performance. This relationship was upheld after we accounted for motivation, interest, and intelligence. Class-related Internet use was not associated with a benefit to classroom performance.

Interactive, Collaborative, Electronic Learning Logs in the Physics Classroom

NASA Astrophysics Data System (ADS)

Gosling, Chris

2006-12-01

I describe my experiences using Hickman's Interactive Collaborative Electronic (ICE) Learning Logs while teaching HS Physics. ICE Learning Logs are written in student groups to answer questions posed by the instructor, who then in turn responds to each group's entry before the next class. These logs were used with non-physics majors in both algebra- and calculus-based introductory physics courses, and also at the high school level. I found ICE Learning Logs to be a clear improvement over traditional student journals. Excerpts from group entries will be presented to demonstrate the group identities that formed as well as the utility of the journals to probe for conceptual understanding. In addition, the ICE Learning Logs served as an excellent resource for students to review before exams and also to examine for critical moments to reflect on in formal essays. Hickman, P. (2000). Assessing student understanding with interactive-electronic-collaborative learning logs. ENC Focus, 7(2), 24-27. Sponsored by the National Science Foundation DUE0302097 and SUNY-Buffalo State Physics.

Four decades of forest persistence, clearance and logging on Borneo.

PubMed

Gaveau, David L A; Sloan, Sean; Molidena, Elis; Yaen, Husna; Sheil, Doug; Abram, Nicola K; Ancrenaz, Marc; Nasi, Robert; Quinones, Marcela; Wielaard, Niels; Meijaard, Erik

2014-01-01

The native forests of Borneo have been impacted by selective logging, fire, and conversion to plantations at unprecedented scales since industrial-scale extractive industries began in the early 1970s. There is no island-wide documentation of forest clearance or logging since the 1970s. This creates an information gap for conservation planning, especially with regard to selectively logged forests that maintain high conservation potential. Analysing LANDSAT images, we estimate that 75.7% (558,060 km²) of Borneo's area (737,188 km²) was forested around 1973. Based upon a forest cover map for 2010 derived using ALOS-PALSAR and visually reviewing LANDSAT images, we estimate that the 1973 forest area had declined by 168,493 km² (30.2%) in 2010. The highest losses were recorded in Sabah and Kalimantan, with 39.5% and 30.7% of their total 1973 forest area becoming non-forest in 2010, and the lowest in Brunei and Sarawak (8.4% and 23.1%). We estimate that the combined area planted in industrial oil palm and timber plantations in 2010 was 75,480 km², representing 10% of Borneo. We mapped 271,819 km of primary logging roads that were created between 1973 and 2010. The greatest density of logging roads was found in Sarawak, at 0.89 km km⁻², and the lowest density in Brunei, at 0.18 km km⁻². Analyzing MODIS-based tree cover maps, we estimate that logging operated within 700 m of primary logging roads. Using this distance, we estimate that 266,257 km² of 1973 forest cover has been logged. With 389,566 km² (52.8%) of the island remaining forested, of which 209,649 km² remains intact, there is still hope for biodiversity conservation in Borneo. Protecting logged forests from fire and conversion to plantations is an urgent priority for reducing rates of deforestation in Borneo.

Four Decades of Forest Persistence, Clearance and Logging on Borneo

PubMed Central

Gaveau, David L. A.; Sloan, Sean; Molidena, Elis; Yaen, Husna; Sheil, Doug; Abram, Nicola K.; Ancrenaz, Marc; Nasi, Robert; Quinones, Marcela; Wielaard, Niels; Meijaard, Erik

2014-01-01

The native forests of Borneo have been impacted by selective logging, fire, and conversion to plantations at unprecedented scales since industrial-scale extractive industries began in the early 1970s. There is no island-wide documentation of forest clearance or logging since the 1970s. This creates an information gap for conservation planning, especially with regard to selectively logged forests that maintain high conservation potential. Analysing LANDSAT images, we estimate that 75.7% (558,060 km²) of Borneo's area (737,188 km²) was forested around 1973. Based upon a forest cover map for 2010 derived using ALOS-PALSAR and visually reviewing LANDSAT images, we estimate that the 1973 forest area had declined by 168,493 km² (30.2%) in 2010. The highest losses were recorded in Sabah and Kalimantan, with 39.5% and 30.7% of their total 1973 forest area becoming non-forest in 2010, and the lowest in Brunei and Sarawak (8.4% and 23.1%). We estimate that the combined area planted in industrial oil palm and timber plantations in 2010 was 75,480 km², representing 10% of Borneo. We mapped 271,819 km of primary logging roads that were created between 1973 and 2010. The greatest density of logging roads was found in Sarawak, at 0.89 km km⁻², and the lowest density in Brunei, at 0.18 km km⁻². Analyzing MODIS-based tree cover maps, we estimate that logging operated within 700 m of primary logging roads. Using this distance, we estimate that 266,257 km² of 1973 forest cover has been logged. With 389,566 km² (52.8%) of the island remaining forested, of which 209,649 km² remains intact, there is still hope for biodiversity conservation in Borneo. Protecting logged forests from fire and conversion to plantations is an urgent priority for reducing rates of deforestation in Borneo. PMID:25029192

The word frequency effect during sentence reading: A linear or nonlinear effect of log frequency?

PubMed

White, Sarah J; Drieghe, Denis; Liversedge, Simon P; Staub, Adrian

2016-10-20

The effect of word frequency on eye movement behaviour during reading has been reported in many experimental studies. However, the vast majority of these studies compared only two levels of word frequency (high and low). Here we assess whether the effect of log word frequency on eye movement measures is linear, in an experiment in which a critical target word in each sentence was at one of three approximately equally spaced log frequency levels. Separate analyses treated log frequency as a categorical or a continuous predictor. Both analyses showed only a linear effect of log frequency on the likelihood of skipping a word, and on first fixation duration. Ex-Gaussian analyses of first fixation duration showed similar effects on distributional parameters in comparing high- and medium-frequency words, and medium- and low-frequency words. Analyses of gaze duration and the probability of a refixation suggested a nonlinear pattern, with a larger effect at the lower end of the log frequency scale. However, the nonlinear effects were small, and Bayes Factor analyses favoured the simpler linear models for all measures. The possible roles of lexical and post-lexical factors in producing nonlinear effects of log word frequency during sentence reading are discussed.
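A small simulation in the spirit of the record above: fit reading-time-like data with log frequency as a continuous predictor, once linearly and once with an added quadratic term, and compare the fits by BIC (exp of half the BIC difference, quadratic minus linear, gives a rough Bayes-factor approximation in favour of the linear model). The data are synthetic and generated with a purely linear effect; nothing here reproduces the authors' eye-movement corpus or their Bayes-factor analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1500

# Synthetic item-level data: log10 word frequency and a first-fixation-like duration (ms),
# generated with a purely linear frequency effect plus noise.
log_freq = rng.uniform(0.5, 4.0, n)
duration = 260 - 18 * log_freq + rng.normal(0, 30, n)

X_lin = sm.add_constant(log_freq)
X_quad = sm.add_constant(np.column_stack([log_freq, log_freq ** 2]))
fit_lin = sm.OLS(duration, X_lin).fit()
fit_quad = sm.OLS(duration, X_quad).fit()

# A positive delta BIC (quadratic minus linear) favours the simpler, linear model.
print("BIC linear          :", round(fit_lin.bic, 1))
print("BIC with quadratic  :", round(fit_quad.bic, 1))
print("delta BIC (quad-lin):", round(fit_quad.bic - fit_lin.bic, 1))
```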
Correlating Log Messages for System Diagnostics

SciTech Connect

Gunasekaran, Raghul; Dillow, David A; Shipman, Galen M; Maxwell, Don E; Hill, Jason J; Park, Byung H; Geist, Al

2010-01-01

In large-scale computing systems, the sheer volume of log data generated presents daunting challenges for debugging and monitoring of these systems. The Oak Ridge Leadership Computing Facility's premier simulation platform, the Cray XT5 known as Jaguar, can generate a few hundred thousand log entries in less than a minute for many system-level events. Determining the root cause of such system events requires analysis and interpretation of a large number of log messages. Most often, the log messages are best understood when they are interpreted collectively rather than individually. In this paper, we present our approach to interpreting log messages by identifying their commonalities and grouping them into clusters. Given a set of log messages within a time interval, we group the messages based on source, target, and/or error type, and correlate the messages with hardware and application information. We monitor the Lustre log messages in the XT5 console log and show that such grouping of log messages assists in detecting the source of system events. By intelligent grouping and correlation of events in the log, we are able to provide system administrators with meaningful information in a concise format for root cause analysis.
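A minimal sketch of the grouping idea in the record above: parse console-log lines, then bucket messages that share a target and error type within a short time window, so that one underlying event is reported once rather than thousands of times. The log format, field names, and error strings are invented for illustration and are not the actual Cray XT5/Lustre console format.

```python
import re
from collections import defaultdict
from datetime import datetime

# Hypothetical console-log lines; format and error types are invented for the sketch.
raw_log = """\
2010-03-01T12:00:01 src=nid00412 tgt=ost0007 LustreError: timeout on bulk write
2010-03-01T12:00:02 src=nid00412 tgt=ost0007 LustreError: timeout on bulk write
2010-03-01T12:00:02 src=nid00514 tgt=ost0007 LustreError: timeout on bulk write
2010-03-01T12:00:09 src=nid00233 tgt=mds0001 LustreError: connection lost
"""

pattern = re.compile(r"(?P<ts>\S+) src=(?P<src>\S+) tgt=(?P<tgt>\S+) (?P<err>.+)")

# Group messages sharing target and error type within a time window.
window_s = 60
groups = defaultdict(list)
for line in raw_log.splitlines():
    m = pattern.match(line)
    if not m:
        continue
    ts = datetime.fromisoformat(m["ts"])
    bucket = int(ts.timestamp()) // window_s
    groups[(m["tgt"], m["err"], bucket)].append(m["src"])

# One concise line per underlying event, with the number of distinct reporting sources.
for (tgt, err, _), sources in groups.items():
    print(f"{err!r} on {tgt}: {len(sources)} messages from {len(set(sources))} sources")
```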
Computationally Efficient Composite Likelihood Statistics for Demographic Inference.

PubMed

Coffman, Alec J; Hsieh, Ping Hsun; Gravel, Simon; Gutenkunst, Ryan N

2016-02-01

Many population genetics tools employ composite likelihoods, because fully modeling genomic linkage is challenging. But traditional approaches to estimating parameter uncertainties and performing model selection require full likelihoods, so these tools have relied on computationally expensive maximum-likelihood estimation (MLE) on bootstrapped data. Here, we demonstrate that statistical theory can be applied to adjust composite likelihoods and perform robust computationally efficient statistical inference in two demographic inference tools: ∂a∂i and TRACTS. On both simulated and real data, the adjustments perform comparably to MLE bootstrapping while using orders of magnitude less computational time.

Dual excitation acoustic paramagnetic logging tool

DOEpatents

Vail, III, William B.

1989-01-01

New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described.

Dual excitation acoustic paramagnetic logging tool

DOEpatents

Vail, W.B. III.

1989-02-14

New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described. 6 figs.

4D reservoir characterization using well log data for feasible CO2-enhanced oil recovery at Ankleshwar, Cambay Basin - A rock physics diagnostic and modeling approach

NASA Astrophysics Data System (ADS)

Ganguli, Shib Sankar; Vedanti, Nimisha; Dimri, V. P.

2016-12-01

In recent years, rock physics modeling has become an integral part of reservoir characterization as it provides the fundamental relationship between geophysical measurements and reservoir rock properties. These models are also used to quantify the effect of fluid saturation and stress on reservoir rocks by tracking the changes in elastic properties during production. Additionally, various rock physics models can be applied to obtain information on rock properties away from existing drilled wells, which can play a crucial role in the feasibility assessment of a CO2-enhanced oil recovery (EOR) operation in the field. Thus, the objective of this study is to develop a rock-physics model of the Ankleshwar reservoir to predict the reservoir response under CO2-EOR. The Ankleshwar oil field is a mature field situated in Cambay Basin (Western India) that witnessed massive peripheral water flooding for around 40 years. Since the field was under water flooding for a long term, reasonable changes in reservoir elastic properties might have occurred. To identify a potential reservoir zone with significant bypassed (or residual) oil saturation, we applied the diagnostic rock physics models to two available wells from the Ankleshwar oil field. The results clearly indicate transitions from clean sands to shaly sands at the base, and from sandy shale to pure shale at the top of the reservoir pay zone, suggesting a different seismic response at the top when compared to the base of the reservoir in both the wells. We also found that clay content and sorting affect the elastic properties of these sands, indicating different depositional scenarios for the oil sands encountered in the Ankleshwar formation. Nevertheless, the rock physics template (RPT) analysis of the well data provides valuable information about the residual oil zone, a potential target for CO2-EOR. Further, a 4D reservoir characterization study has been conducted to assess the seismic detectability of CO2-EOR, and we
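Rock-physics modelling of a CO2 flood of the kind described above is commonly done with Gassmann fluid substitution, which predicts how the saturated bulk modulus, and hence the P-wave velocity, changes when brine is partly replaced by CO2. The sketch below uses the standard Gassmann equation with generic, assumed sandstone and fluid properties; the numbers are illustrative, not the Ankleshwar field values or necessarily the authors' exact workflow.

```python
def gassmann_ksat(k_dry, k_mineral, k_fluid, phi):
    """Saturated bulk modulus from the Gassmann equation (moduli in GPa, porosity as a fraction)."""
    num = (1.0 - k_dry / k_mineral) ** 2
    den = phi / k_fluid + (1.0 - phi) / k_mineral - k_dry / k_mineral**2
    return k_dry + num / den

def vp(k_sat, mu, rho):
    """P-wave velocity in km/s from bulk and shear moduli (GPa) and bulk density (g/cc)."""
    return ((k_sat + 4.0 * mu / 3.0) / rho) ** 0.5

# Illustrative sandstone and fluid properties (assumed, not field-measured values).
k_dry, mu, k_mineral, phi = 12.0, 10.0, 37.0, 0.22            # GPa, GPa, GPa, fraction
k_brine, rho_brine = 2.6, 1.03                                 # GPa, g/cc
k_co2, rho_co2 = 0.08, 0.60                                    # rough supercritical CO2 values
rho_grain = 2.65

cases = [("brine", k_brine, rho_brine),
         ("60% brine / 40% CO2",
          1.0 / (0.6 / k_brine + 0.4 / k_co2),                 # Reuss (harmonic) fluid mix
          0.6 * rho_brine + 0.4 * rho_co2)]

for label, k_fl, rho_fl in cases:
    k_sat = gassmann_ksat(k_dry, k_mineral, k_fl, phi)
    rho = (1 - phi) * rho_grain + phi * rho_fl
    print(f"{label:>22}: K_sat = {k_sat:5.2f} GPa, Vp = {vp(k_sat, mu, rho):.2f} km/s")
```

The velocity drop between the two cases is the kind of time-lapse (4D) signal whose detectability such studies assess.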
By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole <span class="hlt">logs</span> may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and <span class="hlt">log</span> presentations used with conventional <span class="hlt">logs</span> and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T<span class="hlt">2</span> distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional <span class="hlt">logs</span> and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.osti.gov/scitech/biblio/425899','SCIGOV-STC'); return false;" href="https://www.osti.gov/scitech/biblio/425899"><span>Utilizing magnetic resonance imaging <span class="hlt">logs</span>, openhole <span class="hlt">logs</span>, and sidewall core analyses to evaluate shaly sands for water-free production</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Taylor, D.A.; Morganti, J.K.; White, H.J.; Noblett, B.R.</p> <p>1996-12-31</p> <p>Nuclear magnetic resonance (NMR) <span class="hlt">logging</span> using the new C Series Magnetic Resonance Imaging <span class="hlt">Log</span> (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole <span class="hlt">logs</span> may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and <span class="hlt">log</span> presentations used with conventional <span class="hlt">logs</span> and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T<span class="hlt">2</span> distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional <span class="hlt">logs</span> and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. 
Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  315. Evaluating maximum likelihood estimation methods to determine the Hurst coefficient

    NASA Astrophysics Data System (ADS)

    Kendziorski, C. M.; Bassingthwaighte, J. B.; Tonellato, P. J.

    1999-12-01

    A maximum likelihood estimation method implemented in S-PLUS (S-MLE) to estimate the Hurst coefficient (H) is evaluated. The Hurst coefficient, with 0.5 < H < 1, characterizes long memory time series by quantifying the rate of decay of the autocorrelation function. S-MLE was developed to estimate H for fractionally differenced (fd) processes. However, in practice it is difficult to distinguish between fd processes and fractional Gaussian noise (fGn) processes. Thus, the method is evaluated for estimating H for both fd and fGn processes. S-MLE gave biased results of H for fGn processes of any length and for fd processes of lengths less than 2^10. A modified method is proposed to correct for this bias. It gives reliable estimates of H for both fd and fGn processes of length greater than or equal to 2^11.
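As a rough illustration of the kind of estimator discussed in the record above, the sketch below computes a Gaussian maximum-likelihood estimate of H for a fractional Gaussian noise series by maximizing the concentrated log-likelihood over H. The autocovariance formula is the standard fGn expression; the function names and parameter choices are ours, not those of S-MLE.

    import numpy as np
    from scipy.linalg import toeplitz, cho_factor, cho_solve
    from scipy.optimize import minimize_scalar

    def fgn_autocovariance(H, n):
        """Autocovariance of unit-variance fractional Gaussian noise at lags 0..n-1."""
        k = np.arange(n)
        return 0.5 * (np.abs(k + 1)**(2 * H) - 2 * np.abs(k)**(2 * H) + np.abs(k - 1)**(2 * H))

    def negative_profile_loglik(H, x):
        """Negative Gaussian log-likelihood with the innovation variance profiled out."""
        n = len(x)
        R = toeplitz(fgn_autocovariance(H, n))        # correlation matrix implied by H
        c, low = cho_factor(R)
        log_det = 2.0 * np.sum(np.log(np.diag(c)))    # log|R| from the Cholesky factor
        quad = x @ cho_solve((c, low), x)             # x' R^{-1} x
        sigma2_hat = quad / n                         # profiled variance estimate
        return 0.5 * (n * np.log(sigma2_hat) + log_det + n)

    def estimate_hurst_mle(x):
        """Maximize the concentrated log-likelihood over H in (0.5, 1)."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        res = minimize_scalar(negative_profile_loglik, bounds=(0.51, 0.99),
                              args=(x,), method="bounded")
        return res.x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # White noise has H = 0.5, so the estimate should land near the lower bound.
        print(estimate_hurst_mle(rng.standard_normal(512)))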
  316. MultiLog: a tool for the control and output merging of multiple logging applications.

    PubMed

    Woodruff, Jonathan; Alexander, Jason

    2016-12-01

    MultiLog is a logging tool that controls, gathers, and combines the output, on-the-fly, from existing research and commercial logging applications or "loggers." Loggers record a specific set of user actions on a computing device, helping researchers to better understand environments or interactions, guiding the design of new or improved interfaces and applications. MultiLog reduces researchers' required implementation effort by simplifying the set-up of multiple loggers and seamlessly combining their output. This in turn increases the availability of logging systems to non-technical experimenters for both short-term and longitudinal observation studies. MultiLog supports two operating modes: "researcher mode" where experimenters configure multiple logging systems, and "deployment mode" where the system is deployed to user-study participants' systems. Researcher mode allows researchers to install, configure log filtering and obfuscation, observe near real-time event streams, and save configuration files ready for deployment. Deployment mode simplifies data collection from multiple loggers by running in the system tray at user log-in, starting loggers, combining their output, and securely uploading the data to a web-server. It also supports real-time browsing of log data, pausing of logging, and removal of log lines. Performance evaluations show that MultiLog does not adversely affect system performance, even when simultaneously running several logging systems. Initial studies show the system runs reliably over a period of 10 weeks.

  317. LOG PERIODIC DIPOLE ARRAY WITH PARASITIC ELEMENTS

    DTIC Science & Technology

    The design and measured characteristics of dipole and monopole versions of a log periodic array with parasitic elements are discussed. In a dipole...for the elements to obtain log periodic performance of the antenna. This design with parasitic elements lends itself to a monopole version of the...antenna which has a simplified feeding configuration. The result is a log periodic antenna design that can be used from high frequencies through microwave frequencies.

  318. VizieR Online Data Catalog: CoRoT observation log (N2-4.4) (CoRoT 2016)

    NASA Astrophysics Data System (ADS)

    COROT Team

    2014-03-01

    CoRoT, a space astronomy mission, has measured photometric micro-variability of stars from minutes to months (up to 150 days) with a high duty cycle (more than 90%). The mission was led by CNES in association with four French laboratories and 7 participating countries and agencies (Austria, Belgium, Brazil, Germany, Spain, and the ESA Science Programme). The satellite was composed of a PROTEUS platform (the 3rd in the series) and a unique instrument: a stellar rapid photometer. It was launched on December 27th 2006 by a Soyuz Rocket, from Baikonur. The mission has lasted almost 6 years (the nominal 3-year duration and a 3-year extension) and has observed more than 160 000 stars. It stopped sending data on November 2nd 2012. Two regions of the sky were accessible for long periods of time: circles of 10 degrees centered on the equator around alpha=06:50 and alpha=18:50. They were called the CoRoT eyes: the "anticenter" and the "center eye" (as they are approximately in these directions). Each pointing covers 1.4x2.8 square degrees within one of those CoRoT eyes. The original scientific objectives were focussed on the study of stellar pulsations (asteroseismology) to probe the internal structure of stars, and the detection of small exoplanets through their transiting in front of their host star, and the measurement of their size.
This led to the introduction of two modes of observation, working simultaneously: the "bright star" mode, dedicated to very precise seismology of a small sample of bright and nearby stars, and the "faint star" mode, observing a very large number of stars at the same time to detect transits, which are rare events, as they imply the alignment of the star, the planet and the observer. The large amount of data gathered in this mode turned out to be extremely fruitful for many topics of stellar physics. Beyond these two initial objectives, CoRoT data revealed stellar variability associated with various other phenomena: granulation, rotational modulation

  319. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  320. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast...
made comparing the radio station clock with standard time, including errors observed and...

  321. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  322. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast... made comparing the radio station clock with standard time, including errors observed and...

  323. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... principal radiotelegraph operating room during the voyage. (c) Public coast station logs. Public coast...
made comparing the radio station clock with standard time, including errors observed and...

  324. Optimal message log reclamation for independent checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. Kent

    1993-01-01

    Independent (uncoordinated) checkpointing for parallel and distributed systems allows maximum process autonomy but suffers from possible domino effects and the associated storage space overhead for maintaining multiple checkpoints and message logs. In most research on checkpointing and recovery, it was assumed that only the checkpoints and message logs older than the global recovery line can be discarded. It is shown how recovery line transformation and decomposition can be applied to the problem of efficiently identifying all discardable message logs, thereby achieving optimal garbage collection. Communication trace-driven simulation for several parallel programs is used to show the benefits of the proposed algorithm for message log reclamation.

  325. Logging with coiled tubing less effective than with drill pipe

    SciTech Connect

    Van Den Bosch, R.

    1994-01-31

    Coiled tubing offered neither economic nor operational advantages over drill pipe for conveying logging tools in open hole shallow horizontal wells in Germany. In the past 2 years, Mobil Erdgas-Erdoel GmbH (MEEG) participated in completing eight shallow horizontal wells. These were medium-to-short radius wells at measured depths of between 850 and 2,000 m. The average horizontal section was 350 m. The logging tools were conveyed by coiled tubing or drill pipe. MEEG attempted to log five wells with coiled tubing-conveyed tools, four with 1 1/2-in. tubing. Total depth was reached reliably in only one well, the shallowest and with the shortest horizontal section. Simulation programs were unreliable for calculating the downhole forces of the coil/tool combination or predicting possible helical lockups. In wells with drill pipe-conveyed logs, the tool combination could always be pushed to total depth, and the operations were generally faster and cost less than logging with coiled tubing. Also, drill pipe allowed longer and heavier tool strings.
For reliable operations, coiled tubing needs to be more rigid, rig-up/rig-down times need to be improved, and the simulation programs must be more reliable for predicting downhole lock-up.

  326. Flow rate logging seepage meter

    NASA Technical Reports Server (NTRS)

    Reay, William G. (Inventor); Walthall, Harry G. (Inventor)

    1996-01-01

    An apparatus for remotely measuring and logging the flow rate of groundwater seepage into surface water bodies. As groundwater seeps into a cavity created by a bottomless housing, it displaces water through an inlet and into a waterproof sealed upper compartment, at which point, the water is collected by a collection bag, which is contained in a bag chamber. A magnet on the collection bag approaches a proximity switch as the collection bag fills, and eventually enables the proximity switch to activate a control circuit. The control circuit then rotates a three-way valve from the collection path to a discharge path, enables a data logger to record the time, and enables a pump, which discharges the water from the collection bag, through the three-way valve and pump, and into the sea. As the collection bag empties, the magnet leaves the proximity of the proximity switch, and the control circuit turns off the pump, resets the valve to provide a collection path, and restarts the collection cycle.

  327. Interpretation of borehole geophysical logs, aquifer-isolation tests, and water quality, supply wells 1 and 2, Willow Grove Naval Air Station/Joint Reserve Base, Horsham Township, Montgomery County, Pennsylvania

    USGS Publications Warehouse

    Sloto, Ronald A.; Goode, Daniel J.; Frasch, Steven M.

    2002-01-01

    Ground water pumped from supply wells 1 and 2 on the Willow Grove Naval Air Station/Joint Reserve Base (NAS/JRB) provides water for use at the base, including potable water for drinking. The supply wells have been contaminated by volatile organic compounds (VOCs), particularly trichloroethylene (TCE) and tetrachloroethylene (PCE), and the water is treated to remove the VOCs. The Willow Grove NAS/JRB and surrounding area are underlain by sedimentary rocks of the Triassic-age Stockton Formation, which form a complex, heterogeneous aquifer. The ground-water-flow system for the supply wells was characterized by use of borehole geophysical logs and heatpulse-flowmeter measurements. The heatpulse-flowmeter measurements showed upward and downward borehole flow under nonpumping conditions in both wells. The hydraulic and chemical properties of discrete water-bearing fractures in the supply wells were characterized by isolating each water-bearing fracture with straddle packers.
Eight fractures in supply well 1 and five fractures in supply well 2 were selected for testing on the basis of the borehole geophysical logs and borehole television surveys. Water samples were collected from each isolated fracture and analyzed for VOCs and inorganic constituents. Fractures at 50–59, 79–80, 196, 124–152, 182, 241, 256, and 350–354 ft btoc (feet below top of casing) were isolated in supply well 1. Specific capacities ranged from 0.26 to 5.7 (gal/min)/ft (gallons per minute per foot) of drawdown. The highest specific capacity was for the fracture isolated at 179.8–188 ft btoc. Specific capacity and depth of fracture were not related in either supply well. The highest concentrations of PCE were in water samples collected from fractures isolated at 236.8–245 and 249.8–258 ft btoc, which are hydraulically connected. The concentration of PCE generally increased with depth to a maximum of 39 µg/L (micrograms per liter) at a depth of 249.8–258 ft btoc and then decreased to 21 µg/L at a

  328. Nonparametric Bayes Factors Based On Empirical Likelihood Ratios

    PubMed Central

    Vexler, Albert; Deng, Wei; Wilding, Gregory E.

    2012-01-01

    Bayes methodology provides posterior distribution functions based on parametric likelihoods adjusted for prior distributions. A distribution-free alternative to the parametric likelihood is use of empirical likelihood (EL) techniques, well known in the context of nonparametric testing of statistical hypotheses. Empirical likelihoods have been shown to exhibit many of the properties of conventional parametric likelihoods. In this article, we propose and examine Bayes factors (BF) methods that are derived via the EL ratio approach. Following Kass & Wasserman [10], we consider Bayes factors type decision rules in the context of standard statistical testing techniques. We show that the asymptotic properties of the proposed procedure are similar to the classical BF's asymptotic operating characteristics. Although we focus on hypothesis testing, the proposed approach also yields confidence interval estimators of unknown parameters. Monte Carlo simulations were conducted to evaluate the theoretical results as well as to demonstrate the power of the proposed test. PMID:23180904
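The empirical likelihood ratio that such Bayes-factor constructions build on can be computed directly. The sketch below evaluates an Owen-style empirical likelihood ratio for a hypothesized mean by solving the usual Lagrange-multiplier equation; it is a minimal illustration of the EL ratio itself, under our own naming, and not the Bayes factor procedure proposed in the article.

    import numpy as np
    from scipy.optimize import brentq

    def neg2_log_el_ratio(x, mu):
        """-2 log empirical likelihood ratio for the mean mu (Owen-style EL)."""
        x = np.asarray(x, dtype=float)
        z = x - mu
        if not (x.min() < mu < x.max()):
            return np.inf                          # mu outside the convex hull: EL is zero
        # The multiplier lam solves sum z_i / (1 + lam z_i) = 0,
        # subject to 1 + lam z_i > 0 for every observation.
        def score(lam):
            return np.sum(z / (1.0 + lam * z))
        eps = 1e-10
        lo = -1.0 / z.max() + eps                  # keep all implied weights positive
        hi = -1.0 / z.min() - eps
        lam = brentq(score, lo, hi)
        return 2.0 * np.sum(np.log1p(lam * z))

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        data = rng.exponential(scale=2.0, size=80)
        # Calibrated against a chi-square(1) reference, as in Owen's EL theorem.
        print(neg2_log_el_ratio(data, mu=2.0))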
  329. 40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... casing: (i) Resistivity, spontaneous potential, porosity, caliper, gamma ray, fracture finder logs, and... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...

  330. Geological and petrophysical characterization of the Ferron Sandstone for 3-D simulation of a fluvial-deltaic reservoir. Deliverable 2.5.4, Ferron Sandstone lithologic strip logs, Emery & Sevier Counties, Utah: Volume I

    SciTech Connect

    Allison, M.L.

    1995-12-08

    Strip logs for 491 wells were produced from a digital subsurface database of lithologic descriptions of the Ferron Sandstone Member of the Mancos Shale. This subsurface database covers wells from the parts of Emery and Sevier Counties in central Utah that occur between Ferron Creek on the north and Last Chance Creek on the south. The lithologic descriptions were imported into a logging software application designed for the display of stratigraphic data. Strip logs were produced at a scale of one inch equals 20 feet. The strip logs were created as part of a study by the Utah Geological Survey to develop a comprehensive, interdisciplinary, and qualitative characterization of a fluvial-deltaic reservoir using the Ferron Sandstone as a surface analogue. The study was funded by the U.S. Department of Energy (DOE) under the Geoscience/Engineering Reservoir Characterization Program.

  331. Alternative methods for estimating common descriptors for QSAR studies of dyes and fluorescent probes using molecular modeling software. 2. Correlations between log P and the hydrophilic/lipophilic index, and new methods for estimating degrees of amphiphilicity.

    PubMed

    Dapson, Richard W; Horobin, Richard W

    2013-11-01

    The log P descriptor, despite its usefulness, can be difficult to use, especially for researchers lacking skills in physical chemistry. Moreover this classic measure has been determined in numerous ways, which can result in inconsistent estimates of log P values, especially for relatively complex molecules such as fluorescent probes. Novel measures of hydrophilicity/lipophilicity (the Hydrophilic/Lipophilic Index, HLI) and amphiphilicity (hydrophilic/lipophilic indices for the head group and tail, HLIT and HLIHG, respectively) therefore have been devised. We compare these descriptors with measures based on log P, the standard method for quantitative structure activity relationships (QSAR) studies. HLI can be determined using widely available molecular modeling software, coupled with simple arithmetic calculations.
It is based on partial atomic charges and is intended to be a stand-alone measure of hydrophilicity/lipophilicity. Given the wide application of log P, however, we investigated the correlation between HLI and log P using a test set of 56 fluorescent probes of widely different physicochemical character. Overall correlation was poor; however, correlation of HLI and log P for probes of narrowly specified charge types, i.e., non-ionic compounds, anions, conjugated cations, or zwitterions, was excellent. Values for probes with additional nonconjugated quaternary cations, however, were less well correlated. The newly devised HLI can be divided into domain-specific descriptors, HLIT and HLIHG in amphiphilic probes. Determinations of amphiphilicity, made independently by the authors using their respective methods, showed excellent agreement. Quantifying amphiphilicity from partial log P values of the head group (head group hydrophilicity; HGH) and tail (amphiphilicity index; AI) has proved useful for understanding fluorescent probe action. The same limitations of log P apply to HGH and AI, however. The novel descriptors, HLIT and HLIHG, offer analogous advantages

  332. Tolerance bounds for log gamma regression models

    NASA Technical Reports Server (NTRS)

    Jones, R. A.; Scholz, F. W.; Ossiander, M.; Shorack, G. R.

    1985-01-01

    The present procedure for finding lower confidence bounds for the quantiles of Weibull populations, on the basis of the solution of a quadratic equation, is more accurate than current Monte Carlo tables and extends to any location-scale family. It is shown that this method is accurate for all members of the log gamma(K) family, where K = 1/2 to infinity, and works well for censored data, while also extending to regression data. An even more accurate procedure involving an approximation to the Lawless (1982) conditional procedure, with numerical integrations whose tables are independent of the data, is also presented. These methods are applied to the case of failure strengths of ceramic specimens from each of three billets of Si3N4, which have undergone flexural strength testing.

  333. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 90.412 Section 90.412....412 Data logging.
(a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  334. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  335. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  336. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  337. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging.
91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  338. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  339. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  340. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart.
(b) Determine from the data...

  341. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  342. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart.
(b) Determine from the data collection records...

  343. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  344. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  345. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  346. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging.
(a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  347. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  348. Discover Presidential Log Cabins. Teacher's Discussion Guide.

    ERIC Educational Resources Information Center

    National Park Service (Dept. of Interior), Washington, DC.

    Discover Presidential Log Cabins is a set of materials designed to help educate 6-8 grade students about the significance of three log cabin sites occupied by George Washington, Ulysses Grant, Abraham Lincoln, and Theodore Roosevelt. This teacher's discussion guide is intended for use as part of a larger, comprehensive social studies program, and…

  349. Effect of formal and informal likelihood functions on uncertainty assessment in a single event rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran

    2016-09-01

    In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate uncertainty of parameters of the HEC-HMS model in Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of 24 parameters used in HMS, three flood events were used to calibrate and one flood event was used to validate the posterior distributions. Moreover, performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions, L1-L4, Nash-Sutcliffe (NS) efficiency, Normalized absolute error (NAE), Index of agreement (IOA), and Chiew-McMahon efficiency (CM), are considered informal, whereas the remaining (L5-L7) are formal.
L5 focuses on the relationship between the traditional least squares fitting and the Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, sensitivities of the parameters strongly depend on the likelihood function, and vary for different likelihood functions. Most of the parameters were better defined by formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions depict almost a similar effect on sensitivity of parameters. 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, results showed that the DREAM(ZS) algorithm performed better under formal likelihood functions L5 and L7
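To make the distinction between these formal likelihoods concrete, the sketch below evaluates two residual-based log-likelihoods for a vector of model errors: an independent Gaussian version in the spirit of L5 and a first-order autoregressive version in the spirit of L7. It is only an illustrative reading of the abstract, with our own function names, not the exact formulations used in the study.

    import numpy as np

    def gaussian_loglik(residuals):
        """Independent, homoscedastic Gaussian log-likelihood of model residuals."""
        e = np.asarray(residuals, dtype=float)
        n = e.size
        sigma2 = np.mean(e**2)                       # ML estimate of the error variance
        return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

    def ar1_loglik(residuals, rho):
        """Gaussian log-likelihood with AR(1) serial correlation in the residuals."""
        e = np.asarray(residuals, dtype=float)
        n = e.size
        nu = e[1:] - rho * e[:-1]                    # innovations after removing the AR(1) part
        sigma2 = (e[0]**2 * (1 - rho**2) + np.sum(nu**2)) / n
        return (-0.5 * n * np.log(2 * np.pi * sigma2)
                + 0.5 * np.log(1 - rho**2)           # exact treatment of the first residual
                - 0.5 * (e[0]**2 * (1 - rho**2) + np.sum(nu**2)) / sigma2)

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        # Serially correlated errors: the AR(1) likelihood should dominate.
        e = np.zeros(200)
        for t in range(1, 200):
            e[t] = 0.7 * e[t - 1] + rng.standard_normal()
        print(gaussian_loglik(e), ar1_loglik(e, rho=0.7))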
  350. Designing and Piloting a Leadership Daily Practice Log: Using Logs to Study the Practice of Leadership

    ERIC Educational Resources Information Center

    Spillane, James P.; Zuberi, Anita

    2009-01-01

    Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

  351. Log-Gabor Weber descriptor for face recognition

    NASA Astrophysics Data System (ADS)

    Li, Jing; Sang, Nong; Gao, Changxin

    2015-09-01

    The Log-Gabor transform, which is suitable for analyzing gradually changing data such as in iris and face images, has been widely used in image processing, pattern recognition, and computer vision. In most cases, only the magnitude or phase information of the Log-Gabor transform is considered. However, the complementary effect taken by combining magnitude and phase information simultaneously for an image-feature extraction problem has not been systematically explored in the existing works. We propose a local image descriptor for face recognition, called Log-Gabor Weber descriptor (LGWD). The novelty of our LGWD is twofold: (1) to fully utilize the information from the magnitude or phase feature of the multiscale and orientation Log-Gabor transform, we apply the Weber local binary pattern operator to each transform response. (2) The encoded Log-Gabor magnitude and phase information are fused at the feature level by utilizing a kernel canonical correlation analysis strategy, considering that feature level information fusion is effective when the modalities are correlated. Experimental results on the AR, Extended Yale B, and UMIST face databases, compared with those available from recent experiments reported in the literature, show that our descriptor yields a better performance than state-of-the-art methods.
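A brief sketch of the building block such a descriptor starts from: constructing a single log-Gabor band in the frequency domain and taking the magnitude and phase of its response. The radial transfer function follows the usual exp(-(log(f/f0))^2 / (2 log(sigma_ratio)^2)) form; the parameter values and function names here are illustrative assumptions, not those of the LGWD paper.

    import numpy as np

    def log_gabor_response(image, wavelength=8.0, sigma_ratio=0.55,
                           orientation=0.0, angular_sigma=np.pi / 6):
        """Filter an image with one log-Gabor band; return magnitude and phase maps."""
        rows, cols = image.shape
        v = np.fft.fftfreq(rows)[:, None]
        u = np.fft.fftfreq(cols)[None, :]
        radius = np.hypot(u, v)
        radius[0, 0] = 1.0                           # avoid log(0) at the DC term
        theta = np.arctan2(-v, u)

        f0 = 1.0 / wavelength                        # centre frequency of the band
        radial = np.exp(-(np.log(radius / f0) ** 2) /
                        (2 * np.log(sigma_ratio) ** 2))
        radial[0, 0] = 0.0                           # a log-Gabor filter has no DC component

        dtheta = np.angle(np.exp(1j * (theta - orientation)))
        angular = np.exp(-(dtheta ** 2) / (2 * angular_sigma ** 2))

        response = np.fft.ifft2(np.fft.fft2(image) * radial * angular)
        return np.abs(response), np.angle(response)

    if __name__ == "__main__":
        img = np.random.default_rng(3).random((64, 64))
        mag, phase = log_gabor_response(img)
        print(mag.shape, phase.shape)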
  352. Maximum-likelihood joint image reconstruction and motion estimation with misaligned attenuation in TOF-PET/CT

    NASA Astrophysics Data System (ADS)

    Bousse, Alexandre; Bertolli, Ottavia; Atkinson, David; Arridge, Simon; Ourselin, Sébastien; Hutton, Brian F.; Thielemans, Kris

    2016-02-01

    This work is an extension of our recent work on joint activity reconstruction/motion estimation (JRM) from positron emission tomography (PET) data. We performed JRM by maximization of the penalized log-likelihood in which the probabilistic model assumes that the same motion field affects both the activity distribution and the attenuation map. Our previous results showed that JRM can successfully reconstruct the activity distribution when the attenuation map is misaligned with the PET data, but converges slowly due to the significant cross-talk in the likelihood. In this paper, we utilize time-of-flight PET for JRM and demonstrate that the convergence speed is significantly improved compared to JRM with conventional PET data.

  353. Logs of Paleoseismic Excavations Across the Central Range Fault, Trinidad

    USGS Publications Warehouse

    Crosby, Christopher J.; Prentice, Carol S.; Weber, John; Ragona, Daniel

    2009-01-01

    ... Table 1 are shown in red on the trench logs. All radiocarbon ages are calibrated and given with 2 standard deviation age ranges. Our studies suggest that the Central Range Fault is a Holocene fault capable of producing damaging earthquakes in Trinidad.

  354. Likelihood ratio meta-analysis: New motivation and approach for an old method.

    PubMed

    Dormuth, Colin R; Filion, Kristian B; Platt, Robert W

    2016-03-01

    A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded to exclude the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate, and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed effect and random effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience.
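A minimal sketch of the pooling idea described in the record above: approximate each study's log-likelihood for the log effect size by a quadratic (normal) curve, sum the curves, and read off the combined maximum and a likelihood-based support interval. The data are hypothetical, and the 1/32 support threshold is an illustrative choice, not necessarily the intrinsic-interval calibration used by the authors.

    import numpy as np

    def combined_loglr(theta_grid, estimates, std_errors):
        """Sum per-study normal-approximation log-likelihoods over a grid of effect sizes."""
        theta = np.asarray(theta_grid)[:, None]
        est = np.asarray(estimates)[None, :]
        se = np.asarray(std_errors)[None, :]
        loglik = -0.5 * ((theta - est) / se) ** 2     # per-study log-likelihood (up to constants)
        total = loglik.sum(axis=1)
        return total - total.max()                    # combined log-likelihood ratio vs. its maximum

    if __name__ == "__main__":
        # Hypothetical log-relative-risk estimates and standard errors from three studies.
        est = np.array([0.10, 0.25, 0.18])
        se = np.array([0.12, 0.20, 0.08])
        grid = np.linspace(-0.5, 1.0, 3001)
        loglr = combined_loglr(grid, est, se)
        pooled = grid[np.argmax(loglr)]
        support = grid[loglr >= np.log(1 / 32)]       # 1/32 likelihood support interval
        print(f"pooled estimate {pooled:.3f}, interval ({support[0]:.3f}, {support[-1]:.3f})")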
  355. Detection of quantitative trait loci affecting caffeine metabolism by interval mapping in a genome-wide scan of C3H/HeJ x APN F(2) mice.

    PubMed

    Casley, W L; Menzies, J A; Whitehouse, L W; Moon, T W

    1999-12-01

    Caffeine metabolite ratios have been widely used to measure cytochrome P-450 1A2 activity in humans. Serum paraxanthine/caffeine ratio is one such index of this activity. We had previously demonstrated genetic variation of this trait among inbred mouse strains. In the present study, we have undertaken a genome-wide scan for quantitative trait loci affecting this trait with an interval mapping approach on an F(2) intercross population of acetaminophen nonsusceptible and C3H/HeJ inbred mice. A statistically significant association (log-likelihood ratio = 25.0) between a locus on chromosome 9, which colocalized with the murine Cyp1a2 locus, and the plasma paraxanthine/caffeine ratio was identified. This result suggested the presence of an expression polymorphism affecting this gene. A second locus was identified on chromosome 1 (log-likelihood ratio = 9.7) for which no obvious candidate gene has been identified. The influence of this locus on the paraxanthine/caffeine index was more significant among males (log-likelihood ratio = 6.3) than females (log-likelihood ratio = 3.6). A third locus was identified on chromosome 4 with a less statistically robust association (log-likelihood ratio = 3.4) to the paraxanthine/caffeine phenotype. Collectively, these three loci accounted for 63.2% of the variation observed in the F(2) population for this phenotype. These results demonstrate the potential for genetic variation arising from factors other than CYP1A2 activity to influence the plasma paraxanthine/caffeine ratio in mice. This study demonstrates the utility of quantitative genetics in the analysis of polygenic drug metabolism.
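As a simplified illustration of the log-likelihood ratio statistic reported above, the sketch below compares a single-locus model with genotype-specific phenotype means against a null model with one common mean, using Gaussian likelihoods. Full interval mapping additionally handles positions between markers with a mixture likelihood, which this sketch omits; all names and simulated data here are ours.

    import numpy as np

    def single_marker_llr(phenotype, genotype):
        """Log-likelihood ratio of genotype-specific means vs. a single common mean."""
        y = np.asarray(phenotype, dtype=float)
        g = np.asarray(genotype)
        n = y.size
        rss_null = np.sum((y - y.mean()) ** 2)
        rss_full = sum(np.sum((y[g == level] - y[g == level].mean()) ** 2)
                       for level in np.unique(g))
        # With the Gaussian variance profiled out, 2*(loglik_full - loglik_null)
        # reduces to n * log(RSS_null / RSS_full).
        llr = n * np.log(rss_null / rss_full)
        lod = llr / (2 * np.log(10))                  # conventional LOD score
        return llr, lod

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        geno = rng.integers(0, 3, size=300)            # F2 genotypes coded 0/1/2
        pheno = 0.5 * geno + rng.standard_normal(300)  # phenotype with a genotype effect
        print(single_marker_llr(pheno, geno))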
Benefits of maximum likelihood estimators for fracture attribute analysis: Implications for permeability and up-scaling

NASA Astrophysics Data System (ADS)

Rizzo, R. E.; Healy, D.; De Siena, L.

2017-02-01

The success of any predictive model is largely dependent on the accuracy with which its parameters are known. When characterising fracture networks in rocks, one of the main issues is accurately scaling the parameters governing the distribution of fracture attributes. Optimal characterisation and analysis of fracture lengths and apertures are fundamental to estimate bulk permeability and therefore fluid flow, especially for rocks with low primary porosity where most of the flow takes place within fractures. We collected outcrop data from a fractured upper Miocene biosiliceous mudstone formation (California, USA), which exhibits seepage of bitumen-rich fluids through the fractures. The dataset was analysed using Maximum Likelihood Estimators to extract the underlying scaling parameters, and we found a log-normal distribution to be the best representative statistic for both fracture lengths and apertures in the study area. By applying Maximum Likelihood Estimators on outcrop fracture data, we generate fracture network models with the same statistical attributes as those observed on outcrop, from which we can achieve more robust predictions of bulk permeability.
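For a log-normal attribute distribution, the maximum likelihood fit has a closed form, which the short sketch below illustrates on hypothetical fracture lengths (not the California outcrop data).

# Minimal sketch of maximum likelihood fitting of a log-normal distribution to
# fracture lengths: the MLEs are the mean and (1/n) standard deviation of the
# log-transformed data. Sample values are hypothetical.
import numpy as np

lengths_m = np.array([0.12, 0.30, 0.45, 0.08, 1.20, 0.66, 0.25, 2.10, 0.51, 0.095])

log_lengths = np.log(lengths_m)
mu_hat = log_lengths.mean()           # MLE of the log-scale location parameter
sigma_hat = log_lengths.std(ddof=0)   # MLE uses the 1/n (not 1/(n-1)) variance

print(f"mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")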
A NEW LOG EVALUATION METHOD TO APPRAISE MESAVERDE RE-COMPLETION OPPORTUNITIES

SciTech Connect

Albert Greer

2002-09-11

This report covers the information presented at the technical progress review. The following topics were discussed: (1) Structure of the study area dataset. (2) A visual illustration of the typical logs. (3) Crossplots developed and evaluated for patterns that might bear a relationship to initial producing rate. (4) Development of fuzzy curves to rank the goodness of the statistical properties of the logs used to construct the crossplots. (5) Neural network correlations developed with statistical properties of the logs.

MAIL LOG, program theory, volume 1. [Scout project automatic data system]

NASA Technical Reports Server (NTRS)

Harris, D. K.

1979-01-01

The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.

Nonblocking and orphan free message logging protocols

NASA Technical Reports Server (NTRS)

Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith

1992-01-01

Currently existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery; however, it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead as compared to one implemented for a system in which messages may be lost but processes do not crash.

Recognizing Patterns In Log-Polar Coordinates

NASA Technical Reports Server (NTRS)

Weiman, Carl F. R.

1992-01-01

Log-Hough transform is basis of improved method for recognition of patterns - particularly, straight lines - in noisy images. Takes advantage of rotational and scale invariance of mapping from Cartesian to log-polar coordinates, and offers economy of representation and computation. Unification of iconic and Hough domains simplifies computations in recognition and eliminates erroneous quantization of slopes attributable to finite spacing of Cartesian coordinate grid of classical Hough transform. Equally efficient in recognizing curves. Log-Hough transform more amenable to massively parallel computing architectures than traditional Cartesian Hough transform. "In-place" nature makes it possible to apply local pixel-neighborhood processing.
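The invariances mentioned in the log-polar entry come directly from the coordinate transform itself: rotation about the origin becomes a shift in the angular coordinate, and uniform scaling becomes a shift in log-radius. The sketch below illustrates only that transform, on made-up points, and not the log-Hough voting scheme.

# Minimal sketch of the Cartesian-to-log-polar mapping that underlies log-polar
# pattern recognition. This shows the coordinate transform only, not the log-Hough
# transform described in the abstract.
import numpy as np

def to_log_polar(x, y):
    """Map Cartesian points (x, y) away from the origin to (log r, theta)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    r = np.hypot(x, y)
    return np.log(r), np.arctan2(y, x)

log_r, theta = to_log_polar([1.0, 2.0], [1.0, 2.0])
log_r_scaled, theta_scaled = to_log_polar([3.0, 6.0], [3.0, 6.0])  # same points scaled by 3
print(log_r_scaled - log_r)   # constant shift of log(3) in log-radius
print(theta_scaled - theta)   # angles unchanged by uniform scaling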
Weibull distribution based on maximum likelihood with interval inspection data

NASA Technical Reports Server (NTRS)

Rheinfurth, M. H.

1985-01-01

The two Weibull parameters based upon the method of maximum likelihood are determined. The test data used were failures observed at inspection intervals. The application was the reliability analysis of the SSME oxidizer turbine blades.
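A minimal sketch of this kind of estimation is given below: each failure contributes F(b) − F(a) to the likelihood, where (a, b] is the inspection interval bracketing it, and units surviving past the last inspection contribute 1 − F(t). The interval counts, censoring, and starting values are hypothetical, not the turbine-blade data.

# Minimal sketch of Weibull maximum likelihood estimation from interval inspection
# data, assuming grouped failures and right-censored survivors. Hypothetical data.
import numpy as np
from scipy.optimize import minimize

# (a, b, number of failures observed in the interval (a, b]), plus survivors.
intervals = [(0.0, 100.0, 2), (100.0, 200.0, 5), (200.0, 300.0, 3)]
n_survivors, last_inspection = 10, 300.0

def weibull_cdf(t, shape, scale):
    return 1.0 - np.exp(-(t / scale) ** shape)

def neg_log_lik(params):
    shape, scale = np.exp(params)      # optimize on the log scale to keep both positive
    ll = sum(k * np.log(weibull_cdf(b, shape, scale) - weibull_cdf(a, shape, scale))
             for a, b, k in intervals)
    ll += n_survivors * np.log(1.0 - weibull_cdf(last_inspection, shape, scale))
    return -ll

res = minimize(neg_log_lik, x0=np.log([1.5, 250.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(shape_hat, scale_hat)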
The Multi-Mission Maximum Likelihood Framework (3ML)

NASA Astrophysics Data System (ADS)

Burgess, J. M.; Vianello, G.

2016-10-01

We introduce a new tool for multi-messenger astronomy capable of fitting data from multiple instruments properly via the use of independent likelihood plugins. 3ML represents a step forward in spectral and spatial analysis across all wavelengths.

Investigating bias in maximum-likelihood quantum-state tomography

NASA Astrophysics Data System (ADS)

Silva, G. B.; Glancy, S.; Vasconcelos, H. M.

2017-02-01

Maximum-likelihood quantum-state tomography yields estimators that are consistent, provided that the likelihood model is correct, but the maximum-likelihood estimators may have bias for any finite data set. The bias of an estimator is the difference between the expected value of the estimate and the true value of the parameter being estimated. This paper investigates bias in the widely used maximum-likelihood quantum-state tomography. Our goal is to understand how the amount of bias depends on factors such as the purity of the true state, the number of measurements performed, and the number of different bases in which the system is measured. For this, we perform numerical experiments that simulate optical homodyne tomography of squeezed thermal states under various conditions, perform tomography, and estimate bias in the purity of the estimated state. We find that estimates of higher purity states exhibit considerable bias, such that the estimates have lower purities than the true states.

Maximum Likelihood Factor Structure of the Family Environment Scale.

ERIC Educational Resources Information Center

Fowler, Patrick C.

1981-01-01

Presents the maximum likelihood factor structure of the Family Environment Scale. The first bipolar dimension, "cohesion v conflict," measures relationship-centered concerns, while the second unipolar dimension is an index of "organizational and control" activities. (Author)

10 CFR 39.13 - Specific licenses for well logging.

Code of Federal Regulations, 2010 CFR

2010-01-01

... Energy NUCLEAR REGULATORY COMMISSION LICENSES AND RADIATION SAFETY REQUIREMENTS FOR WELL LOGGING Specific... this chapter for special nuclear material, as appropriate, and any special requirements contained in... procedures to the Commission. The description must include the— (1) Instruments to be used; (2) Methods...

Application Scenarios for Nonstandard Log-Linear Models

ERIC Educational Resources Information Center

Mair, Patrick; von Eye, Alexander

2007-01-01

In this article, the authors have 2 aims. First, hierarchical, nonhierarchical, and nonstandard log-linear models are defined. Second, application scenarios are presented for nonhierarchical and nonstandard models, with illustrations of where these scenarios can occur. Parameters can be interpreted in regard to their formal meaning and in regard…
Improved maximum likelihood reconstruction of complex multi-generational pedigrees.

PubMed

Sheehan, Nuala A; Bartlett, Mark; Cussens, James

2014-11-01

The reconstruction of pedigrees from genetic marker data is relevant to a wide range of applications. Likelihood-based approaches aim to find the pedigree structure that gives the highest probability to the observed data. Existing methods either entail an exhaustive search and are hence restricted to small numbers of individuals, or they take a more heuristic approach and deliver a solution that will probably have high likelihood but is not guaranteed to be optimal. By encoding the pedigree learning problem as an integer linear program we can exploit efficient optimisation algorithms to construct pedigrees guaranteed to have maximal likelihood for the standard situation where we have complete marker data at unlinked loci and segregation of genes from parents to offspring is Mendelian. Previous work demonstrated efficient reconstruction of pedigrees of up to about 100 individuals. The modified method that we present here is not so restricted: we demonstrate its applicability with simulated data on a real human pedigree structure of over 1600 individuals. It also compares well with a very competitive approximate approach in terms of solving time and accuracy. In addition to identifying a maximum likelihood pedigree, we can obtain any number of pedigrees in decreasing order of likelihood. This is useful for assessing the uncertainty of a maximum likelihood solution and permits model averaging over high likelihood pedigrees when this would be appropriate. More importantly, when the solution is not unique, as will often be the case for large pedigrees, it enables investigation into the properties of maximum likelihood pedigree estimates which has not been possible up to now. Crucially, we also have a means of assessing the behaviour of other approximate approaches which all aim to find a maximum likelihood solution. Our approach hence allows us to properly address the question of whether a reasonably high likelihood solution that is easy to obtain is practically as

Properties of maximum likelihood male fertility estimation in plant populations.

PubMed Central

Morgan, M T

1998-01-01

Computer simulations are used to evaluate maximum likelihood methods for inferring male fertility in plant populations. The maximum likelihood method can provide substantial power to characterize male fertilities at the population level. Results emphasize, however, the importance of adequate experimental design and evaluation of fertility estimates, as well as limitations to inference (e.g., about the variance in male fertility or the correlation between fertility and phenotypic trait value) that can be reasonably drawn. PMID:9611217

A notion of graph likelihood and an infinite monkey theorem

NASA Astrophysics Data System (ADS)

Banerji, Christopher R. S.; Mansour, Toufik; Severini, Simone

2014-01-01

We play with a graph-theoretic analogue of the folklore infinite monkey theorem. We define a notion of graph likelihood as the probability that a given graph is constructed by a monkey in a number of time steps equal to the number of vertices. We present an algorithm to compute this graph invariant and closed formulas for some infinite classes. We have to leave the computational complexity of the likelihood as an open problem.
Time series electromagnetic induction-log datasets, including logs collected through the 2014 water year in south Florida

USGS Publications Warehouse

Prinos, Scott T.; Valderrama, Robert

2016-01-01

Time series electromagnetic-induction log (TSEMIL) datasets are collected from PVC cased or uncased monitoring wells to evaluate changes in water conductivity over time. TSEMIL datasets consist of a series of individual electromagnetic-induction logs collected at a frequency of months or years that have been compiled into a dataset by eliminating small uniform offsets in bulk conductivity between logs likely caused by minor variations in calibration. At depths where water conductivity is not changing through time, each log is typically within about ±15 mS/m of the median of the dataset at any given depth, which is within the stated repeatability specifications of the probe (Mount Sopris Instrument Co., Inc., 2002). Though the offsets between logs are small (±15 mS/m), they obscure the ability to identify small but real changes in bulk conductivity resulting from changes in aquifer salinity. To create TSEMIL datasets, these offsets are removed by selecting a depth at which no changes are apparent from year to year, and by adjusting individual logs to the median of all logs at the selected depth. Generally, these depths are within the freshwater saturated part of the aquifer, well below the water table. Once the offsets have been removed there is generally only about ±2 to 3 mS/m of completely irregular variation between successive logs that cannot be removed. Even if perfect numerical alignment is achieved at one or two depths, the ±2 to 3 mS/m of random variation remains at other depths. Given these corrections, however, changes from year to year caused by saltwater intrusion are easier to identify. Detailed descriptions of how these corrections are applied are described in Prinos and others (2014) and Prinos and Valderrama (2015). TSEMIL datasets can be used to monitor changes in water conductivity throughout the full thickness of an aquifer, without the need for long open-interval wells, which have allowed, in some instances, vertical water flow within the
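The offset-removal step described in this entry reduces to a small array operation, sketched below with hypothetical bulk-conductivity profiles rather than the published south Florida logs.

# Minimal sketch of the offset removal described above: pick a reference depth where
# conductivity should be stable across years and shift each log so it matches the
# across-log median at that depth. Values are hypothetical (mS/m).
import numpy as np

depths = np.array([10.0, 20.0, 30.0, 40.0])   # depth below land surface, meters
logs = np.array([                              # one row per annual induction log
    [102.0, 150.0, 300.0, 520.0],
    [110.0, 158.0, 309.0, 560.0],
    [ 95.0, 143.0, 293.0, 600.0],
])

ref_index = 1                                  # depth chosen where values should not change
reference = np.median(logs[:, ref_index])
aligned = logs - (logs[:, ref_index] - reference)[:, None]

print(aligned[:, ref_index])   # all logs now agree at the reference depth
print(aligned[:, -1])          # remaining change at depth reflects real salinity trends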
Parametric likelihood inference for interval censored competing risks data.

PubMed

Hudgens, Michael G; Li, Chenxi; Fine, Jason P

2014-03-01

Parametric estimation of the cumulative incidence function (CIF) is considered for competing risks data subject to interval censoring. Existing parametric models of the CIF for right censored competing risks data are adapted to the general case of interval censoring. Maximum likelihood estimators for the CIF are considered under the assumed models, extending earlier work on nonparametric estimation. A simple naive likelihood estimator is also considered that utilizes only part of the observed data. The naive estimator enables separate estimation of models for each cause, unlike full maximum likelihood in which all models are fit simultaneously. The naive likelihood is shown to be valid under mixed case interval censoring, but not under an independent inspection process model, in contrast with full maximum likelihood which is valid under both interval censoring models. In simulations, the naive estimator is shown to perform well and yield comparable efficiency to the full likelihood estimator in some settings. The methods are applied to data from a large, recent randomized clinical trial for the prevention of mother-to-child transmission of HIV.

Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

ERIC Educational Resources Information Center

Atar, Burcu; Kamata, Akihito

2011-01-01

The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…

The Relative Performance of Full Information Maximum Likelihood Estimation for Missing Data in Structural Equation Models.

ERIC Educational Resources Information Center

Enders, Craig K.; Bandalos, Deborah L.

2001-01-01

Used Monte Carlo simulation to examine the performance of four missing data methods in structural equation models: (1) full information maximum likelihood (FIML); (2) listwise deletion; (3) pairwise deletion; and (4) similar response pattern imputation. Results show that FIML estimation is superior across all conditions of the design. (SLD)
Evaluation of historical dry well surveillance logs

SciTech Connect

Price, R.K.

1996-09-09

Several dry well surveillance logs from 1975 through 1995 for the SX Tank Farm have been examined to identify potential subsurface zones of radioactive contaminant migration. Several dynamic conditions of the gamma-ray emitting radioactive contaminants have been identified.

Expansion of industrial logging in Central Africa.

PubMed

Laporte, Nadine T; Stabach, Jared A; Grosch, Robert; Lin, Tiffany S; Goetz, Scott J

2007-06-08

Industrial logging has become the most extensive land use in Central Africa, with more than 600,000 square kilometers (30%) of forest currently under concession. With use of a time series of satellite imagery for the period from 1976 to 2003, we measured 51,916 kilometers of new logging roads. The density of roads across the forested region was 0.03 kilometer per square kilometer, but areas of Gabon and Equatorial Guinea had values over 0.09 kilometer per square kilometer. A new frontier of logging expansion was identified within the Democratic Republic of Congo, which contains 63% of the remaining forest of the region. Tree felling and skid trails increased disturbance in selectively logged areas.

Logging-while-coring method and apparatus

DOEpatents

Goldberg, David S.; Myers, Gregory J.

2007-11-13

A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time saving and unique scientific advantages.
Optimal message log reclamation for uncoordinated checkpointing

NASA Technical Reports Server (NTRS)

Wang, Yi-Min; Fuchs, W. K.

1994-01-01

Uncoordinated checkpointing for message-passing systems allows maximum process autonomy and general nondeterministic execution, but suffers from potential domino effect and the large space overhead for maintaining checkpoints and message logs. Traditionally, it has been assumed that only obsolete checkpoints and message logs before the global recovery line can be garbage-collected. Recently, an approach to identifying all garbage checkpoints based on recovery line transformation and decomposition has been developed. We show in this paper that the same approach can be applied to the problem of identifying all garbage message logs for systems requiring message logging to record in-transit messages. Communication trace-driven simulation for several parallel programs is used to evaluate the proposed algorithm.

Logging-while-coring method and apparatus

DOEpatents

Goldberg, David S.; Myers, Gregory J.

2007-01-30

A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time saving and unique scientific advantages.
32 CFR 700.845 - Maintenance of logs.

Code of Federal Regulations, 2012 CFR

2012-07-01

... 32 National Defense 5 2012-07-01 2012-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

29 CFR 42.7 - Complaint/directed action logs.

Code of Federal Regulations, 2012 CFR

2012-07-01

... 29 Labor 1 2012-07-01 2012-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...
29 CFR 42.7 - Complaint/directed action logs.

Code of Federal Regulations, 2014 CFR

2014-07-01

... 29 Labor 1 2014-07-01 2013-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

32 CFR 700.845 - Maintenance of logs.

Code of Federal Regulations, 2011 CFR

2011-07-01

... 32 National Defense 5 2011-07-01 2011-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...
32 CFR 700.845 - Maintenance of logs.

Code of Federal Regulations, 2013 CFR

2013-07-01

... 32 National Defense 5 2013-07-01 2013-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

29 CFR 42.7 - Complaint/directed action logs.

Code of Federal Regulations, 2013 CFR

2013-07-01

... 29 Labor 1 2013-07-01 2013-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

29 CFR 42.7 - Complaint/directed action logs.

Code of Federal Regulations, 2011 CFR

2011-07-01

... 29 Labor 1 2011-07-01 2011-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...
Web Logs in the English Classroom: More Than Just Chat.

ERIC Educational Resources Information Center

Richardson, Will

2003-01-01

Details the use and appeal of Web logs to enhance classroom discussion and allow for outside involvement in the classroom. Defines a Web log, addresses discussing literature in a Web log, and describes the author's first attempts at using Web-log technology. Presents considerations for using Web logs as part of classroom instruction. (SG)

29 CFR 42.7 - Complaint/directed action logs.

Code of Federal Regulations, 2010 CFR

2010-07-01

... 29 Labor 1 2010-07-01 2010-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

32 CFR 700.845 - Maintenance of logs.

Code of Federal Regulations, 2014 CFR

2014-07-01

... 32 National Defense 5 2014-07-01 2014-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...
32 CFR 700.845 - Maintenance of logs.

Code of Federal Regulations, 2010 CFR

2010-07-01

... 32 National Defense 5 2010-07-01 2010-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

Conversation Threads Hidden within Email Server Logs

NASA Astrophysics Data System (ADS)

Palus, Sebastian; Kazienko, Przemysław

Email server logs contain records of all email exchange through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from those email logs. Unfortunately, each mail is a separate record, and those records are not tied to each other in any obvious way. In this paper a method for discussion thread extraction is proposed, together with experiments on two different data sets - Enron and WrUT.
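One common heuristic for recovering threads from such logs is to follow In-Reply-To headers and fall back to a normalized subject line. The sketch below illustrates that generic heuristic on invented messages; it is not the method evaluated in this chapter.

# Minimal sketch of a thread-grouping heuristic for a mail log: link messages by
# In-Reply-To, fall back to normalized subject. Example messages are invented.
import re
from collections import defaultdict

messages = [
    {"id": "<a1>", "subject": "Budget",         "in_reply_to": None},
    {"id": "<a2>", "subject": "Re: Budget",     "in_reply_to": "<a1>"},
    {"id": "<b1>", "subject": "Server outage",  "in_reply_to": None},
    {"id": "<a3>", "subject": "RE: Re: Budget", "in_reply_to": None},  # header lost, subject still matches
]

def normalize(subject):
    # Strip any leading chain of "Re:" / "Fwd:" prefixes and lowercase the rest.
    return re.sub(r"^(\s*(re|fwd?)\s*:\s*)+", "", subject, flags=re.I).strip().lower()

thread_of = {}                 # message id -> thread key
threads = defaultdict(list)
for msg in messages:
    parent = msg["in_reply_to"]
    key = thread_of.get(parent) if parent else None
    if key is None:
        key = normalize(msg["subject"])
    thread_of[msg["id"]] = key
    threads[key].append(msg["id"])

print(dict(threads))           # {'budget': ['<a1>', '<a2>', '<a3>'], 'server outage': ['<b1>']}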
Computer analysis of digital well logs

USGS Publications Warehouse

Scott, James H.

1984-01-01

A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

Carbon emissions performance of commercial logging in East Kalimantan, Indonesia.

PubMed

Griscom, Bronson; Ellis, Peter; Putz, Francis E

2014-03-01

Adoption of reduced-impact logging (RIL) methods could reduce CO2 emissions by 30-50% across at least 20% of remaining tropical forests. We developed two cost effective and robust indices for comparing the climate benefits (reduced CO2 emissions) due to RIL. The indices correct for variability in the volume of commercial timber among concessions. We determined that a correction for variability in terrain slope was not needed. We found that concessions certified by the Forest Stewardship Council (FSC, N = 3), when compared with noncertified concessions (N = 6), did not have lower overall CO2 emissions from logging activity (felling, skidding, and hauling). On the other hand, FSC certified concessions did have lower emissions from one type of logging impact (skidding), and we found evidence of a range of improved practices using other field metrics. One explanation of these results may be that FSC criteria and indicators, and associated RIL practices, were not designed to achieve overall emissions reductions. Also, commonly used field metrics are not reliable proxies for overall logging emissions performance. Furthermore, the simple distinction between certified and noncertified concessions does not fully represent the complex history of investments in improved logging practices. To clarify the relationship between RIL and emissions reductions, we propose the more explicit term 'RIL-C' to refer to the subset of RIL practices that can be defined by quantified thresholds and that result in measurable emissions reductions. If tropical forest certification is to be linked with CO2 emissions reductions, certification standards need to explicitly require RIL-C practices.
Imagination perspective affects ratings of the likelihood of occurrence of autobiographical memories.

PubMed

Marsh, Benjamin U; Pezdek, Kathy; Lam, Shirley T

2014-07-01

Two experiments tested and confirmed the hypothesis that when the phenomenological characteristics of imagined events are more similar to those of related autobiographical memories, the imagined event is more likely to be considered to have occurred. At Time 1 and 2-weeks later, individuals rated the likelihood of occurrence for 20 life events. In Experiment 1, 1-week after Time 1, individuals imagined 3 childhood events from a first-person or third-person perspective. There was a no-imagination control. An increase in likelihood ratings from Time 1 to Time 2 resulted when imagination was from the third-person but not first-person perspective. In Experiment 2, childhood and recent events were imagined from a third- or first-person perspective. A significant interaction resulted. For childhood events, likelihood change scores were greater for third-person than first-person perspective; for recent adult events, likelihood change scores were greater for first-person than third-person perspective, although this latter trend was not significant.

Integrating borehole logs and aquifer tests in aquifer characterization

USGS Publications Warehouse

Paillet, Frederick L.; Reese, R.S.

2000-01-01

Integration of lithologic logs, geophysical logs, and hydraulic tests is critical in characterizing heterogeneous aquifers. Typically only a limited number of aquifer tests can be performed, and these need to be designed to provide hydraulic properties for the principal aquifers in the system. This study describes the integration of logs and aquifer tests in the development of a hydrostratigraphic model for the surficial aquifer system in and around Big Cypress National Preserve in eastern Collier County, Florida. Borehole flowmeter tests provide qualitative permeability profiles in most of 26 boreholes drilled in the study area. Flow logs indicate the depth of transmissive units, which are correlated across the study area. Comparison to published studies in adjacent areas indicates that the main limestone aquifer of the Tamiami Formation in the study area corresponds with the gray limestone aquifer in western Dade County and the water table and lower Tamiami Aquifer in western Collier County. Four strategically located, multiwell aquifer tests are used to quantify the qualitative permeability profiles provided by the flowmeter log analysis. The hydrostratigraphic model based on these results defines the main aquifer in the central part of the study area as unconfined to semiconfined with a transmissivity as high as 30,000 m2/day. The aquifer decreases in transmissivity to less than 10,000 m2/day in some parts of western Collier County, and becomes confined to the east and northeast of the study area, where transmissivity decreases to below 5000 m2/day.
A Comparative Analysis of ASCII and XML Logging Systems

DTIC Science & Technology

2010-09-01

2.2. EXI vs gzip. 2.3. EXI vs ASN.1 ... can now be placed into a database of its own for searching. The meaning/context of a parcel of information is known since XML's "markup" is self... and technology easier to build and use. XML logging information also becomes extensible, resulting in easier future adaptations. Both of these

5. Log calving barn. Detail of wall corner showing half ...

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

5. Log calving barn. Detail of wall corner showing half dovetail notching on hand-hewn logs. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

55. VIEW OF STEAMOPERATED LOG HOIST TO PUT IN COMING ...

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

55.
VIEW OF STEAM-OPERATED LOG HOIST TO PUT INCOMING LOGS INTO RALPH HULL LUMBER CO. LOG POND. PHOTOGRAPHER: UNKNOWN. DATE: 1942. COURTESY OF RALPH HULL. - Hull-Oakes Lumber Company, 23837 Dawson Road, Monroe, Benton County, OR

Lumber quality of Eucalyptus grandis as a function of diametrical position and log steaming.

PubMed

Severo, Elias Taylor Durgante; Calonego, Fred Willians; de Matos, Carlos Alberto Oliveira

2010-04-01

The objective of this study was to evaluate the effect of log steaming and of the diametrical position of boards on the timber quality of Eucalyptus grandis. Logs with diameters between 20 and 25 cm, between 25 and 30 cm and between 30 and 35 cm were studied. Half of the logs were kept in their original condition, and the other half was steamed at 90 degrees C for 20 h. Later, the logs were cut into flat sawn boards, and defects due to growth stress relief were measured. The results show that: (1) boards from control logs show different magnitudes of cracking according to the diameter of the log and the diametrical position of the board; (2) boards from logs with diameters between 30 and 35 cm and those from next to the pith develop larger cracks; and (3) boards from steamed logs show a reduction in the magnitude of cracking and a homogenous distribution of this defect relative to diametrical position within the log.

Error detection for genetic data, using likelihood methods

SciTech Connect

Ehm, M.G.; Kimmel, M.; Cottingham, R.W. Jr.

1996-01-01

As genetic maps become denser, the effect of laboratory typing errors becomes more serious. We review a general method for detecting errors in pedigree genotyping data that is a variant of the likelihood-ratio test statistic. It pinpoints individuals and loci with relatively unlikely genotypes. Power and significance studies using Monte Carlo methods are shown by using simulated data with pedigree structures similar to the CEPH pedigrees and a larger experimental pedigree used in the study of idiopathic dilated cardiomyopathy (DCM). The studies show the index detects errors for small values of θ with high power and an acceptable false positive rate. The method was also used to check for errors in DCM laboratory pedigree data and to estimate the error rate in CEPH chromosome 6 data. The errors flagged by our method in the DCM pedigree were confirmed by the laboratory. The results are consistent with estimated false-positive and false-negative rates obtained using simulation. 21 refs., 5 figs., 2 tabs.
    21 refs., 5 figs., 2 tabs.

400. Mapping gravitational lensing of the CMB using local likelihoods

    SciTech Connect

    Anderes, Ethan; Knox, Lloyd; Engelen, Alexander van

    2011-02-15

    We present a new estimation method for mapping the gravitational lensing potential from observed CMB intensity and polarization fields. Our method uses Bayesian techniques to estimate the average curvature of the potential over small local regions. These local curvatures are then used to construct an estimate of a low-pass filter of the gravitational potential. By utilizing Bayesian/likelihood methods one can easily overcome problems with missing and/or nonuniform pixels and problems with partial sky observations (E- and B-mode mixing, for example). Moreover, our methods are local in nature, which allows us to easily model spatially varying beams, and are highly parallelizable. We note that our estimates do not rely on the typical Taylor approximation which is used to construct estimates of the gravitational potential by Fourier coupling. We present our methodology with a flat sky simulation under nearly ideal experimental conditions with a noise level of 1 μK-arcmin for the temperature field, √2 μK-arcmin for the polarization fields, and an instrumental beam full width at half maximum (FWHM) of 0.25 arcmin.

401. Detecting Multiple Model Components with the Likelihood Ratio Test

    NASA Astrophysics Data System (ADS)

    Protassov, R. S.; van Dyk, D. A.

    2000-05-01
    The likelihood ratio test (LRT) and F-test, popularized in astrophysics by Bevington (Data Reduction and Error Analysis in the Physical Sciences) and Cash (1979, ApJ 228, 939), do not (even asymptotically) adhere to their nominal χ2 and F distributions in many statistical tests commonly used in astrophysics. The many legitimate uses of the LRT (see, e.g., the examples given in Cash (1979)) notwithstanding, it can be impossible to compute the false positive rate of the LRT or related tests such as the F-test. For example, although Cash (1979) did not suggest the LRT for detecting a line profile in a spectral model, it has become common practice despite the lack of certain required mathematical regularity conditions. Contrary to common practice, the nominal distribution of the LRT statistic should not be used in these situations. In this paper, we characterize an important class of problems where the LRT fails, show the non-standard behavior of the test in this setting, and provide a Bayesian alternative to the LRT, i.e., posterior predictive p-values. We emphasize that there are many legitimate uses of the LRT in astrophysics, and even when the LRT is inappropriate, there remain several statistical alternatives (e.g., judicious use of error bars and Bayes factors). We illustrate this point in our analysis of GRB 970508, which was studied by Piro et al. (1999, ApJ 514, L73-L77).
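Where the usual regularity conditions do hold (nested models, true parameter value away from the boundary of the parameter space), the LRT statistic can legitimately be referred to its nominal χ2 distribution. A minimal sketch of that standard case on simulated Poisson counts follows; the two-rate model, the simulated data, and the use of SciPy are illustrative assumptions, not part of the record above.

    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(0)

    # Simulated Poisson counts: null model = one constant rate, alternative = two rates.
    y = rng.poisson(lam=5.0, size=200)
    half = len(y) // 2

    def negloglik_const(params, data):
        lam = params[0]
        return -np.sum(stats.poisson.logpmf(data, lam))

    # Null: a single rate for all bins.
    null = optimize.minimize(negloglik_const, x0=[1.0], args=(y,), bounds=[(1e-9, None)])
    # Alternative: separate rates for the two halves (one extra free parameter).
    alt1 = optimize.minimize(negloglik_const, x0=[1.0], args=(y[:half],), bounds=[(1e-9, None)])
    alt2 = optimize.minimize(negloglik_const, x0=[1.0], args=(y[half:],), bounds=[(1e-9, None)])

    # LRT statistic: twice the log-likelihood difference between alternative and null.
    lrt = 2.0 * (null.fun - (alt1.fun + alt2.fun))
    p_value = stats.chi2.sf(lrt, df=1)  # nominal chi-square reference, 1 extra parameter
    print(f"LRT = {lrt:.3f}, nominal p = {p_value:.3f}")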
402. Stable isotope analysis of white paints and likelihood ratios.

    PubMed

    Farmer, N; Meier-Augenstein, W; Lucy, D

    2009-06-01

    Architectural paints are commonly found as trace evidence at scenes of crime. Currently the most widely used technique for the analysis of architectural paints is Fourier transform infrared spectroscopy (FTIR). There are, however, limitations to the forensic analysis of white paints and to the ability to discriminate between samples. Isotope ratio mass spectrometry (IRMS) has been investigated as a potential tool for the analysis of architectural white paints, where no preparation of samples prior to analysis is required. When stable isotope profiles (SIPs) are compared, there appears to be no relationship between paints from the same manufacturer, or between paints of the same type. Unlike existing techniques, IRMS does not differentiate resin samples solely on the basis of modifier or oil type, but exploits additional factors linked to samples, such as the geographic location where the oils added to alkyd formulations were grown. In combination with the use of likelihood ratios, IRMS shows potential, with a false positive rate of 2.6% from a total of 1275 comparisons.

403. Well log characterization of natural gas hydrates

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.

    2011-01-01

    In the last 25 years we have seen significant advancements in the use of downhole well logging tools to acquire detailed information on the occurrence of gas hydrate in nature: from an early start of using wireline electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells drilled in Arctic permafrost environments, to today, where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. The most established and well-known use of downhole log data in gas hydrate research is the use of electrical resistivity and acoustic velocity data (both compressional- and shear-wave data) to make estimates of gas hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. New downhole logging tools designed to make directionally oriented acoustic and propagation resistivity log measurements have provided the data needed to analyze the acoustic and electrical anisotropic properties of both highly interbedded and fracture-dominated gas hydrate reservoirs. Advancements in nuclear magnetic resonance (NMR) logging and wireline formation testing have also allowed for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free water along with clay and capillary bound water) in gas-hydrate-bearing reservoirs. Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature along with data on gas hydrate reservoir properties (i.e., permeabilities) needed to accurately predict gas production rates for various gas hydrate
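The resistivity-based saturation estimates mentioned above are commonly made with an Archie-type relation. A minimal sketch of that generic calculation follows; the Archie constants (a, m, n) and the input values are placeholder assumptions, not parameters from the USGS study.

    def archie_water_saturation(rt_ohmm, rw_ohmm, porosity, a=1.0, m=2.0, n=2.0):
        """Archie's equation: Sw = (a * Rw / (phi**m * Rt)) ** (1/n)."""
        sw = (a * rw_ohmm / (porosity ** m * rt_ohmm)) ** (1.0 / n)
        return min(sw, 1.0)  # clip to the physical range

    # Placeholder inputs: formation resistivity (ohm-m), pore-water resistivity, porosity.
    rt, rw, phi = 30.0, 0.3, 0.35
    sw = archie_water_saturation(rt, rw, phi)
    sh = 1.0 - sw  # pore space not filled by water is attributed to gas hydrate
    print(f"Water saturation ~ {sw:.2f}, inferred gas hydrate saturation ~ {sh:.2f}")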
404. Fluid-Rock Characterization and Interactions in NMR Well Logging

    SciTech Connect

    George J. Hirasaki; Kishore K. Mohanty

    2005-09-05

    The objective of this report is to characterize the fluid properties and fluid-rock interactions that are needed for formation evaluation by NMR well logging. The advances made in the understanding of NMR fluid properties are summarized in a chapter written for an AAPG book on NMR well logging. This includes live oils, viscous oils, natural gas mixtures, and the relation between relaxation time and diffusivity. Oil-based drilling fluids can have an adverse effect on NMR well logging if they alter the wettability of the formation. The effect of various surfactants on wettability and surface relaxivity is evaluated for silica sand. The relation between relaxation time and diffusivity distinguishes the response of brine, oil, and gas in an NMR well log. A new NMR pulse sequence in the presence of a field gradient and a new inversion technique enable the T2 and diffusivity distributions to be displayed as a two-dimensional map. The objectives of pore morphology and rock characterization are to identify vug connectivity by using X-ray CT scans, and to improve the NMR permeability correlation. Improved estimation of permeability from NMR response is possible by using estimated tortuosity as a parameter to interpolate between two existing permeability models.
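For context, a classical NMR permeability correlation of the kind such reports build on is the Schlumberger-Doll Research (SDR) form, k ~ C * phi^4 * T2LM^2, where T2LM is the logarithmic mean of the T2 distribution. The sketch below applies that textbook relation; the constant C, the unit convention (k in mD, T2 in ms), and the example distribution are assumptions for illustration, not values from this report.

    import numpy as np

    def sdr_permeability(porosity, t2_ms, amplitudes, c=4.0):
        """Textbook SDR correlation: k [mD] ~ c * phi**4 * T2LM**2 (T2LM in ms, assumed units)."""
        w = np.asarray(amplitudes, dtype=float)
        w = w / w.sum()
        t2_log_mean = np.exp(np.sum(w * np.log(t2_ms)))  # logarithmic mean of the T2 distribution
        return c * porosity**4 * t2_log_mean**2

    # Assumed example distribution: three T2 bins (ms) with relative amplitudes.
    k_md = sdr_permeability(0.25,
                            t2_ms=np.array([10.0, 100.0, 300.0]),
                            amplitudes=np.array([0.2, 0.5, 0.3]))
    print(f"SDR permeability estimate ~ {k_md:.0f} mD")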
405. Efficient Strategies for Calculating Blockwise Likelihoods Under the Coalescent

    PubMed Central

    Lohse, Konrad; Chmelik, Martin; Martin, Simon H.; Barton, Nicholas H.

    2016-01-01

    The inference of demographic history from genome data is hindered by a lack of efficient computational approaches. In particular, it has proved difficult to exploit the information contained in the distribution of genealogies across the genome. We have previously shown that the generating function (GF) of genealogies can be used to analytically compute likelihoods of demographic models from configurations of mutations in short sequence blocks (Lohse et al. 2011). Although the GF has a simple, recursive form, the size of such likelihood calculations explodes quickly with the number of individuals, and applications of this framework have so far been mainly limited to small samples (pairs and triplets) for which the GF can be written by hand. Here we investigate several strategies for exploiting the inherent symmetries of the coalescent. In particular, we show that the GF of genealogies can be decomposed into a set of equivalence classes that allows likelihood calculations from nontrivial samples. Using this strategy, we automated blockwise likelihood calculations for a general set of demographic scenarios in Mathematica. These histories may involve population size changes, continuous migration, discrete divergence, and admixture between multiple populations. To give a concrete example, we calculate the likelihood for a model of isolation with migration (IM), assuming two diploid samples without phase and outgroup information. We demonstrate the new inference scheme with an analysis of two individual butterfly genomes from the sister species Heliconius melpomene rosina and H. cydno. PMID:26715666

406. Constraint likelihood analysis for a network of gravitational wave detectors

    SciTech Connect

    Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.

    2005-12-15

    We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach. In the case of an aligned pair of detectors, the detection statistic depends on the cross correlation between the detectors as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary detector orientations. The method allows reconstruction of the source coordinates and the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method.

407. LogCauchy, log-sech and lognormal distributions of species abundances in forest communities

    USGS Publications Warehouse

    Yin, Z.-Y.; Peng, S.-L.; Ren, H.; Guo, Q.; Chen, Z.-H.

    2005-01-01

    Species-abundance (SA) pattern is one of the most fundamental aspects of biological community structure, providing important information regarding species richness, species-area relations and succession. To better describe the SA distribution (SAD) in a community, based on the widely used lognormal (LN) distribution model with an exp(-x^2) roll-off on Preston's octave scale, this study proposed two additional models, logCauchy (LC) and log-sech (LS), with roll-offs of x^-2 and e^-x, respectively. The estimation of the theoretical total number of species in the whole community, S*, including very rare species not yet collected in the sample, was derived from the left-truncation of each distribution. We fitted these three models by Levenberg-Marquardt nonlinear regression and measured the model fit to the data using the coefficient of determination of the regression, t-tests of the parameters, and the distribution's Kolmogorov-Smirnov (KS) test. Examining the SA data from six forest communities (five in the lower subtropics and one in the tropics), we found that: (1) on a log scale, all three models, which are bell-shaped and left-truncated, statistically adequately fitted the observed SADs, and the LC and LS did better than the LN; (2) for each model and each community the S* values estimated by the integral and summation methods were almost equal, allowing us to estimate S* using a simple integral formula and to estimate its asymptotic confidence intervals by regression of a transformed model containing it; (3) following the order of LC, LS, and LN, the fitted distributions became lower in the peak, less concave in the sides, and shorter in the tail, and overall the LC tended to overestimate, the LN tended to underestimate, while the LS was intermediate but slightly tended to underestimate, the observed SADs (particularly the number of common species in the right tail); (4) the six communities had some similar structural properties such as following similar distribution models, having a common
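As a rough illustration of the kind of fit described above, the sketch below fits a bell-shaped curve to octave-binned abundance counts with SciPy's Levenberg-Marquardt routine; the synthetic counts, the lognormal-style roll-off, and the starting values are assumptions for illustration, not the paper's data or its exact truncated models.

    import numpy as np
    from scipy.optimize import curve_fit

    # Octave (Preston) classes R and number of species observed in each class (synthetic).
    R = np.arange(0, 10)
    species = np.array([18, 30, 42, 48, 44, 33, 21, 11, 5, 2], dtype=float)

    def lognormal_octave(R, s0, r0, sigma):
        """Bell-shaped SAD on the octave scale with an exp(-x^2) roll-off (LN form)."""
        return s0 * np.exp(-((R - r0) ** 2) / (2.0 * sigma ** 2))

    params, cov = curve_fit(lognormal_octave, R, species, p0=[40.0, 3.0, 2.0])
    s0, r0, sigma = params
    # Theoretical total richness S*: area under the fitted (untruncated) curve.
    s_star = s0 * sigma * np.sqrt(2.0 * np.pi)
    print(f"fitted S0={s0:.1f}, R0={r0:.1f}, sigma={sigma:.1f}, S* ~ {s_star:.0f} species")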
408. 16 CFR Figure 9 to Subpart A of... - Flux Profile Data Log Format

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Flux Profile Data Log Format 9 Figure 9 to Subpart A of Part 1209 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT... to Subpart A of Part 1209—Flux Profile Data Log Format EC03OC91.039...

409. 16 CFR Figure 9 to Subpart A of... - Flux Profile Data Log Format

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Flux Profile Data Log Format 9 Figure 9 to Subpart A of Part 1209 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT... to Subpart A of Part 1209—Flux Profile Data Log Format EC03OC91.039...
410. 16 CFR Figure 9 to Subpart A of... - Flux Profile Data Log Format

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Flux Profile Data Log Format 9 Figure 9 to Subpart A of Part 1209 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT... to Subpart A of Part 1209—Flux Profile Data Log Format EC03OC91.039...

411. 16 CFR Figure 9 to Subpart A of... - Flux Profile Data Log Format

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Flux Profile Data Log Format 9 Figure 9 to Subpart A of Part 1209 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT... to Subpart A of Part 1209—Flux Profile Data Log Format EC03OC91.039...

412. 16 CFR Figure 9 to Subpart A of... - Flux Profile Data Log Format

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Flux Profile Data Log Format 9 Figure 9 to Subpart A of Part 1209 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT... to Subpart A of Part 1209—Flux Profile Data Log Format EC03OC91.039...

413. The Likelihood of Recent Record Warmth

    NASA Astrophysics Data System (ADS)

    Mann, Michael E.; Rahmstorf, Stefan; Steinman, Byron A.; Tingley, Martin; Miller, Sonya K.

    2016-01-01

    2014 was nominally the warmest year on record for both the globe and the northern hemisphere, based on historical records spanning the past one and a half centuries. It was the latest in a recent run of record temperatures spanning the past decade and a half. Press accounts reported odds as low as one in 650 million that the observed run of global temperature records would be expected to occur in the absence of human-caused global warming. Press reports notwithstanding, the question of how likely observed temperature records may have been, both with and without human influence, is interesting in its own right. Here we attempt to address that question using a semi-empirical approach that combines the latest (CMIP5) climate model simulations with observations of global and hemispheric mean temperature. We find that individual record years and the observed runs of record-setting temperatures were extremely unlikely to have occurred in the absence of human-caused climate change, though not nearly as unlikely as press reports have suggested. These same record temperatures were, by contrast, quite likely to have occurred in the presence of anthropogenic climate forcing.
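For intuition about such probability statements, the sketch below evaluates one naive null exactly: under an exchangeable (no-trend, no-serial-correlation) null, the chance that the k warmest years all fall in the most recent m years of an n-year record is C(m, k)/C(n, k). The record length, window, and count below are assumed values, and this simple null is precisely the kind of calculation the record above argues overstates how unlikely the observed runs are.

    from math import comb

    n_years = 150   # length of the instrumental record (assumed)
    recent = 15     # size of the "recent" window (assumed)
    k_top = 10      # how many of the warmest years we require inside that window

    # Under exchangeability, every placement of the k warmest years among the n years
    # is equally likely, so P(all k fall in the most recent `recent` years) is a ratio
    # of binomial coefficients.
    p_null = comb(recent, k_top) / comb(n_years, k_top)
    print(f"P(top {k_top} years all within the last {recent}) under this null: {p_null:.3e}")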
414. The Atacama Cosmology Telescope: Likelihood for Small-Scale CMB Data

    NASA Technical Reports Server (NTRS)

    Dunkley, J.; Calabrese, E.; Sievers, J.; Addison, G. E.; Battaglia, N.; Battistelli, E. S.; Bond, J. R.; Das, S.; Devlin, M. J.; Dunner, R.; Fowler, J. W.; Gralla, M.; Hajian, A.; Halpern, M.; Hasselfield, M.; Hincks, A. D.; Hlozek, R.; Hughes, J. P.; Irwin, K. D.; Kosowsky, A.; Louis, T.; Marriage, T. A.; Marsden, D.; Menanteau, F.; Niemack, M.

    2013-01-01

    The Atacama Cosmology Telescope has measured the angular power spectra of microwave fluctuations to arcminute scales at frequencies of 148 and 218 GHz, from three seasons of data. At small scales the fluctuations in the primordial Cosmic Microwave Background (CMB) become increasingly obscured by extragalactic foregrounds and secondary CMB signals. We present results from a nine-parameter model describing these secondary effects, including the thermal and kinematic Sunyaev-Zel'dovich (tSZ and kSZ) power; the clustered and Poisson-like power from Cosmic Infrared Background (CIB) sources, and their frequency scaling; the tSZ-CIB correlation coefficient; the extragalactic radio source power; and thermal dust emission from Galactic cirrus in two different regions of the sky. In order to extract cosmological parameters, we describe a likelihood function for the ACT data, fitting this model to the multi-frequency spectra in the multipole range 500 < l < 10000. We extend the likelihood to include spectra from the South Pole Telescope at frequencies of 95, 150, and 220 GHz. Accounting for different radio source levels and Galactic cirrus emission, the same model provides an excellent fit to both datasets simultaneously, with χ2/dof = 675/697 for ACT, and 96/107 for SPT. We then use the multi-frequency likelihood to estimate the CMB power spectrum from ACT in bandpowers, marginalizing over the secondary parameters. This provides a simplified 'CMB-only' likelihood in the range 500 < l < 3500 for use in cosmological parameter estimation.
415. Genetic testing likelihood: the impact of abortion views and quality of life information on women's decisions.

    PubMed

    Wilson, Jessica L; Ferguson, Gail M; Thorn, Judith M

    2011-04-01

    Little is known about the factors predicting the likelihood of choosing genetic testing in college-aged women versus older women, including knowledge of the quality of life (QOL) associated with a disorder. Using vignettes with female college students (Experiment 1: n=257, mean age=19.70 yrs) and female faculty/staff/alumni (Experiment 2: n (nulliparous)=83, mean age=30.20 yrs; n (mothers)=53, mean age=33.77 yrs), we examined the contribution of multiple factors to predicting genetic testing likelihood for cystic fibrosis. We investigated malleable situational factors (style of genetic risk presentation and providing QOL information, including physical and social aspects) and stable dispositional factors (abortion views). Parity (i.e., prior births) was more influential in women's genetic testing likelihood than was age. Greater acceptability of abortion for oneself and self-assessed knowledge following QOL information were predictors of higher testing likelihood for college students. Greater acceptability of abortion for another person was a predictor for nulliparous women. Abortion views moderated the effect of predictors for nulliparous women and mothers. Findings encourage genetic counselors to utilize QOL information to promote informed decision making through genetic testing.

416. The Atacama Cosmology Telescope: likelihood for small-scale CMB data

    SciTech Connect

    Dunkley, J.; Calabrese, E.; Sievers, J.; Addison, G.E.; Halpern, M.; Battaglia, N.; Battistelli, E.S.; Bond, J.R.; Hajian, A.; Hincks, A.D.; Das, S.; Devlin, M.J.; Dünner, R.; Fowler, J.W.; Irwin, K.D.; Gralla, M.; Hasselfield, M.; Hlozek, R.; Hughes, J.P.; Kosowsky, A.; and others

    2013-07-01

    The Atacama Cosmology Telescope has measured the angular power spectra of microwave fluctuations to arcminute scales at frequencies of 148 and 218 GHz, from three seasons of data. At small scales the fluctuations in the primordial Cosmic Microwave Background (CMB) become increasingly obscured by extragalactic foregrounds and secondary CMB signals.
    We present results from a nine-parameter model describing these secondary effects, including the thermal and kinematic Sunyaev-Zel'dovich (tSZ and kSZ) power; the clustered and Poisson-like power from Cosmic Infrared Background (CIB) sources, and their frequency scaling; the tSZ-CIB correlation coefficient; the extragalactic radio source power; and thermal dust emission from Galactic cirrus in two different regions of the sky. In order to extract cosmological parameters, we describe a likelihood function for the ACT data, fitting this model to the multi-frequency spectra in the multipole range 500 < l < 10000. We extend the likelihood to include spectra from the South Pole Telescope at frequencies of 95, 150, and 220 GHz. Accounting for different radio source levels and Galactic cirrus emission, the same model provides an excellent fit to both datasets simultaneously, with χ2/dof = 675/697 for ACT, and 96/107 for SPT. We then use the multi-frequency likelihood to estimate the CMB power spectrum from ACT in bandpowers, marginalizing over the secondary parameters. This provides a simplified 'CMB-only' likelihood in the range 500 < l < 3500 for use in cosmological parameter estimation.

417. Likelihood maximization for list-mode emission tomographic image reconstruction.

    PubMed

    Byrne, C

    2001-10-01

    The maximum a posteriori (MAP) Bayesian iterative algorithm using priors that are gamma distributed, due to Lange, Bahn, and Little, is extended to include parameter choices that fall outside the gamma distribution model. Special cases of the resulting iterative method include the expectation maximization maximum likelihood (EMML) method based on the Poisson model in emission tomography, as well as algorithms obtained by Parra and Barrett and by Huesman et al. that converge to maximum likelihood and maximum conditional likelihood estimates of radionuclide intensities for list-mode emission tomography. The approach taken here is optimization-theoretic and does not rely on the usual expectation maximization (EM) formalism. Block-iterative variants of the algorithms are presented. A self-contained, elementary proof of convergence of the algorithm is included.
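For readers unfamiliar with the EMML iteration mentioned above, the sketch below shows the classical multiplicative update for binned Poisson emission data; the tiny system matrix and counts are made-up values, and this is the textbook EMML/MLEM update rather than the generalized or list-mode algorithms of the record.

    import numpy as np

    # Toy system matrix A (detector bins x image voxels) and measured counts y (assumed values).
    A = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8],
                  [0.3, 0.4, 0.3]])
    y = np.array([90.0, 40.0, 65.0, 70.0])

    lam = np.ones(A.shape[1])          # nonnegative initial intensity estimate
    sensitivity = A.sum(axis=0)        # column sums, the usual normalization term

    for _ in range(200):
        expected = A @ lam             # predicted counts under the current estimate
        ratio = y / np.maximum(expected, 1e-12)
        lam = lam * (A.T @ ratio) / sensitivity   # EMML multiplicative update

    print("estimated intensities:", np.round(lam, 2))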
418. Function Based Risk Assessment: Mapping Function to Likelihood

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Grantham, Katie; Stone, Robert

    2005-01-01

    The concept of function offers a high potential for thinking and reasoning about designs as well as providing a common thread for relating together other design information. This paper focuses specifically on the relation between function and risk by presenting a mathematical mapping from product function to risk likelihood. This risk information is composed of design parameters, failure modes, and likelihood values. A spacecraft orientation subsystem, a subsystem used to guide science instruments, and a Bell 206 rotorcraft are used to test the mapping, which continues research relating function to failure on these products. Finally, a case study is presented in which the risk element likelihood is calculated for a fuel cell that is in the conceptual design phase at NASA JPL.

419. Non-Concave Penalized Likelihood with NP-Dimensionality

    PubMed Central

    Fan, Jianqing; Lv, Jinchi

    2011-01-01

    Penalized likelihood methods are fundamental to ultra-high dimensional variable selection. How high a dimensionality such methods can handle remains largely unknown. In this paper, we show that in the context of generalized linear models, such methods possess model selection consistency with oracle properties even for dimensionality of non-polynomial (NP) order of sample size, for a class of penalized likelihood approaches using folded-concave penalty functions, which were introduced to ameliorate the bias problems of convex penalty functions. This fills a long-standing gap in the literature, where the dimensionality is typically allowed to grow only slowly with the sample size. Our results are also applicable to penalized likelihood with the L1-penalty, which is a convex function at the boundary of the class of folded-concave penalty functions under consideration. The coordinate optimization is implemented for finding the solution paths, whose performance is evaluated by a few simulation examples and a real data analysis. PMID:22287795
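As a concrete, much simpler instance of penalized likelihood, the sketch below fits an L1-penalized logistic regression by proximal gradient descent (the convex boundary case mentioned in the record); the simulated data, step size, and penalty level are assumptions, and folded-concave penalties such as SCAD are not implemented here.

    import numpy as np

    rng = np.random.default_rng(2)
    n, p = 200, 50
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:3] = [2.0, -1.5, 1.0]                  # sparse truth
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

    lam, step = 0.05, 0.01
    beta = np.zeros(p)

    def soft_threshold(v, t):
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    for _ in range(2000):
        mu = 1.0 / (1.0 + np.exp(-X @ beta))          # fitted probabilities
        grad = X.T @ (mu - y) / n                     # gradient of the average negative log-likelihood
        beta = soft_threshold(beta - step * grad, step * lam)   # proximal (ISTA) update

    print("nonzero coefficients:", np.flatnonzero(beta))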
420. Measuring coherence of computer-assisted likelihood ratio methods.

    PubMed

    Haraksim, Rudolf; Ramos, Daniel; Meuwly, Didier; Berger, Charles E H

    2015-04-01

    Measuring the performance of forensic evaluation methods that compute likelihood ratios (LRs) is relevant for both the development and the validation of such methods. A framework of performance characteristics categorized as primary and secondary is introduced in this study to help achieve such development and validation. Ground-truth labelled fingerprint data are used to assess the performance of an example likelihood ratio method in terms of those performance characteristics. Discrimination, calibration, and especially the coherence of this LR method are assessed as a function of the quantity and quality of the trace fingerprint specimen. Assessment of the coherence revealed a weakness of the comparison algorithm in the computer-assisted likelihood ratio method used.
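Discrimination and calibration of LR methods are often summarized in this literature with the log-likelihood-ratio cost (Cllr). The sketch below computes Cllr from two sets of LR values; the values are made up, and Cllr is offered here as a standard metric, not as the specific performance framework of the record above.

    import numpy as np

    def cllr(lr_same_source, lr_diff_source):
        """Log-likelihood-ratio cost: lower is better, 1.0 is roughly an uninformative system."""
        lr_ss = np.asarray(lr_same_source, dtype=float)
        lr_ds = np.asarray(lr_diff_source, dtype=float)
        term_ss = np.mean(np.log2(1.0 + 1.0 / lr_ss))   # penalizes small LRs for same-source pairs
        term_ds = np.mean(np.log2(1.0 + lr_ds))         # penalizes large LRs for different-source pairs
        return 0.5 * (term_ss + term_ds)

    # Made-up LR values from a hypothetical comparison method.
    same_source = [30.0, 120.0, 8.0, 500.0, 2.5]
    diff_source = [0.02, 0.3, 0.001, 0.6, 0.05]
    print(f"Cllr = {cllr(same_source, diff_source):.3f}")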
421. Likelihoods of multilocus DNA fingerprints in extended families.

    PubMed Central

    Bockel, B; Nürnberg, P; Krawczak, M

    1992-01-01

    A concept for the application of complex pedigree analysis to multilocus DNA fingerprinting is described. By following this approach, the extent to which the DNA fingerprints of grandparents influence the phenotype likelihoods of their offspring was determined. It was demonstrated by simulation that approximately 90% of paternity disputes can be solved if mother, child, and paternal grandparents, instead of the putative father, are tested. If only phenotype information on a single paternal sib is allowed for, true paternity will be detected with reasonable persuasive power in up to 64% of cases. Exclusion of false paternity remains possible for 40% of cases. Finally, the analysis concept is modified by reducing the number of genotype variations considered in likelihood computations. This time-saving procedure is shown to yield sufficiently accurate likelihoods in the analysis of both simulation data and multilocus DNA fingerprints obtained in two large families. PMID:1496987

422. Nuclear Well Log Properties of Natural Gas Hydrate Reservoirs

    NASA Astrophysics Data System (ADS)

    Burchwell, A.; Cook, A.

    2015-12-01

    Characterizing gas hydrate in a reservoir typically involves a full suite of geophysical well logs. The most common method involves using resistivity measurements to quantify the decrease in electrically conductive water when replaced with gas hydrate. Compressional velocity measurements are also used because the gas hydrate significantly strengthens the moduli of the sediment. At many gas hydrate sites, nuclear well logs, which include the photoelectric effect, formation sigma, carbon/oxygen ratio, and neutron porosity, are also collected but often not used. In fact, the nuclear response of a gas hydrate reservoir is not well known. In this research we focus on the nuclear log response in gas hydrate reservoirs at the Mallik Field in the Mackenzie Delta, Northwest Territories, Canada, and the Gas Hydrate Joint Industry Project Leg 2 sites in the northern Gulf of Mexico. Nuclear logs may add increased robustness to the investigation into the properties of gas hydrates, and some types of logs may offer an opportunity to distinguish between gas hydrate and permafrost. For example, a true formation sigma log measures the thermal neutron capture cross section of a formation and pore constituents; it is especially sensitive to hydrogen and chlorine in the pore space. Chlorine has a high absorption potential and is used to determine the amount of saline water within pore spaces. Gas hydrate offers a difference in elemental composition compared to water-saturated intervals. Thus, in permafrost areas, the carbon/oxygen ratio may vary between gas hydrate and permafrost, due to the increase of carbon in gas hydrate accumulations. At the Mallik site, we observe a hydrate-bearing sand (1085-1107 m) above a water-bearing sand (1107-1140 m), which was confirmed through core samples and mud gas analysis.
    We observe a decrease in the photoelectric absorption of ~0.5 barns/e-, as well as an increase in the formation sigma readings of ~5 capture units in the water-bearing sand as

423. Carbon emissions from tropical forest degradation caused by logging

    NASA Astrophysics Data System (ADS)

    Pearson, Timothy R. H.; Brown, Sandra; Casarim, Felipe M.

    2014-03-01

    The focus of land-use related efforts in developing countries to reduce carbon emissions has been on slowing deforestation, yet international agreements are to reduce emissions from both deforestation and forest degradation (REDD). The second 'D' is poorly understood and poorly accounted for, for a number of technical and policy reasons. Here we introduce a complete accounting method for estimating emission factors from selective timber harvesting, a substantial form of forest degradation in many tropical developing countries. The method accounts separately for emissions from the extracted log, from incidental damage to the surrounding forest, and from logging infrastructure, and emissions are expressed as units of carbon per cubic meter of timber extracted to allow for simple application to timber harvesting statistics. We applied the method in six tropical countries (Belize, Bolivia, Brazil, Guyana, Indonesia, and Republic of Congo), resulting in total emission factors of 0.99-2.33 Mg C per cubic meter. In all cases, emissions were dominated by damage to surrounding vegetation and the infrastructure rather than by the logs themselves, and total emissions represented about 3-15% of the biomass carbon stocks of the associated unlogged forests. We then combined the emission factors with country-level logging statistics for nine key timber-producing countries represented by our study areas to gain an understanding of the order of magnitude of emissions from degradation compared to those recently reported for deforestation in the same countries. For the nine countries included, emissions from logging were on average equivalent to about 12% of those from deforestation. For those countries with relatively low emissions from deforestation, emissions from logging were equivalent to half or more of those from deforestation, whereas for those countries with the highest emissions from deforestation, emissions from logging were equivalent to <10% of those from deforestation.
    Understanding how

424. Growing optimal scale-free networks via likelihood.

    PubMed

    Small, Michael; Li, Yingying; Stemler, Thomas; Judd, Kevin

    2015-04-01

    Preferential attachment, by which new nodes attach to existing nodes with probability proportional to the existing nodes' degree, has become the standard growth model for scale-free networks, where the asymptotic probability of a node having degree k is proportional to k^-γ. However, the motivation for this model is entirely ad hoc. We use exact likelihood arguments and show that the optimal way to build a scale-free network is to attach most new links to nodes of low degree. Curiously, this leads to a scale-free network with a single dominant hub: a starlike structure we call a superstar network. Asymptotically, the optimal strategy is to attach each new node to one of the nodes of degree k with probability proportional to 1/N + ζ(γ)(k+1)^γ (in an N-node network): a stronger bias toward high-degree nodes than exhibited by standard preferential attachment. Our algorithm generates optimally scale-free networks (the superstar networks) as well as randomly sampling the space of all scale-free networks with a given degree exponent γ. We generate viable realizations with finite N for 1 < γ < 2 as well as γ > 2. We observe an apparently discontinuous transition at γ ≈ 2 between so-called superstar networks and more treelike realizations. Gradually increasing γ further leads to reemergence of a superstar hub. To quantify these structural features, we derive a new analytic expression for the expected degree exponent of a pure preferential attachment process and introduce alternative measures of network entropy. Our approach is generic and can also be applied to an arbitrary degree distribution.
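For comparison with the likelihood-optimal construction described above, the sketch below grows a network by standard linear preferential attachment, the baseline the record argues against; it is not the superstar algorithm, and the network size and random seed are arbitrary assumptions.

    import random
    from collections import Counter

    random.seed(3)
    N = 5000

    # Standard preferential attachment: each new node links to one existing node, chosen
    # with probability proportional to its current degree. Keeping a flat list of edge
    # endpoints makes degree-proportional sampling a uniform draw from that list.
    endpoints = [0, 1]          # start from a single edge between nodes 0 and 1
    degree = Counter({0: 1, 1: 1})

    for new_node in range(2, N):
        target = random.choice(endpoints)       # degree-biased choice of attachment target
        endpoints.extend([new_node, target])
        degree[new_node] += 1
        degree[target] += 1

    hub_node, hub_degree = degree.most_common(1)[0]
    print(f"largest hub: node {hub_node} with degree {hub_degree} out of {N} nodes")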
425. Nonparametric maximum likelihood estimation for the multisample Wicksell corpuscle problem

    PubMed Central

    Chan, Kwun Chuen Gary; Qin, Jing

    2016-01-01

    We study nonparametric maximum likelihood estimation for the distribution of spherical radii using samples containing a mixture of one-dimensional, two-dimensional biased, and three-dimensional unbiased observations. Since direct maximization of the likelihood function is intractable, we propose an expectation-maximization algorithm for implementing the estimator, which handles an indirect measurement problem and a sampling bias problem separately in the E- and M-steps, and circumvents the need to solve an Abel-type integral equation, which creates numerical instability in the one-sample problem. Extensions to ellipsoids are studied and connections to multiplicative censoring are discussed. PMID:27279657

426. Semiparametric maximum likelihood for nonlinear regression with measurement errors.

    PubMed

    Suh, Eun-Young; Schafer, Daniel W

    2002-06-01

    This article demonstrates semiparametric maximum likelihood estimation of a nonlinear growth model for fish lengths using imprecisely measured ages. Data on the species corvina reina, found in the Gulf of Nicoya, Costa Rica, consist of lengths and imprecise ages for 168 fish and precise ages for a subset of 16 fish. The statistical problem may therefore be classified as nonlinear errors-in-variables regression with internal validation data. Inferential techniques are based on ideas extracted from several previous works on semiparametric maximum likelihood for errors-in-variables problems. The illustrative example clarifies practical aspects of the associated computational, inferential, and data analytic techniques.

427. Maximum-likelihood block detection of noncoherent continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K.; Divsalar, Dariush

    1993-01-01

    This paper examines maximum-likelihood block detection of uncoded full-response CPM over an additive white Gaussian noise (AWGN) channel. Both the maximum-likelihood metrics and the bit error probability performances of the associated detection algorithms are considered. The special and popular case of minimum-shift keying (MSK), corresponding to h = 0.5 and a constant-amplitude frequency pulse, is treated separately. The many new receiver structures that result from this investigation can be compared to the traditional ones that have been used in the past, both from the standpoint of simplicity of implementation and optimality of performance.

428. Comparison of induced rules based on likelihood estimation

    NASA Astrophysics Data System (ADS)

    Tsumoto, Shusaku

    2002-03-01

    Rule induction methods have been applied to knowledge discovery in databases and data mining. The empirical results obtained show that they are very powerful and that important knowledge has been extracted from datasets. However, comparison and evaluation of rules are based not on statistical evidence but on rather naive indices, such as conditional probabilities and functions of conditional probabilities. In this paper, we introduce two approaches to statistical comparison of induced rules. For the statistical evaluation, the likelihood ratio test and Fisher's exact test play an important role: the likelihood ratio statistic measures statistical information about an information table, and it is used to measure the difference between two tables.
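Both tests named above are available in SciPy; the sketch below applies them to a small made-up contingency table (rule fires or not versus class label) purely to show the calls, not to reproduce the paper's rule-comparison procedure.

    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    # Made-up 2x2 table: rows = rule fires / does not fire, columns = class 1 / class 0.
    table = np.array([[30, 10],
                      [20, 40]])

    # Likelihood ratio (G) test: the power-divergence statistic with the "log-likelihood" lambda.
    g_stat, g_p, dof, _ = chi2_contingency(table, lambda_="log-likelihood")
    # Fisher's exact test on the same table.
    odds_ratio, fisher_p = fisher_exact(table)

    print(f"G statistic = {g_stat:.2f} (p = {g_p:.4f}, dof = {dof})")
    print(f"Fisher's exact test: odds ratio = {odds_ratio:.2f}, p = {fisher_p:.4f}")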
429. Parameter estimation in X-ray astronomy using maximum likelihood

    NASA Technical Reports Server (NTRS)

    Wachter, K.; Leach, R.; Kellogg, E.

    1979-01-01

    Methods of estimation of parameter values and confidence regions by maximum likelihood and Fisher efficient scores, starting from Poisson probabilities, are developed for the nonlinear spectral functions commonly encountered in X-ray astronomy. It is argued that these methods offer significant advantages over the commonly used minimum chi-squared alternatives because they rely on less pervasive statistical approximations and so may be expected to remain valid for data of poorer quality. Extensive numerical simulations of the maximum likelihood method are reported which verify that the best-fit parameter value and confidence region calculations are correct over a wide range of input spectra.
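A minimal sketch of Poisson-likelihood spectral fitting in this spirit: minimize the Poisson negative log-likelihood (equivalent, up to a data-only constant, to the Cash statistic) for a power-law-plus-constant model. The model form, the simulated counts, and the use of SciPy's optimizer are assumptions for illustration, not the paper's procedure.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln

    rng = np.random.default_rng(4)
    energy = np.linspace(1.0, 10.0, 40)                    # keV bin centers (assumed)
    true_model = 50.0 * energy**-1.7 + 2.0                 # power law plus constant background
    counts = rng.poisson(true_model)

    def neg_log_likelihood(params):
        norm, index, bkg = params
        model = norm * energy**(-index) + bkg
        if np.any(model <= 0):
            return np.inf
        # Poisson: -sum[ y*log(m) - m - log(y!) ]
        return -np.sum(counts * np.log(model) - model - gammaln(counts + 1.0))

    fit = minimize(neg_log_likelihood, x0=[30.0, 1.0, 1.0], method="Nelder-Mead")
    print("best-fit (norm, index, background):", np.round(fit.x, 2))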
430. Unconventional neutron sources for oil well logging

    NASA Astrophysics Data System (ADS)

    Frankle, C. M.; Dale, G. E.

    2013-09-01

    Americium-Beryllium (AmBe) radiological neutron sources have been widely used in the petroleum industry for well logging purposes. There is a strong desire on the part of various governmental and regulatory bodies to find alternate sources, due to the high activity and small size of AmBe sources. Other neutron sources are available, both radiological (252Cf) and electronic accelerator-driven (D-D and D-T). All of these, however, have substantially different neutron energy spectra from AmBe and thus cause significantly different responses in well logging tools. We report on simulations performed using unconventional sources and techniques to attempt to better replicate the porosity and carbon/oxygen ratio responses a well logging tool would see from AmBe neutrons. The AmBe response of these two types of tools is compared to the response from 252Cf, D-D, D-T, filtered D-T, and T-T sources.

431. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.

432. Lithologic logs and geophysical logs from test drilling in Palm Beach County, Florida, since 1974

    USGS Publications Warehouse

    Swayze, Leo J.; McGovern, Michael C.; Fischer, John N.

    1980-01-01

    Test-hole data that may be used to determine the hydrogeology of the zone of high permeability in Palm Beach County, Fla., are presented. Lithologic logs from 46 test wells and geophysical logs from 40 test wells are contained in this report. (USGS)

433. LogSafe and Smart: Minnesota OSHA's LogSafe Program Takes Root.

    ERIC Educational Resources Information Center

    Honerman, James

    1999-01-01

    Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators.
    An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…

Relationships between log N-log S and celestial distribution of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Nishimura, J.; Yamagami, T.

    1985-01-01

    The apparent conflict between the log N-log S curve and the isotropic celestial distribution of gamma-ray bursts is discussed. A possible selection effect due to the time profile of each burst is examined. It is shown that the contradiction is due to this selection effect of the gamma-ray bursts.

Development of pulsed neutron uranium logging instrument.

    PubMed

    Wang, Xin-guang; Liu, Dan; Zhang, Feng

    2015-03-01

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of 235U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.

Development of pulsed neutron uranium logging instrument

    SciTech Connect

    Wang, Xin-guang; Liu, Dan; Zhang, Feng

    2015-03-15

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of 235U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells.
    The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.

Compacting a Kentucky coal for quality logs

    SciTech Connect

    Lin, Y.; Li, Z.; Mao, S.

    1999-07-01

    A Kentucky coal was found to be more difficult to compact into large, strong logs. The study showed that the compaction parameters affecting the strength of compacted coal logs can be categorized into three groups. The first group is inherent coal properties such as elasticity and coefficient of friction, the second group is machine properties such as mold geometry, and the third group is coal mixture preparation parameters such as particle size distribution. Theoretical analysis showed that an appropriate backpressure can reduce the surface cracks that occur during ejection. This has been confirmed by the experiments conducted.

Log ASCII Standard (LAS) Files for Geophysical Wireline Well Logs and Their Application to Geologic Cross Sections Through the Central Appalachian Basin

    USGS Publications Warehouse

    Crangle, Robert D.

    2007-01-01

    Introduction: The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001; Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to the Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog.
    Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).

Selective logging: does the imprint remain on tree structure and composition after 45 years?

    PubMed

    Osazuwa-Peters, Oyomoare L; Chapman, Colin A; Zanne, Amy E

    2015-01-01

    Selective logging of tropical forests is increasing in extent and intensity. The duration over which impacts of selective logging persist, however, remains an unresolved question, particularly for African forests. Here, we investigate the extent to which a past selective logging event continues to leave its imprint on different components of an East African forest 45 years later. We inventoried 2358 stems ≥10 cm in diameter in 26 plots (200 m × 10 m) within a 5.2 ha area in Kibale National Park, Uganda, in logged and unlogged forest. In these surveys, we characterized the forest light environment, taxonomic composition, functional trait composition using three traits (wood density, maximum height and maximum diameter) and forest structure based on three measures (stem density, total basal area and total above-ground biomass). In comparison to unlogged forests, selectively logged forest plots in Kibale National Park on average had higher light levels, different structure characterized by lower stem density, lower total basal area and lower above-ground biomass, and a distinct taxonomic composition driven primarily by changes in the relative abundance of species. Conversely, selectively logged forest plots were like unlogged plots in functional composition, having similar community-weighted mean values for wood density, maximum height and maximum diameter. This similarity in functional composition irrespective of logging history may be due to functional recovery of logged forest or background changes in functional attributes of unlogged forest. Despite the passage of 45 years, the legacy of selective logging on the tree community in Kibale National Park is still evident, as indicated by distinct taxonomic and structural composition and reduced carbon storage in logged forest compared with unlogged forest.
    The effects of selective logging are exerted via influences on tree demography rather than functional trait composition.

Selective logging: does the imprint remain on tree structure and composition after 45 years?

    PubMed Central

    Osazuwa-Peters, Oyomoare L.; Chapman, Colin A.; Zanne, Amy E.

    2015-01-01

    Selective logging of tropical forests is increasing in extent and intensity. The duration over which impacts of selective logging persist, however, remains an unresolved question, particularly for African forests. Here, we investigate the extent to which a past selective logging event continues to leave its imprint on different components of an East African forest 45 years later. We inventoried 2358 stems ≥10 cm in diameter in 26 plots (200 m × 10 m) within a 5.2 ha area in Kibale National Park, Uganda, in logged and unlogged forest. In these surveys, we characterized the forest light environment, taxonomic composition, functional trait composition using three traits (wood density, maximum height and maximum diameter) and forest structure based on three measures (stem density, total basal area and total above-ground biomass). In comparison to unlogged forests, selectively logged forest plots in Kibale National Park on average had higher light levels, different structure characterized by lower stem density, lower total basal area and lower above-ground biomass, and a distinct taxonomic composition driven primarily by changes in the relative abundance of species. Conversely, selectively logged forest plots were like unlogged plots in functional composition, having similar community-weighted mean values for wood density, maximum height and maximum diameter. This similarity in functional composition irrespective of logging history may be due to functional recovery of logged forest or background changes in functional attributes of unlogged forest. Despite the passage of 45 years, the legacy of selective logging on the tree community in Kibale National Park is still evident, as indicated by distinct taxonomic and structural composition and reduced carbon storage in logged forest compared with unlogged forest. The effects of selective logging are exerted via influences on tree demography rather than functional trait composition.
    PMID:27293697

Determination of log P values of new cyclen based antimalarial drug leads using RP-HPLC.

    PubMed

    Rudraraju, A V; Amoyaw, P N A; Hubin, T J; Khan, M O F

    2014-09-01

    Lipophilicity, expressed by log P, is an important physicochemical property of drugs that affects many biological processes, including drug absorption and distribution. The main purpose of this study was to determine the log P values of newly discovered drug leads using reversed-phase high-performance liquid chromatography (RP-HPLC). The reference standards, with varying polarity ranges, were dissolved in methanol and analyzed by RP-HPLC using a C18 column. The mobile phase consisted of a mixture of acetonitrile, methanol and water in a gradient elution mode. A calibration curve was plotted between the experimental log P values and the obtained log k values of the reference standard compounds, and a best-fit line was obtained. The log k values of the new drug leads were determined in the same solvent system and were used to calculate the respective log P values by using the best-fit equation. The log P vs. log k data gave a best-fit linear curve with an R² of 0.9786, with P values for the intercept and slope of 1.19 × 10^-6 and 1.56 × 10^-10, respectively, at the 0.05 level of significance. Log P values of 15 new drug leads and related compounds, all of which are derivatives of macrocyclic polyamines and their metal complexes, were determined. The values obtained are closely related to the calculated log P (Clog P) values from ChemDraw Ultra 12.0.
    This experiment provided efficient, fast, and reasonable estimates of the log P values of the new drug leads by using RP-HPLC.
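    The calibration described in this abstract is, at its core, a straight-line fit of experimental log P against measured log k, followed by inverse prediction for the new leads. The sketch below shows that arithmetic with invented reference values; the numbers are not the study's data.

        import numpy as np

        # Hypothetical reference standards: measured retention (log k) and literature log P.
        log_k_ref = np.array([-0.42, -0.11, 0.25, 0.58, 0.91, 1.30])
        log_p_ref = np.array([ 0.3,   1.0,  1.9,  2.7,  3.5,  4.6])

        # Least-squares calibration line: log P = slope * log k + intercept.
        slope, intercept = np.polyfit(log_k_ref, log_p_ref, 1)

        # R-squared of the calibration, analogous to the 0.9786 reported in the abstract.
        pred = slope * log_k_ref + intercept
        r2 = 1 - np.sum((log_p_ref - pred) ** 2) / np.sum((log_p_ref - log_p_ref.mean()) ** 2)

        # Estimate log P for new drug leads from their measured log k values.
        log_k_new = np.array([0.05, 0.77, 1.12])
        log_p_new = slope * log_k_new + intercept
        print(round(r2, 4), log_p_new)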
Performance and Improved Design of the Log Spiral of Revolution Monochromator

    SciTech Connect

    Pease, D. M.; Shanthakumar, P.; Huang, T.; Budnick, J. I.; Odong, O.; Frenkel, A. I.; Abitbol, N.; Balasubramanian, M.; Brewer, D.

    2007-02-02

    We show that, whereas the present log spiral of revolution monochromator works well for Cr edges of 2.8% Cr2O3 in a V2O3 matrix, the device transmits noticeable V extended structure in the case of 0.365% Cr2O3. We demonstrate that the transmitted V extended structure is due to the V Kβ line, which is unresolved by the monochromator. It is suggested that this limitation may be overcome by designing a log spiral detector for the Cr Kβ line rather than the Cr Kα line. Aspects of the design of this modified log spiral are discussed.

Antedependence models for nonstationary categorical longitudinal data with ignorable missingness: likelihood-based inference.

    PubMed

    Xie, Yunlong; Zimmerman, Dale L

    2013-08-30

    Time index-ordered random variables are said to be antedependent (AD) of order (p1, p2, …, pn) if the kth variable, conditioned on the pk immediately preceding variables, is independent of all further preceding variables. Inferential methods associated with AD models are well developed for continuous (primarily normal) longitudinal data, but not for categorical longitudinal data. In this article, we develop likelihood-based inferential procedures for unstructured AD models for categorical longitudinal data. Specifically, we derive maximum likelihood estimators (MLEs) of model parameters; penalized likelihood criteria and likelihood ratio tests for determining the order of antedependence; and likelihood ratio tests for homogeneity across groups, time invariance of transition probabilities, and strict stationarity. We give closed-form expressions for MLEs and test statistics, which allow for the possibility of empty cells and monotone missing data, for all cases save strict stationarity. For data with an arbitrary missingness pattern, we derive an efficient restricted expectation-maximization algorithm for obtaining MLEs. We evaluate the performance of the tests by simulation. We apply the methods to longitudinal studies of toenail infection severity (measured on a binary scale) and Alzheimer's disease severity (measured on an ordinal scale). The analysis of the toenail infection severity data reveals interesting nonstationary behavior of the transition probabilities and indicates that an unstructured first-order AD model is superior to stationary and other structured first-order AD models that have previously been fit to these data. The analysis of the Alzheimer's severity data indicates that the antedependence is second order with time-invariant transition probabilities, suggesting the use of a second-order autoregressive cumulative logit model.
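    The order-selection and homogeneity tests mentioned in this abstract are likelihood ratio tests, which compare -2 times the log-likelihood difference of nested models against a chi-squared reference distribution. A generic sketch (not the authors' closed-form antedependence statistics), using hypothetical fitted log-likelihoods:

        import numpy as np
        from scipy.stats import chi2

        def likelihood_ratio_test(loglik_restricted, loglik_full, df_diff):
            """Generic LRT: -2 log Lambda referred to a chi-squared with df_diff degrees of freedom."""
            lr_stat = -2.0 * (loglik_restricted - loglik_full)
            p_value = chi2.sf(lr_stat, df_diff)
            return lr_stat, p_value

        # Hypothetical fitted log-likelihoods for first-order vs. second-order antedependence models.
        stat, p = likelihood_ratio_test(loglik_restricted=-412.7, loglik_full=-405.2, df_diff=4)
        print(stat, p)   # a small p-value favors the higher-order model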
Maximum Likelihood Shift Estimation Using High Resolution Polarimetric SAR Clutter Model

    NASA Astrophysics Data System (ADS)

    Harant, Olivier; Bombrun, Lionel; Vasile, Gabriel; Ferro-Famil, Laurent; Gay, Michel

    2011-03-01

    This paper deals with a Maximum Likelihood (ML) shift estimation method in the context of High Resolution (HR) Polarimetric SAR (PolSAR) clutter. Texture modeling is exposed and the generalized ML texture tracking method is extended to the merging of various sensors. Some results on displacement estimation on the Argentiere glacier in the Mont Blanc massif using dual-pol TerraSAR-X (TSX) and quad-pol RADARSAT-2 (RS2) sensors are finally discussed.

Preliminary report on geophysical well-logging activity on the Salton Sea Scientific Drilling Project, Imperial Valley, California

    USGS Publications Warehouse

    Paillet, Frederick L.; Morin, R.H.; Hodges, H.E.

    1986-01-01

    The Salton Sea Scientific Drilling Project has culminated in a 10,564-ft-deep test well, State 2-14, in the Imperial Valley of southern California. A comprehensive scientific program of drilling, coring, and downhole measurements, which was conducted for about 5 months, has obtained much scientific information concerning the physical and chemical processes associated with an active hydrothermal system. This report primarily focuses on the geophysical logging activities at the State 2-14 well and provides early dissemination of geophysical data to other investigators working on complementary studies. Geophysical-log data were obtained by a commercial logging company and by the U.S. Geological Survey (USGS). Most of the commercial logs were obtained during three visits to the site; only one commercial log was obtained below a depth of 6,000 ft. The commercial logs obtained were dual induction, natural gamma, compensated neutron formation density, caliper, and sonic. The USGS logging effort consisted of four primary periods, with many logs extending below a depth of 6,000 ft. The USGS logs obtained were temperature, caliper, natural gamma, gamma spectral, epithermal neutron, acoustic velocity, full-waveform, and acoustic televiewer. Various problems occurred throughout the drilling phase of the Salton Sea Scientific Drilling Project that made successful logging difficult: (1) borehole constrictions, possibly resulting from mud coagulation, (2) maximum temperatures of about 300 °C, and (3) borehole conditions unfavorable for logging because of numerous zones of fluid loss, cement plugs, and damage caused by repeated trips in and out of the hole. These factors hampered and compromised logging quality at several open-hole intervals. The quality of the logs was dependent on the degree of probe sophistication and sensitivity to borehole-wall conditions. Digitized logs presented were processed on site and are presented in increments of 1,000 ft. A summary of the numerous…

Maximum Likelihood Detection of Electro-Optic Moving Targets

    DTIC Science & Technology

    1992-01-16

    The description of a maximum likelihood algorithm to detect moving targets in electro-optic data is presented. The algorithm is based on processing...optimum algorithm to determine the performance loss. A processing architecture concept is also described. Electro-optic sensor, detection, infrared sensor, moving target, binary integration, velocity filter.

Multiple Cognitive Control Effects of Error Likelihood and Conflict

    PubMed Central

    Brown, Joshua W.

    2010-01-01

    Recent work on cognitive control has suggested a variety of performance monitoring functions of the anterior cingulate cortex, such as errors, conflict, error likelihood, and others. Given the variety of monitoring effects, a corresponding variety of control effects on behavior might be expected. This paper explores whether conflict and error likelihood produce distinct cognitive control effects on behavior, as measured by response time. A change signal task (Brown & Braver, 2005) was modified to include conditions of likely errors due to tardy as well as premature responses, in conditions with and without conflict. The results discriminate between competing hypotheses of independent vs. interacting conflict and error likelihood control effects. Specifically, the results suggest that the likelihood of premature vs. tardy response errors can lead to multiple distinct control effects, which are independent of cognitive control effects driven by response conflict.
    As a whole, the results point to the existence of multiple distinct cognitive control mechanisms and challenge existing models of cognitive control that incorporate only a single control signal. PMID:19030873

Likelihood Methods for Adaptive Filtering and Smoothing. Technical Report #455.

    ERIC Educational Resources Information Center

    Butler, Ronald W.

    The dynamic linear model or Kalman filtering model provides a useful methodology for predicting the past, present, and future states of a dynamic system, such as an object in motion or an economic or social indicator that is changing systematically with time. Recursive likelihood methods for adaptive Kalman filtering and smoothing are developed.…
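    For the dynamic linear (Kalman filtering) model referred to in this entry, the likelihood is typically accumulated from the one-step-ahead prediction errors (innovations) produced by the filter. The scalar local-level example below is a minimal sketch of that idea with made-up noise variances; it is not the report's adaptive estimator.

        import numpy as np

        def kalman_loglik(y, q=0.1, r=0.5, x0=0.0, p0=1.0):
            """Scalar local-level model: x_t = x_{t-1} + w_t, y_t = x_t + v_t.
            Returns filtered states and the Gaussian log-likelihood built from innovations."""
            x, p, loglik = x0, p0, 0.0
            filtered = []
            for obs in y:
                # Predict: the state is a random walk, so only its variance grows.
                p = p + q
                # Innovation (prediction error) and its variance.
                innov = obs - x
                s = p + r
                loglik += -0.5 * (np.log(2 * np.pi * s) + innov ** 2 / s)
                # Update.
                k = p / s
                x = x + k * innov
                p = (1 - k) * p
                filtered.append(x)
            return np.array(filtered), loglik

        rng = np.random.default_rng(2)
        truth = np.cumsum(rng.normal(0, 0.3, 100))
        y = truth + rng.normal(0, 0.7, 100)
        states, ll = kalman_loglik(y)
        print(ll)   # maximizing this over (q, r) gives ML estimates of the noise variances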
Maximum Marginal Likelihood Estimation for Semiparametric Item Analysis.

    ERIC Educational Resources Information Center

    Ramsay, J. O.; Winsberg, S.

    1991-01-01

    A method is presented for estimating the item characteristic curve (ICC) using polynomial regression splines. Estimation of spline ICCs is described by maximizing the marginal likelihood formed by integrating ability over a beta prior distribution. Simulation results compare this approach with the joint estimation of ability and item parameters.…

Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

Mixture Rasch Models with Joint Maximum Likelihood Estimation

    ERIC Educational Resources Information Center

    Willse, John T.

    2011-01-01

    This research provides a demonstration of the utility of mixture Rasch models. Specifically, a model capable of estimating a mixture partial credit model using joint maximum likelihood is presented. Like the partial credit model, the mixture partial credit model has the beneficial feature of being appropriate for analysis of assessment data…

Maximum-likelihood estimation of admixture proportions from genetic data.

    PubMed Central

    Wang, Jinliang

    2003-01-01

    For an admixed population, an important question is how much genetic contribution comes from each parental population. Several methods have been developed to estimate such admixture proportions, using data on genetic markers sampled from parental and admixed populations. In this study, I propose a likelihood method to estimate jointly the admixture proportions, the genetic drift that occurred to the admixed population and each parental population during the period between the hybridization and sampling events, and the genetic drift in each ancestral population within the interval between their split and hybridization. The results from extensive simulations using various combinations of relevant parameter values show that in general much more accurate and precise estimates of admixture proportions are obtained from the likelihood method than from previous methods. The likelihood method also yields reasonable estimates of genetic drift that occurred to each population, which translate into relative effective sizes (Ne) or absolute average Ne's if the times when the relevant events (such as population split, admixture, and sampling) occurred are known. The proposed likelihood method also has features such as relatively low computational requirement compared with previous ones, flexibility for admixture models, and marker types. In particular, it allows for missing data from a contributing parental population. The method is applied to a human data set and a wolflike canids data set, and the results obtained are discussed in comparison with those from other estimators and from previous studies. PMID:12807794
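    The heart of such estimators is a binomial likelihood in which the admixed population's allele frequency at each locus is a mixture p·q1 + (1 - p)·q2 of the two parental frequencies. The sketch below maximizes that simplified likelihood for a single admixture proportion and ignores the drift terms the full method also estimates; all frequencies and counts are invented for illustration.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Hypothetical parental allele frequencies at 5 loci and allele counts in the admixed sample.
        q1 = np.array([0.10, 0.80, 0.35, 0.60, 0.25])   # parental population 1
        q2 = np.array([0.70, 0.20, 0.90, 0.15, 0.55])   # parental population 2
        hits = np.array([25, 52, 48, 40, 33])           # copies of the reference allele observed
        n = 100                                         # chromosomes sampled per locus

        def neg_log_lik(p):
            q_mix = p * q1 + (1 - p) * q2               # expected frequency under admixture proportion p
            return -np.sum(hits * np.log(q_mix) + (n - hits) * np.log(1 - q_mix))

        res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")
        print(res.x)   # ML estimate of the contribution from parental population 1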
Cases In Which Ancestral Maximum Likelihood Will Be Confusingly Misleading.

    PubMed

    Handelman, Tomer; Chor, Benny

    2017-03-02

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: there are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent or not has been open for a long time. Mossel, Roch, and Steel (2009) have shown that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: for some simple four-taxon (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero and are not adjacent, so this resolved tree is in fact a simple path. While for MP the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion.

A Unified Maximum Likelihood Approach to Document Retrieval.

    ERIC Educational Resources Information Center

    Bodoff, David; Enache, Daniel; Kambil, Ajit; Simon, Gary; Yukhimets, Alex

    2001-01-01

    Addresses the query- versus document-oriented dichotomy in information retrieval. Introduces a maximum likelihood approach to utilizing feedback data that can be used to construct a concrete object function that estimates both document and query parameters in accordance with all available feedback data. (AEF)

1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1.
    GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING CEDAR LOGS FROM TRUCKS AT LOG DUMP, ADJACENT TO MILL; TRUCKS FORMERLY USED TRIP STAKES, THOUGH FOR SAFER HANDLING OF LOGS WELDED STAKES ARE NOW REQUIRED; AS A RESULT LOADING IS NOW DONE WITH A CRANE - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Department of the Navy, United States Navy Regulations and... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Broadcast Services, Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Department of the Navy, United States Navy Regulations and... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Department of the Navy, United States Navy Regulations and... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Broadcast Services, Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Department of the Navy, United States Navy Regulations and... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Broadcast Services, Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Broadcast Services, Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Broadcast Services, Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Department of the Navy, United States Navy Regulations and... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

Dissociation behavior of Methane Hydrate presumed by NMR log analysis

    NASA Astrophysics Data System (ADS)

    Kotera, T.

    2015-12-01

    This study was conducted with the aim of understanding the dissociation behavior of methane hydrate (MH). The production test was operated at the Daini-Atsumi knoll in the eastern Nankai Trough, Japan. We compared the NMR log data acquired from three wells, drilled before the test (AT1-MC) and after the test (AT1-LWD1 and AT1-LWD2). NMR logging measures the T2 relaxation time and yields porosity and permeability. We focused in particular on the T2 distribution. MH is assumed to occupy the larger pore space in the sandy sediment, because the T2 distribution in the MH-bearing layer at AT1-MC has no peak at longer relaxation times. However, a T2 peak above 33 ms appeared at AT1-LWD1 and AT1-LWD2. This phenomenon is observed in the thin MH-bearing layers rather than in the thick one. In the thick MH-bearing layer, no T2 peak appeared above 33 ms, but the T2 distribution shifts toward longer relaxation times within the short-time interval. Hence, it is assumed that the MH dissociated from the faces in contact with the grains. In terms of thermal conductivity, the grain-grain contact faces are more prone to dissociation than the MH-grain contacts; however, both dissociation zones are essentially MH-grain contact faces. Little or no MH was observed in the muddy layers during the coring campaign near these wells. In summary, NMR logging detected various changes in the T2 distribution that appear to indicate the dissociation of MH, and these data are consistent with other log and monitoring data, namely resistivity and temperature measurements. However, because the logging data were acquired at different locations, it is possible that the initial conditions were originally distinct.
    This research was conducted as a part of the MH21 research, and the authors would like to express their sincere appreciation to MH21 and the Ministry of Economy, Trade and Industry for providing the permission to disclose this research.

There's Life in Those Dead Logs!

    ERIC Educational Resources Information Center

    Biggs, Devin; Miller, Todd; Hall, Dee

    2006-01-01

    Although it is unspectacular in appearance, dead wood is one of the most ecologically important resources in forests. Fallen logs, dead standing trees, stumps, and even cavities in live trees fulfill a wide range of roles. Prominent among these is that they provide habitat for many organisms, especially insects. Fourth-grade students at Fox…

Precision prediction of the log power spectrum

    NASA Astrophysics Data System (ADS)

    Repp, A.; Szapudi, I.

    2017-01-01

    At translinear scales, the log power spectrum captures significantly more cosmological information than the standard power spectrum. At high wavenumbers k, the Fisher information in the standard power spectrum P(k) fails to increase in proportion to k, in part due to correlations between large- and small-scale modes. As a result, P(k) suffers from an information plateau on these translinear scales, so that analysis with the standard power spectrum cannot access the information contained in these small-scale modes. The log power spectrum P_A(k), on the other hand, captures the majority of this otherwise lost information. Until now there has been no means of predicting the amplitude of the log power spectrum apart from cataloging the results of simulations. We here present a cosmology-independent prescription for the log power spectrum; this prescription displays accuracy comparable to that of Smith et al., over a range of redshifts and smoothing scales, and for wavenumbers up to 1.5 h Mpc^-1.
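The log power spectrum in the record above is simply the power spectrum of the log-transformed density field, A = ln(1 + δ). The following is a minimal sketch of how one might estimate it from a gridded density field with NumPy; the grid, volume normalization convention, and binning are illustrative assumptions, not part of the cited prescription.

    import numpy as np

    def log_power_spectrum(delta, box_size, n_bins=30):
        """Estimate the power spectrum of A = ln(1 + delta) on a cubic grid.

        delta    : 3-D array of density contrasts (delta > -1)
        box_size : side length of the box (e.g. in Mpc/h)
        Returns (k_centers, P_A); empty k-bins yield nan.
        """
        n = delta.shape[0]
        a = np.log(1.0 + delta)          # log-transformed field
        a -= a.mean()                    # remove the mean of A
        a_k = np.fft.fftn(a)             # unnormalized DFT
        # |A_k|^2 with a volume normalization of L^3 / N^6
        power = np.abs(a_k) ** 2 * (box_size / n**2) ** 3

        # Radial wavenumber of every grid mode
        freqs = np.fft.fftfreq(n, d=box_size / n) * 2 * np.pi
        kx, ky, kz = np.meshgrid(freqs, freqs, freqs, indexing="ij")
        k = np.sqrt(kx**2 + ky**2 + kz**2)

        # Spherically average into k-bins
        bins = np.linspace(0.0, k.max(), n_bins)
        which = np.digitize(k.ravel(), bins)
        P_A = np.array([power.ravel()[which == i].mean() if (which == i).any() else np.nan
                        for i in range(1, len(bins))])
        k_centers = 0.5 * (bins[1:] + bins[:-1])
        return k_centers, P_A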
[Human development and log-periodic law]

    PubMed

    Cash, Roland; Chaline, Jean; Nottale, Laurent; Grou, Pierre

    2002-05-01

    We suggest applying the log-periodic law formerly used to describe various crisis phenomena, in biology (evolutionary leaps), inorganic systems (earthquakes), societies and economy (economic crises, market crashes) to the various steps of human ontogeny. We find a statistically significant agreement between this model and the data.

29 CFR 1910.266 - Logging operations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... other device is then hooked to the lifting or pulling machine. Danger tree. A standing tree that... delivery, such as, but not limited to, marking danger trees and trees/logs to be cut to length, felling... danger tree shall be felled, removed or avoided. Each danger tree, including lodged trees and...

Data logging technology in ambulatory medical instrumentation.

    PubMed

    Anderson, R; Lyons, G M

    2001-05-01

    This paper reviews the advancements made in ambulatory data logging used in the study of human subjects since the inception of the analogue tape based data logger in the 1960s. Research into the area of ambulatory monitoring has been rejuvenated by the development of novel storage technologies during the 1990s. Data logging systems that were previously impractical due to lack of processing power, practical size and cost are now available to the practitioner. An overview of the requirements of present-day ambulatory data logging is presented, and analogue tape, solid-state memory and disk drive storage recording systems that have been described in the literature are investigated in detail.
    It is proposed that digital-based technology offers the best solution to the problems encountered during human-based data logging. The appearance of novel digital storage media will continue the trend of increased recording durations, signal resolution and number of parameters, thus allowing the momentum gained throughout the last several decades to continue.

Modelling tropical forests response to logging

    NASA Astrophysics Data System (ADS)

    Cazzolla Gatti, Roberto; Di Paola, Arianna; Valentini, Riccardo; Paparella, Francesco

    2013-04-01

    Tropical rainforests are among the ecosystems most threatened by large-scale fragmentation due to human activity such as heavy logging and agricultural clearance, although they provide crucial ecosystem goods and services, such as sequestering carbon from the atmosphere, protecting watersheds and conserving biodiversity. In several countries forest resource extraction has experienced a shift from clearcutting to selective logging to maintain a significant forest cover and understock of living biomass. However, knowledge of the short- and long-term effects of removing selected species in tropical rainforests is scarce and needs to be further investigated. One of the main effects of selective logging on forest dynamics seems to be the local disturbance which involves the invasion of open space by weeds, vines and climbers at the expense of the late-successional cenosis. We present a simple deterministic model that describes the dynamics of tropical rainforest subject to selective logging, to understand how and why weeds displace native species. We argue that the selective removal of the tallest tropical trees creates gaps of light that allow weeds, vines and climbers to prevail over native species, inhibiting the possibility of recovery of the original vegetation. Our results show that different regime shifts may occur depending on the type of forest management adopted. This hypothesis is supported by a dataset of tree heights and weed/vine cover that we collected from 9 plots located in Central and West Africa, in both untouched and managed areas.

The Design Log: A New Informational Tool

    ERIC Educational Resources Information Center

    Spivak, Mayer

    1978-01-01

    The design log is a record of observations, diagnoses, prescriptions, and performance specifications for each space in a structure. It is a systematic approach to design that integrates information about user needs with traditional architectural programming and design. (Author/MLF)
A distributed design for monitoring, logging, and replaying device readings at LAMPF

    SciTech Connect

    Burns, M.

    1991-01-01

    As control of the Los Alamos Meson Physics linear accelerator and Proton Storage Ring moves to a more distributed system, it has been necessary to redesign the software which monitors, logs, and replays device readings throughout the facility. The new design allows devices to be monitored and their readings logged locally on a network of computers. Control of the monitoring and logging process is available throughout the network from user interfaces which communicate via remote procedure calls with server processes running on each node which monitors and records device readings. Similarly, the logged data can be replayed from anywhere on the network. Two major requirements influencing the final design were the need to reduce the load on the CPU of the control machines, and the need for much faster replay of the logged device readings. 1 ref., 2 figs.

GEOLOGICAL APPLICATIONS OF WELL LOGS: AN INTRODUCTORY BIBLIOGRAPHY AND SURVEY OF THE WELL LOGGING LITERATURE THROUGH SEPTEMBER 1986 ARRANGED BY SUBJECT.

    USGS Publications Warehouse

    Prensky, Stephen E.

    1987-01-01

    This report includes over 1,350 individual citations as well as a first-author index. The purpose of this bibliography is twofold: 1) to provide a basic, first-stop resource on well logging which the non-specialist, i.e., the geoscientist, can consult, and 2) to provide a reference on geologic applications for the non-geoscientist, i.e., the log analyst or petroleum engineer, as well as for the geoscientist.

PROCOV: maximum likelihood estimation of protein phylogeny under covarion models and site-specific covarion pattern analysis

    PubMed Central

    Wang, Huai-Chun; Susko, Edward; Roger, Andrew J

    2009-01-01

    Background: The covarion hypothesis of molecular evolution holds that selective pressures on a given amino acid or nucleotide site are dependent on the identity of other sites in the molecule that change throughout time, resulting in changes of evolutionary rates of sites along the branches of a phylogenetic tree.
    At the sequence level, covarion-like evolution at a site manifests as conservation of nucleotide or amino acid states among some homologs where the states are not conserved in other homologs (or groups of homologs). Covarion-like evolution has been shown to relate to changes in functions at sites in different clades, and, if ignored, can adversely affect the accuracy of phylogenetic inference. Results: PROCOV (protein covarion analysis) is a software tool that implements a number of previously proposed covarion models of protein evolution for phylogenetic inference in a maximum likelihood framework. Several algorithmic and implementation improvements in this tool over previous versions make computationally expensive tree searches with covarion models more efficient and analyses of large phylogenomic data sets tractable. PROCOV can be used to identify covarion sites by comparing the site likelihoods under the covarion process to the corresponding site likelihoods under a rates-across-sites (RAS) process. Those sites with the greatest log-likelihood difference between a 'covarion' and an RAS process were found to be of functional or structural significance in a dataset of bacterial and eukaryotic elongation factors. Conclusion: Covarion models implemented in PROCOV may be especially useful for phylogenetic estimation when ancient divergences between sequences have occurred and rates of evolution at sites are likely to have changed over the tree. It can also be used to study lineage-specific functional shifts in protein families that result in changes in the patterns of site variability among subtrees. PMID:19737395
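The site-identification step described above rests on comparing per-site log-likelihoods under two competing processes. The sketch below is a hypothetical illustration of that comparison only; it assumes the per-site log-likelihoods have already been produced by some external phylogenetic engine, and the arrays and cutoff are invented, not PROCOV's actual interface.

    import numpy as np

    def covarion_site_scores(loglik_covarion, loglik_ras):
        """Per-site log-likelihood differences between a covarion and an RAS model.

        Both arguments are 1-D arrays of per-site log-likelihoods, one entry per
        alignment column, computed elsewhere by a maximum likelihood engine.
        """
        return np.asarray(loglik_covarion) - np.asarray(loglik_ras)

    if __name__ == "__main__":
        # Stand-in numbers: flag sites whose support for the covarion process
        # exceeds an illustrative cutoff of 2 log-likelihood units.
        rng = np.random.default_rng(0)
        ll_cov = rng.normal(-8.0, 1.0, size=200)
        ll_ras = ll_cov - rng.normal(0.3, 0.5, size=200)
        scores = covarion_site_scores(ll_cov, ll_ras)
        candidate_sites = np.flatnonzero(scores > 2.0)
        print(f"{candidate_sites.size} candidate covarion sites")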
Critical care procedure logging using handheld computers

    PubMed Central

    Carlos Martinez-Motta, J; Walker, Robin; Stewart, Thomas E; Granton, John; Abrahamson, Simon; Lapinsky, Stephen E

    2004-01-01

    Introduction: We conducted this study to evaluate the feasibility of implementing an internet-linked handheld computer procedure logging system in a critical care training program. Methods: Subspecialty trainees in the Interdepartmental Division of Critical Care at the University of Toronto received and were trained in the use of Palm handheld computers loaded with a customized program for logging critical care procedures. The procedures were entered into the handheld device using checkboxes and drop-down lists, and data were uploaded to a central database via the internet. To evaluate the feasibility of this system, we tracked the utilization of this data collection system. Benefits and disadvantages were assessed through surveys. Results: All 11 trainees successfully uploaded data to the central database, but only six (55%) continued to upload data on a regular basis. The most common reason cited for not using the system pertained to initial technical problems with data uploading. From 1 July 2002 to 30 June 2003, a total of 914 procedures were logged. Significant variability was noted in the number of procedures logged by individual trainees (range 13–242). The database generated by regular users provided potentially useful information to the training program director regarding the scope and location of procedural training among the different rotations and hospitals. Conclusion: A handheld computer procedure logging system can be effectively used in a critical care training program. However, user acceptance was not uniform, and continued training and support are required to increase user acceptance. Such a procedure database may provide valuable information that may be used to optimize trainees' educational experience and to document clinical training experience for licensing and accreditation. PMID:15469577

Requirements-Driven Log Analysis Extended Abstract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
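The record above treats a log as a sequence of events, each a mapping from field names to values, that can be checked against a specification. A minimal sketch of that idea in Python follows; the event fields and the "every request is eventually acknowledged" property are invented for illustration and are not taken from the cited work.

    # Check a simple temporal property over a log: every "request" event with a
    # given id must be followed, later in the log, by an "ack" with the same id.
    def check_requests_acked(log):
        """log: list of dicts, e.g. {"type": "request", "id": 7, "time": 0.0}."""
        pending = set()
        for event in log:
            if event.get("type") == "request":
                pending.add(event.get("id"))
            elif event.get("type") == "ack":
                pending.discard(event.get("id"))
        return pending   # ids still pending at the end of the log violate the property

    if __name__ == "__main__":
        log = [
            {"type": "request", "id": 1, "time": 0.0},
            {"type": "ack",     "id": 1, "time": 0.4},
            {"type": "request", "id": 2, "time": 0.9},
        ]
        print("unacknowledged requests:", check_requests_acked(log))   # -> {2}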
Cosmic Microwave Background Likelihood Approximation by a Gaussianized Blackwell-Rao Estimator

    NASA Astrophysics Data System (ADS)

    Rudjord, Ø.; Groeneboom, N. E.; Eriksen, H. K.; Huey, Greg; Górski, K. M.; Jewell, J. B.

    2009-02-01

    We introduce a new cosmic microwave background (CMB) temperature likelihood approximation called the Gaussianized Blackwell-Rao estimator. This estimator is derived by transforming the observed marginal power spectrum distributions obtained by the CMB Gibbs sampler into standard univariate Gaussians, and then approximating their joint transformed distribution by a multivariate Gaussian. The method is exact for full-sky coverage and uniform noise and an excellent approximation for sky cuts and scanning patterns relevant for modern satellite experiments such as the Wilkinson Microwave Anisotropy Probe (WMAP) and Planck. The result is a stable, accurate, and computationally very efficient CMB temperature likelihood representation that allows the user to exploit the unique error propagation capabilities of the Gibbs sampler to high ℓ. A single evaluation of this estimator between ℓ = 2 and 200 takes ~0.2 CPU milliseconds, while for comparison, a single pixel-space likelihood evaluation between ℓ = 2 and 30 for a map with ~2500 pixels requires ~20 s. We apply this tool to the five-year WMAP temperature data, and re-estimate the angular temperature power spectrum, C_ℓ, and likelihood, L(C_ℓ), for ℓ <= 200, and derive new cosmological parameters for the standard six-parameter ΛCDM model. Our spectrum is in excellent agreement with the official WMAP spectrum, but we find slight differences in the derived cosmological parameters. Most importantly, the spectral index of scalar perturbations is n_s = 0.973 ± 0.014, 1.9σ away from unity and 0.6σ higher than the official WMAP result, n_s = 0.965 ± 0.014. This suggests that an exact likelihood treatment is required to higher ℓ than previously believed, reinforcing and extending our conclusions from the three-year WMAP analysis. In that case, we found that the suboptimal likelihood approximation adopted between ℓ = 12 and 30 by the WMAP team biased n_s low by 0.4σ, while here we find that the same…
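The Gaussianization step described above (map each marginal to a standard normal, then model the joint distribution of the transformed variables as a multivariate Gaussian) can be illustrated generically. The sketch below assumes we already have posterior samples of each power-spectrum bin from some sampler; the rank-based probit transform and the omission of the Jacobian term are simplifications for illustration, not the cited estimator itself.

    import numpy as np
    from scipy.stats import norm

    def gaussianize(samples):
        """Map each column of `samples` (n_draws x n_bins) to approximately
        standard-normal marginals via its empirical CDF (ranks + probit)."""
        n = samples.shape[0]
        ranks = samples.argsort(axis=0).argsort(axis=0) + 1   # 1..n per column
        u = ranks / (n + 1.0)                                 # keep u away from 0 and 1
        return norm.ppf(u)

    def fit_joint_gaussian(z):
        """Fit the multivariate Gaussian approximating the joint distribution
        of the Gaussianized variables; returns (mean, covariance)."""
        return z.mean(axis=0), np.cov(z, rowvar=False)

    def gaussian_loglike(z_new, mean, cov):
        """Gaussian log-density of one Gaussianized point (Jacobian of the
        marginal transforms omitted in this sketch)."""
        d = len(mean)
        diff = z_new - mean
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (diff @ np.linalg.solve(cov, diff) + logdet + d * np.log(2 * np.pi))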
Improved production log interpretation in horizontal wells using pulsed neutron logs

    SciTech Connect

    Brady, J.L.; Kohring, J.J.; North, R.J.

    1996-12-31

    Production log flow profiles provide a valuable tool to evaluate well and reservoir performance. Horizontal wellbores and their associated completion designs present several challenges to profile interpretation for conventional production logging sensors and techniques. A unique approach combining pulsed neutron capture (PNC) log data with conventional production logging measurements is providing improved flow profile answers in slotted liner, horizontal well completions on the North Slope of Alaska. Identifying and eliminating undesirable gas production is one of the chief goals of production logging on the North Slope. This process becomes difficult in horizontal wellbores as fluid segregation affects the area investigated by the various logging sensors and also the velocities of the individual phases. Typical slotted liner completions further complicate analysis as fluids are able to flow in the liner/openhole annulus. Analysis of PNC log data provides two good qualitative indicators of formation permeability. The first technique is derived from the difference of the formation sigma response before and after injecting a high-capture cross-section borax solution. The second technique uses the difference of the formation sigma response and the formation porosity measured while injecting the formation with crude or seawater. Further analysis of PNC log runs shows that the two techniques closely correlate with production flow profiles under solution gas-oil ratio (GOR) conditions. These two techniques in combination with conventional production logging measurements of temperature, capacitance, pressure, and spinner improve flow profile results. PNC results can be combined with temperature and pressure data in the absence of valid spinner data to provide an approximate flow profile. These techniques have been used to successfully determine profiles in both cemented and slotted liner completions with GORs in excess of 15,000 scf/bbl.

Predicting hospital visits from geo-tagged Internet search logs

    PubMed Central

    Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H.

    2016-01-01

    The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare-seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user's future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources, a crucial prerequisite for securing healthcare access for everyone in the days to come. PMID:27570641
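The modeling step in the record above, learning to predict a future facility visit from search-log text, can be sketched generically with scikit-learn. The queries, labels, and feature choices below are entirely invented stand-ins; the cited study's actual features, location signals, and model are not reproduced here.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy stand-ins: concatenated search queries per user, and whether a
    # hospital visit followed within some window (labels are invented).
    queries = [
        "knee pain urgent care near me",
        "weather tomorrow soccer scores",
        "er wait times chest pain",
        "cheap flights to denver",
    ]
    visited = [1, 0, 1, 0]

    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=1),   # text features from queries
        LogisticRegression(max_iter=1000),                # probability of a future visit
    )
    model.fit(queries, visited)

    # Predicted probability of a visit for a new, unseen query string.
    print(model.predict_proba(["back pain clinic hours"])[0, 1])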
A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    DTIC Science & Technology

    2010-06-01

    from SANS Whitepaper - "... Detecting Attacks on Web Applications from Log Files" #look for image tags type=Single continue=TakeNext ptype=RegExp...shellcmd /home/user/sec-2.5.3/common/syslogclient "... Synthetic : " "$2|$1|xss detected in image tag: $3" #send the raw log type=Single ptype=RegExp...Expressions taken from SANS Whitepaper - "... Detecting Attacks on Web Applications from Log Files" #look for image tags type=Single continue=TakeNext

Selecting Aquifer Wells for Planned Gyroscopic Logging

    SciTech Connect

    Rohe, Michael James; Studley, Gregory Wayne

    2002-04-01

    Understanding the configuration of the eastern Snake River Plain aquifer's water table is made difficult, in part, due to borehole deviation in aquifer wells. A borehole has deviation if it is not vertical or straight. Deviation impairs the analysis of water table elevation measurements because it results in measurements that are greater than the true distance from the top of the well to the water table. Conceptual models of the water table configuration are important to environmental management decision-making at the INEEL; these models are based on measurements of depth to the water table taken from aquifer wells at or near the INEEL. When accurate data on the amount of deviation in any given borehole is acquired, measurements of depth-to-water can be adjusted to reflect the true depth, so that more accurate conceptual models can be developed. Collection of additional borehole deviation data with gyroscopic logging is planned for selected wells to further our confidence in the quality of water level measurements. Selection of wells for the planned logging is based on qualitative and quantitative screening criteria. An existing data set from magnetic deviation logs was useful in establishing these criteria; however, magnetic deviation logs are considered less accurate than gyroscopic deviation logs under certain conditions. Population distributions for 128 aquifer wells with magnetic deviation data were used to establish three quantitative screening thresholds. Qualitative criteria consisted of administrative controls, accessibility issues, and drilling methods.
    Qualitative criteria eliminated all but 116 of the 337 aquifer wells, in the vicinity of the INEEL, that were initially examined in this screening effort. Of these, 72 have associated magnetic deviation data; 44 do not. Twenty-five (25) of the 72 wells with magnetic deviation data have deviation greater than one of the three quantitative screening thresholds. These 25 are recommended for the planned gyroscopic borehole deviation…

Impacts of Intensive Logging on the Trophic Organisation of Ant Communities in a Biodiversity Hotspot

    PubMed Central

    Woodcock, Paul; Edwards, David P.; Newton, Rob J.; Vun Khen, Chey; Bottrell, Simon H.; Hamer, Keith C.

    2013-01-01

    Trophic organisation defines the flow of energy through ecosystems and is a key component of community structure. Widespread and intensifying anthropogenic disturbance threatens to disrupt trophic organisation by altering species composition and relative abundances and by driving shifts in the trophic ecology of species that persist in disturbed ecosystems. We examined how intensive disturbance caused by selective logging affects trophic organisation in the biodiversity hotspot of Sabah, Borneo. Using stable nitrogen isotopes, we quantified the positions in the food web of 159 leaf-litter ant species in unlogged and logged rainforest and tested four predictions: (i) there is a negative relationship between the trophic position of a species in unlogged forest and its change in abundance following logging, (ii) the trophic positions of species are altered by logging, (iii) disturbance alters the frequency distribution of trophic positions within the ant assemblage, and (iv) disturbance reduces food chain length. We found that ant abundance was 30% lower in logged forest than in unlogged forest but changes in abundance of individual species were not related to trophic position, providing no support for prediction (i). However, trophic positions of individual species were significantly higher in logged forest, supporting prediction (ii). Consequently, the frequency distribution of trophic positions differed significantly between unlogged and logged forest, supporting prediction (iii), and food chains were 0.2 trophic levels longer in logged forest, the opposite of prediction (iv). Our results demonstrate that disturbance can alter trophic organisation even without trophically-biased changes in community composition.
    Nonetheless, the absence of any reduction in food chain length in logged forest suggests that species-rich arthropod food webs do not experience trophic downgrading or a related collapse in trophic organisation despite the disturbance caused by logging…

Impacts of intensive logging on the trophic organisation of ant communities in a biodiversity hotspot.

    PubMed

    Woodcock, Paul; Edwards, David P; Newton, Rob J; Vun Khen, Chey; Bottrell, Simon H; Hamer, Keith C

    2013-01-01

    Trophic organisation defines the flow of energy through ecosystems and is a key component of community structure. Widespread and intensifying anthropogenic disturbance threatens to disrupt trophic organisation by altering species composition and relative abundances and by driving shifts in the trophic ecology of species that persist in disturbed ecosystems. We examined how intensive disturbance caused by selective logging affects trophic organisation in the biodiversity hotspot of Sabah, Borneo. Using stable nitrogen isotopes, we quantified the positions in the food web of 159 leaf-litter ant species in unlogged and logged rainforest and tested four predictions: (i) there is a negative relationship between the trophic position of a species in unlogged forest and its change in abundance following logging, (ii) the trophic positions of species are altered by logging, (iii) disturbance alters the frequency distribution of trophic positions within the ant assemblage, and (iv) disturbance reduces food chain length. We found that ant abundance was 30% lower in logged forest than in unlogged forest but changes in abundance of individual species were not related to trophic position, providing no support for prediction (i). However, trophic positions of individual species were significantly higher in logged forest, supporting prediction (ii). Consequently, the frequency distribution of trophic positions differed significantly between unlogged and logged forest, supporting prediction (iii), and food chains were 0.2 trophic levels longer in logged forest, the opposite of prediction (iv). Our results demonstrate that disturbance can alter trophic organisation even without trophically-biased changes in community composition.
    Nonetheless, the absence of any reduction in food chain length in logged forest suggests that species-rich arthropod food webs do not experience trophic downgrading or a related collapse in trophic organisation despite the disturbance caused by logging…

The X-ray log N-log S relation. [background radiation in extragalactic media]

    NASA Technical Reports Server (NTRS)

    Boldt, Elihu

    1989-01-01

    Results from various surveys are reviewed as regards X-ray source counts at high galactic latitudes and the luminosity functions determined for extragalactic sources. Constraints on the associated log N-log S relation provided by the extragalactic X-ray background are emphasized in terms of its spatial fluctuations and spectrum as well as absolute flux level. The large number of sources required for this background suggests that there is not a sharp boundary in the redshift distribution of visible matter.

Empirical likelihood-based tests for stochastic ordering

    PubMed Central

    Barmi, Hammou El; McKeague, Ian W.

    2013-01-01

    This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the "decline and fall" phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142
Complex picture for likelihood of ENSO-driven flood hazard

    NASA Astrophysics Data System (ADS)

    Emerton, R.; Cloke, H. L.; Stephens, E. M.; Zsoter, E.; Woolnough, S. J.; Pappenberger, F.

    2017-03-01

    El Niño and La Niña events, the extremes of ENSO climate variability, influence river flow and flooding at the global scale. Estimates of the historical probability of extreme (high or low) precipitation are used to provide vital information on the likelihood of adverse impacts during extreme ENSO events. However, the nonlinearity between precipitation and flood magnitude motivates the need for estimation of historical probabilities using analysis of hydrological data sets. Here, this analysis is undertaken using the ERA-20CM-R river flow reconstruction for the twentieth century. Our results show that the likelihood of increased or decreased flood hazard during ENSO events is much more complex than is often perceived and reported; probabilities vary greatly across the globe, with large uncertainties inherent in the data and clear differences when comparing the hydrological analysis to precipitation.

A likelihood approach to calculating risk support intervals

    SciTech Connect

    Leal, S.M.; Ott, J.

    1994-05-01

    Genetic risks are usually computed under the assumption that genetic parameters, such as the recombination fraction, are known without error. Uncertainty in the estimates of these parameters must translate into uncertainty regarding the risk. To allow for uncertainties in parameter values, one may employ Bayesian techniques or, in a maximum-likelihood framework, construct a support interval (SI) for the risk. Here the authors have implemented the latter approach. The SI for the risk is based on the SIs of parameters involved in the pedigree likelihood. As an empirical example, the SI for the risk was calculated for probands who are members of chronic spinal muscular atrophy kindreds. In order to evaluate the accuracy of a risk in genetic counseling situations, the authors advocate that, in addition to a point estimate, an SI for the risk should be calculated. 16 refs., 1 fig., 1 tab.
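A support interval of the kind described in the record above is conventionally read off the log-likelihood (or LOD) curve: it collects all parameter values whose log10-likelihood lies within a fixed number of units of the maximum. The sketch below is generic; the grid, the toy binomial recombination likelihood, and the 1-unit drop are illustrative assumptions, not the cited pedigree likelihood.

    import numpy as np

    def support_interval(param_grid, loglik10, drop=1.0):
        """Return the range of parameter values whose log10-likelihood lies
        within `drop` units of the maximum (e.g. a 1-unit support interval)."""
        loglik10 = np.asarray(loglik10)
        keep = loglik10 >= loglik10.max() - drop
        return param_grid[keep].min(), param_grid[keep].max()

    if __name__ == "__main__":
        # Illustrative binomial likelihood: 3 recombinants observed in 20 meioses,
        # log10-likelihood as a function of the recombination fraction theta.
        theta = np.linspace(0.001, 0.5, 500)
        loglik10 = 3 * np.log10(theta) + 17 * np.log10(1 - theta)
        print(support_interval(theta, loglik10, drop=1.0))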
Corrected profile likelihood confidence interval for binomial paired incomplete data.

    PubMed

    Pradhan, Vivek; Menon, Sandeep; Das, Ujjwal

    2013-01-01

    Clinical trials often use paired binomial data as their clinical endpoint. The confidence interval is frequently used to estimate the treatment performance. Tang et al. (2009) have proposed exact and approximate unconditional methods for constructing a confidence interval in the presence of incomplete paired binary data. The approach proposed by Tang et al. can be overly conservative, with large expected confidence interval width (ECIW), in some situations. We propose a profile likelihood-based method with a Jeffreys' prior correction to construct the confidence interval. This approach generates confidence intervals with much better coverage probability and shorter ECIWs. The performance of the method, along with the corrections, is demonstrated through extensive simulation. Finally, three real-world data sets are analyzed by all the methods. Statistical Analysis System (SAS) code to execute the profile likelihood-based methods is also presented.
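Profile likelihood confidence intervals of the kind proposed above invert a likelihood-ratio cutoff: the interval contains every parameter value whose profiled log-likelihood lies within chi-square(1, level)/2 of the maximum. The sketch below is a deliberately simplified single-proportion version; the cited method handles paired, incomplete data and adds a Jeffreys' prior correction, neither of which is reproduced here.

    import numpy as np
    from scipy.stats import chi2

    def profile_likelihood_ci(successes, trials, level=0.95):
        """Likelihood-ratio confidence interval for a binomial proportion p:
        all p whose log-likelihood is within chi2_{1,level}/2 of the maximum."""
        grid = np.linspace(1e-6, 1 - 1e-6, 100_000)
        loglik = successes * np.log(grid) + (trials - successes) * np.log(1 - grid)
        cutoff = loglik.max() - chi2.ppf(level, df=1) / 2.0
        inside = grid[loglik >= cutoff]
        return inside.min(), inside.max()

    if __name__ == "__main__":
        # Approximately (0.17, 0.57) for 7 successes in 20 trials.
        print(profile_likelihood_ci(7, 20))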