Science.gov

Sample records for 2 log likelihood

  1. Transfer Entropy as a Log-Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
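
    Because the abstract's central claim is an identity between a plug-in transfer-entropy estimate and a log-likelihood-ratio statistic, a small numerical sketch may help. The following is illustrative only (not the authors' code): it computes the history-length-1 plug-in transfer entropy for two binary series; per the abstract, 2N times this estimate is the likelihood-ratio statistic for the null of zero transfer entropy, here asymptotically χ2 with 2 degrees of freedom. Function names and the simulated data are my own.

```python
import numpy as np
from scipy.stats import chi2

def transfer_entropy_yx(x, y):
    """Plug-in estimate (in nats) of transfer entropy from y to x, history length 1."""
    counts = np.zeros((2, 2, 2))                 # indexed by (x_{t+1}, x_t, y_t)
    for xn, xp, yp in zip(x[1:], x[:-1], y[:-1]):
        counts[xn, xp, yp] += 1
    n = counts.sum()
    p = counts / n
    p_xp_yp = p.sum(axis=0)                      # p(x_t, y_t)
    p_xn_xp = p.sum(axis=2)                      # p(x_{t+1}, x_t)
    p_xp = p.sum(axis=(0, 2))                    # p(x_t)
    te = 0.0
    for xn in range(2):
        for xp in range(2):
            for yp in range(2):
                if p[xn, xp, yp] > 0:
                    te += p[xn, xp, yp] * np.log(
                        (p[xn, xp, yp] / p_xp_yp[xp, yp]) / (p_xn_xp[xn, xp] / p_xp[xp]))
    return te, int(n)

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 5000)
x = np.roll(y, 1)                                # x "reads" y with a one-step lag
te, n = transfer_entropy_yx(x, y)
print(te, 1 - chi2.cdf(2 * n * te, df=2))        # large statistic, tiny p-value
```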

  2. Transfer entropy as a log-likelihood ratio.

    PubMed

    Barnett, Lionel; Bossomaier, Terry

    2012-09-28

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.

  3. Fast inference in generalized linear models via expected log-likelihoods.

    PubMed

    Ramirez, Alexandro D; Paninski, Liam

    2014-04-01

    Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting "expected log-likelihood" can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally-challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina.
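
    The computational savings described here are easiest to see for a canonical Poisson GLM, where the costly term in the exact log-likelihood is a sum of exponentials over all observations. A minimal sketch follows, under the assumption of Gaussian covariates (so that E[exp(x·w)] has the closed form exp(μ·w + w'Σw/2)); it is an illustration of the idea, not the authors' implementation, and the names and simulated data are mine.

```python
import numpy as np

def exact_loglik(w, X, y):
    """Exact Poisson-GLM log-likelihood (up to a constant in y)."""
    eta = X @ w
    return y @ eta - np.exp(eta).sum()           # the exp-sum is the expensive part

def expected_loglik(w, X, y, mu, Sigma):
    """Expected log-likelihood: the exp-sum is replaced by N * E_x[exp(x.w)]."""
    eta = X @ w
    return y @ eta - len(y) * np.exp(mu @ w + 0.5 * w @ Sigma @ w)

rng = np.random.default_rng(1)
d, n = 5, 20000
mu, Sigma = np.zeros(d), 0.1 * np.eye(d)
X = rng.multivariate_normal(mu, Sigma, size=n)
w = 0.5 * rng.normal(size=d)
y = rng.poisson(np.exp(X @ w))
print(exact_loglik(w, X, y), expected_loglik(w, X, y, mu, Sigma))   # close for large n
```

    Note that y @ eta depends on the data only through the sufficient statistic X.T @ y, which can be precomputed once, so each evaluation of the expected log-likelihood avoids the O(n) pass through the nonlinearity.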

  4. Fast inference in generalized linear models via expected log-likelihoods

    PubMed Central

    Ramirez, Alexandro D.; Paninski, Liam

    2015-01-01

    Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting “expected log-likelihood” can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally-challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina. PMID:23832289

  5. Posterior propriety for hierarchical models with log-likelihoods that have norm bounds

    SciTech Connect

    Michalak, Sarah E.; Morris, Carl N.

    2015-07-17

    Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, a property that is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).

  6. Posterior propriety for hierarchical models with log-likelihoods that have norm bounds

    DOE PAGES

    Michalak, Sarah E.; Morris, Carl N.

    2015-07-17

    Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, a property that is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).

  7. MICCLLR: Multiple-Instance Learning Using Class Conditional Log Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    El-Manzalawy, Yasser; Honavar, Vasant

    Multiple-instance learning (MIL) is a generalization of the supervised learning problem where each training observation is a labeled bag of unlabeled instances. Several supervised learning algorithms have been successfully adapted for the multiple-instance learning settings. We explore the adaptation of the Naive Bayes (NB) classifier and the utilization of its sufficient statistics for developing novel multiple-instance learning methods. Specifically, we introduce MICCLLR (multiple-instance class conditional log likelihood ratio), a method for mapping each bag of instances as a single meta-instance using class conditional log likelihood ratio statistics such that any supervised base classifier can be applied to the meta-data. The results of our experiments with MICCLLR using different base classifiers suggest that no single base classifier consistently outperforms other base classifiers on all data sets. We show that a substantial improvement in performance is obtained using an ensemble of MICCLLR classifiers trained using different base learners. We also show that an extra gain in classification accuracy is obtained by applying AdaBoost.M1 to weak MICCLLR classifiers. Overall, our results suggest that the predictive performance of the three proposed variants of MICCLLR are competitive to some of the state-of-the-art MIL methods.
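
    As a rough illustration of the bag-to-meta-instance mapping described above, the sketch below is one plausible reading of the abstract rather than the published algorithm: per-feature class-conditional probabilities are estimated with Laplace smoothing (instances inheriting their bag's label), and each bag is summarized by the average per-feature log-likelihood ratio of its instances. All names are hypothetical.

```python
import numpy as np

def fit_feature_probs(instances, bag_labels, bag_index):
    """instances: (n_inst, n_feat) binary array; bag_index maps each instance to its bag."""
    inst_labels = bag_labels[bag_index]              # instances inherit the bag label
    pos, neg = instances[inst_labels == 1], instances[inst_labels == 0]
    p_pos = (pos.sum(axis=0) + 1.0) / (len(pos) + 2.0)   # Laplace-smoothed P(feature | +)
    p_neg = (neg.sum(axis=0) + 1.0) / (len(neg) + 2.0)   # Laplace-smoothed P(feature | -)
    return p_pos, p_neg

def bag_to_meta_instance(bag, p_pos, p_neg):
    """Average class-conditional log-likelihood ratio over the bag's instances."""
    llr = bag * np.log(p_pos / p_neg) + (1 - bag) * np.log((1 - p_pos) / (1 - p_neg))
    return llr.mean(axis=0)                          # one meta-instance per bag
```

    Any single-instance classifier can then be trained on the resulting meta-instances and bag labels, which is the property the ensemble and boosting variants in the abstract exploit.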

  2. One-room log cabin (right), log root cellar (center), two-room ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. One-room log cabin (right), log root cellar (center), two-room log cabin (left), and post-and-beam garage (background). View to southwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  9. A Planar Approximation for the Least Reliable Bit Log-likelihood Ratio of 8-PSK Modulation

    NASA Technical Reports Server (NTRS)

    Thesling, William H.; Vanderaar, Mark J.

    1994-01-01

    The optimum decoding of component codes in block coded modulation (BCM) schemes requires the use of the log-likelihood ratio (LLR) as the signal metric. An approximation to the LLR for the least reliable bit (LRB) in an 8-PSK modulation based on planar equations with fixed-point arithmetic is developed that is both accurate and easily realizable for practical BCM schemes. Through an error power analysis and an example simulation it is shown that the approximation results in 0.06 dB of degradation over the exact expression at an E(sub s)/N(sub o) of 10 dB. It is also shown that the approximation can be realized in combinatorial logic using roughly 7300 transistors. This compares favorably to a look-up table approach in typical systems.
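
    For context, the quantity being approximated is the exact per-bit LLR of 8-PSK over an AWGN channel, which requires exponentials and logarithms that are costly in hardware. The sketch below computes that exact reference LLR; the Gray bit mapping and variable names are assumptions of mine, not necessarily those used in the paper, and the planar approximation itself is not reproduced here.

```python
import numpy as np

GRAY = np.array([0b000, 0b001, 0b011, 0b010, 0b110, 0b111, 0b101, 0b100])
CONST = np.exp(2j * np.pi * np.arange(8) / 8)        # unit-energy 8-PSK constellation

def exact_bit_llr(r, n0, bit):
    """Exact LLR of bit position `bit` (0 = LSB) for a received complex sample r."""
    metrics = np.exp(-np.abs(r - CONST) ** 2 / n0)
    ones = (GRAY >> bit) & 1
    return np.log(metrics[ones == 0].sum() / metrics[ones == 1].sum())

# Es/N0 = 10 dB with unit symbol energy, matching the operating point in the abstract
n0 = 10.0 ** (-10.0 / 10.0)
rng = np.random.default_rng(7)
r = CONST[3] + (rng.normal() + 1j * rng.normal()) * np.sqrt(n0 / 2)
print([exact_bit_llr(r, n0, b) for b in range(3)])
```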

  10. A log-likelihood-gain intensity target for crystallographic phasing that accounts for experimental error.

    PubMed

    Read, Randy J; McCoy, Airlie J

    2016-03-01

    The crystallographic diffraction experiment measures Bragg intensities; crystallographic electron-density maps and other crystallographic calculations in phasing require structure-factor amplitudes. If data were measured with no errors, the structure-factor amplitudes would be trivially proportional to the square roots of the intensities. When the experimental errors are large, and especially when random errors yield negative net intensities, the conversion of intensities and their error estimates into amplitudes and associated error estimates becomes nontrivial. Although this problem has been addressed intermittently in the history of crystallographic phasing, current approaches to accounting for experimental errors in macromolecular crystallography have numerous significant defects. These have been addressed with the formulation of LLGI, a log-likelihood-gain function in terms of the Bragg intensities and their associated experimental error estimates. LLGI has the correct asymptotic behaviour for data with large experimental error, appropriately downweighting these reflections without introducing bias. LLGI abrogates the need for the conversion of intensity data to amplitudes, which is usually performed with the French and Wilson method [French & Wilson (1978), Acta Cryst. A35, 517-525], wherever likelihood target functions are required. It has general applicability for a wide variety of algorithms in macromolecular crystallography, including scaling, characterizing anisotropy and translational noncrystallographic symmetry, detecting outliers, experimental phasing, molecular replacement and refinement. Because it is impossible to reliably recover the original intensity data from amplitudes, it is suggested that crystallographers should always deposit the intensity data in the Protein Data Bank. PMID:26960124

  11. A log-likelihood-gain intensity target for crystallographic phasing that accounts for experimental error

    PubMed Central

    Read, Randy J.; McCoy, Airlie J.

    2016-01-01

    The crystallographic diffraction experiment measures Bragg intensities; crystallographic electron-density maps and other crystallographic calculations in phasing require structure-factor amplitudes. If data were measured with no errors, the structure-factor amplitudes would be trivially proportional to the square roots of the intensities. When the experimental errors are large, and especially when random errors yield negative net intensities, the conversion of intensities and their error estimates into amplitudes and associated error estimates becomes nontrivial. Although this problem has been addressed intermittently in the history of crystallographic phasing, current approaches to accounting for experimental errors in macromolecular crystallography have numerous significant defects. These have been addressed with the formulation of LLGI, a log-likelihood-gain function in terms of the Bragg intensities and their associated experimental error estimates. LLGI has the correct asymptotic behaviour for data with large experimental error, appropriately downweighting these reflections without introducing bias. LLGI abrogates the need for the conversion of intensity data to amplitudes, which is usually performed with the French and Wilson method [French & Wilson (1978), Acta Cryst. A35, 517–525], wherever likelihood target functions are required. It has general applicability for a wide variety of algorithms in macromolecular crystallography, including scaling, characterizing anisotropy and translational noncrystallographic symmetry, detecting outliers, experimental phasing, molecular replacement and refinement. Because it is impossible to reliably recover the original intensity data from amplitudes, it is suggested that crystallographers should always deposit the intensity data in the Protein Data Bank. PMID:26960124

  12. Limit Distribution Theory for Maximum Likelihood Estimation of a Log-Concave Density

    PubMed Central

    Balabdaoui, Fadoua; Rufibach, Kaspar; Wellner, Jon A.

    2009-01-01

    We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, i.e. a density of the form f0 = exp ϕ0 where ϕ0 is a concave function on ℝ. Existence, form, characterizations and uniform rates of convergence of the MLE are given by Rufibach (2006) and Dümbgen and Rufibach (2007). The characterization of the log–concave MLE in terms of distribution functions is the same (up to sign) as the characterization of the least squares estimator of a convex density on [0, ∞) as studied by Groeneboom, Jongbloed and Wellner (2001b). We use this connection to show that the limiting distributions of the MLE and its derivative are, under comparable smoothness assumptions, the same (up to sign) as in the convex density estimation problem. In particular, changing the smoothness assumptions of Groeneboom, Jongbloed and Wellner (2001b) slightly by allowing some higher derivatives to vanish at the point of interest, we find that the pointwise limiting distributions depend on the second and third derivatives at 0 of Hk, the “lower invelope” of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of ϕ0 = log f0 at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode M(f0) and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values. PMID:19881896
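
    For orientation, the estimator whose limit theory is developed here can be stated as a concave optimization problem; this is a standard formulation consistent with the abstract, written out for readability:

```latex
% Log-concave MLE: maximize the (adjusted) log-likelihood over concave functions.
\hat{\varphi}_n \;=\; \operatorname*{arg\,max}_{\varphi\ \text{concave}}
  \left\{ \frac{1}{n}\sum_{i=1}^{n}\varphi(X_i) \;-\; \int_{\mathbb{R}} e^{\varphi(x)}\,dx \right\},
\qquad \hat{f}_n = e^{\hat{\varphi}_n}
% The integral term guarantees that the maximizer integrates to one,
% so no explicit normalization constraint is needed.
```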

  13. Identification of contemporary selection signatures using composite log likelihood and their associations with marbling score in Korean cattle.

    PubMed

    Ryu, Jihye; Lee, Chaeyoung

    2014-12-01

    Positive selection not only increases beneficial allele frequency but also causes augmentation of allele frequencies of sequence variants in close proximity. Signals for positive selection were detected by the statistical differences in subsequent allele frequencies. To identify selection signatures in Korean cattle, we applied a composite log-likelihood (CLL)-based method, which calculates a composite likelihood of the allelic frequencies observed across sliding windows of five adjacent loci and compares the value with the critical statistic estimated by 50,000 permutations. Data for a total of 11,799 nucleotide polymorphisms were used with 71 Korean cattle and 209 foreign beef cattle. As a result, 147 signals were identified for Korean cattle based on CLL estimates (P < 0.01). The signals might be candidate genetic factors for meat quality by which the Korean cattle have been selected. Further genetic association analysis with 41 intragenic variants in the selection signatures with the greatest CLL for each chromosome revealed that marbling score was associated with five variants. Intensive association studies with all the selection signatures identified in this study are required to exclude signals associated with other phenotypes or signals falsely detected and thus to identify genetic markers for meat quality.

  14. Approximate and Pseudo-Likelihood Analysis for Logistic Regression Using External Validation Data to Model Log Exposure

    PubMed Central

    KUPPER, Lawrence L.

    2012-01-01

    A common goal in environmental epidemiologic studies is to undertake logistic regression modeling to associate a continuous measure of exposure with binary disease status, adjusting for covariates. A frequent complication is that exposure may only be measurable indirectly, through a collection of subject-specific variables assumed associated with it. Motivated by a specific study to investigate the association between lung function and exposure to metal working fluids, we focus on a multiplicative-lognormal structural measurement error scenario and approaches to address it when external validation data are available. Conceptually, we emphasize the case in which true untransformed exposure is of interest in modeling disease status, but measurement error is additive on the log scale and thus multiplicative on the raw scale. Methodologically, we favor a pseudo-likelihood (PL) approach that exhibits fewer computational problems than direct full maximum likelihood (ML) yet maintains consistency under the assumed models without necessitating small exposure effects and/or small measurement error assumptions. Such assumptions are required by computationally convenient alternative methods like regression calibration (RC) and ML based on probit approximations. We summarize simulations demonstrating considerable potential for bias in the latter two approaches, while supporting the use of PL across a variety of scenarios. We also provide accessible strategies for obtaining adjusted standard errors to accompany RC and PL estimates. PMID:24027381

  15. 2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

  16. log(MPl/m3/2)

    SciTech Connect

    Loaiza-Brito, Oscar; Martin, Johannes; Nilles, Hans Peter; Ratz, Michael

    2005-12-02

    Flux compactifications of string theory seem to require the presence of a fine-tuned constant in the superpotential. We discuss a scheme where this constant is replaced by a dynamical quantity which we argue to be a 'continuous Chern-Simons term'. In such a scheme, the gaugino condensate generates the hierarchically small scale of supersymmetry breakdown rather than adjusting its size to a constant. A crucial ingredient is the appearance of the hierarchically small quantity exp(-) which corresponds to the scale of gaugino condensation. Under rather general circumstances, this leads to a scenario of moduli stabilization, which is endowed with a hierarchy between the mass of the lightest modulus, the gravitino mass and the scale of the soft terms, m_modulus ≈ 4π² m_3/2 ≈ (4π²)² m_soft. The 'little hierarchy' is given by the logarithm of the ratio of the Planck scale and the gravitino mass, log(M_Pl/m_3/2) ≈ 4π². This exhibits a new mediation scheme of supersymmetry breakdown, called mirage mediation. We highlight the special properties of the scheme, and their consequences for phenomenology and cosmology.

  17. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  18. Experiment 2030. EE-2 Temperature Log and Downhole Water Sample

    SciTech Connect

    Grigsby, Charles O.

    1983-07-29

    A temperature log and downhole water sample run were conducted in EE-2 on July 13, 1983. The temperature log was taken to show any changes which had occurred in the fracture-to-wellbore intersections as a result of the Experiment 2020 pumping and to locate fluid entries for taking the water sample. The water sample was requested primarily to determine the arsenic concentration in EE-2 fluids (see memo from C. Grigsby, June 28, 1983, concerning arsenic in EE-3 samples). The temperature log was run using the thermistor in the ESS-6 water sampler.

  19. Core log: Valles caldera No. 2A, New Mexico

    SciTech Connect

    Starguist, V.L.

    1988-01-01

    Scientific core hole VC-2A was drilled into the western ring-fracture zone at Sulphur Springs in the Valles caldera, New Mexico. VC-2A, the second scientific core hole in the caldera, was cored through a faulted and brecciated sequence of intracauldron tuffs and volcaniclastic rocks to a depth of 528 m. As of November 1, 1986, the unequilibrated bottom-hole temperature was 212°C. The rocks penetrated are intensely altered and host sub-ore grade stockwork molybdenite mineralization between 25 and 125 m. This report contains a detailed core log to aid researchers in their studies of the Valles caldera magma hydrothermal system. 3 refs., 2 figs.

  20. A note on G2 log-aesthetic curves

    NASA Astrophysics Data System (ADS)

    Wo, Mei Seen; Gobithaasan R., U.; Miura, Kenjiro T.; Abbas, Muhammad

    2015-12-01

    Log-aesthetic curve (LAC) is a curve family composed of transcendental curves that includes the logarithmic spiral, clothoid, circle involute and Nielsen's spiral. These curves have linear logarithmic curvature graphs (LCGs) and are highly aesthetic. In order to implement G2 LAC in industrial design successfully, one needs guidance on whether a unique LAC segment exists that satisfies given G2 Hermite data. This paper shows the existence and uniqueness of the solution for a single-segment G2 LAC. A LAC equation that incorporates both start and end curvatures, and the end tangential angle, is first derived. Then, the end points of the LAC segments are calculated using the derived LAC equation, which is also a representation of the solution region of LAC given a set of G2 Hermite data. The derived function is investigated for its existence and uniqueness. It is shown that the solution region is a curve that does not self-intersect anywhere; thus the solution of a single-segment G2 LAC is always unique.

  1. A complete X-ray sample of the high latitude sky from HEAO-1 A-2: log N-log S and luminosity functions

    NASA Technical Reports Server (NTRS)

    Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.

    1981-01-01

    An experiment was performed in which a complete X-ray survey was made of the 8.2 steradians of the sky at galactic latitudes where the absolute value of b is greater than 20 deg, down to a limiting sensitivity of 3.1 x 10(exp -11) ergs/sq cm sec in the 2-10 keV band. Of the 85 detected sources, 17 were identified with galactic objects, 61 were identified with extragalactic objects, and 7 remain unidentified. The log N - log S relation for the non-galactic objects is well fit by the Euclidean relationship. The X-ray spectra of these objects were used to construct log N - log S in physical units. The complete sample of identified sources was used to construct X-ray luminosity functions, using the absolute maximum likelihood method, for clusters of galaxies and active galactic nuclei.
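
    The Euclidean expectation invoked here is worth spelling out. For sources of luminosity L distributed uniformly in flat space, a source is detectable above flux S out to a distance d_max with S = L / (4π d_max²), so the number of detectable sources scales with the enclosed volume; a short derivation under these assumptions:

```latex
% Euclidean log N - log S relation for a uniform source population
N(>S) \;\propto\; d_{\max}^{3} \;=\; \left(\frac{L}{4\pi S}\right)^{3/2}
      \;\propto\; S^{-3/2},
\qquad\Longrightarrow\qquad
\log N(>S) = \text{const} - \tfrac{3}{2}\,\log S
```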

  2. Maximum likelihood.

    PubMed

    Yang, Shuying; De Angelis, Daniela

    2013-01-01

    The maximum likelihood method is a popular statistical inferential procedure widely used in many areas to obtain the estimates of the unknown parameters of a population of interest. This chapter gives a brief description of the important concepts underlying the maximum likelihood method, the definition of the key components, the basic theory of the method, and the properties of the resulting estimates. Confidence intervals and the likelihood ratio test are also introduced. Finally, a few examples of applications are given to illustrate how to derive maximum likelihood estimates in practice. A list of references to relevant papers and software for a further understanding of the method and its implementation is provided.
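
    A compact worked example of the chapter's core ideas (my own illustration, not taken from the chapter): the maximum likelihood estimate of a Poisson rate and a 95% confidence interval obtained by inverting the likelihood ratio test against the χ2(1) critical value.

```python
import numpy as np
from scipy.stats import chi2
from scipy.optimize import brentq

rng = np.random.default_rng(42)
y = rng.poisson(3.0, size=200)                   # simulated counts, true rate 3.0

def loglik(lam):
    return np.sum(y * np.log(lam) - lam)         # Poisson log-likelihood, dropping log(y!)

lam_hat = y.mean()                               # closed-form Poisson MLE
crit = chi2.ppf(0.95, df=1)                      # likelihood-ratio cutoff

def lr_excess(lam):
    return 2.0 * (loglik(lam_hat) - loglik(lam)) - crit

lower = brentq(lr_excess, 1e-6, lam_hat)         # roots of the inverted LR test
upper = brentq(lr_excess, lam_hat, 10 * lam_hat)
print(lam_hat, (lower, upper))
```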

  3. 2-D impulse noise suppression by recursive gaussian maximum likelihood estimation.

    PubMed

    Chen, Yang; Yang, Jian; Shu, Huazhong; Shi, Luyao; Wu, Jiasong; Luo, Limin; Coatrieux, Jean-Louis; Toumoulin, Christine

    2014-01-01

    An effective approach termed Recursive Gaussian Maximum Likelihood Estimation (RGMLE) is developed in this paper to suppress 2-D impulse noise. Two algorithms, termed RGMLE-C and RGMLE-CS, are derived by using spatially-adaptive variances, which are respectively estimated based on certainty and joint certainty & similarity information. To give a reliable implementation of the RGMLE-C and RGMLE-CS algorithms, a novel recursion stopping strategy is proposed by evaluating the estimation error of uncorrupted pixels. Numerical experiments on different noise densities show that the proposed two algorithms can lead to significantly better results than some typical median-type filters. Efficient implementation is also realized via GPU (Graphics Processing Unit)-based parallelization techniques.

  4. Geomechanical Reservoir Characterization for Prediction of the Likelihood of Faulting Induced by CO2 Sequestration

    NASA Astrophysics Data System (ADS)

    Chiaramonte, L.; Zoback, M.; Friedmann, J.; Stamp, V.

    2006-12-01

    Geologic sequestration of CO2 has been proposed as a key technology for reducing greenhouse gas in the atmosphere. However, leakage of CO2 from any potential reservoir could seriously compromise the goal of long-term storage. Therefore, understanding the likelihood of leakage is a key step toward the feasibility of this technology. Because it has long been known that fluid injection can lead to fault reactivation, assessing the potential of induced fault slip and leakage of CO2 from any potential storage reservoir is critical prior to injection of large quantities of CO2. We have developed a geomechanical model in advance of a proposed injection experiment at Teapot Dome, WY, in order to study in detail the processes of a CO2 leak through predicted faults. Teapot Dome is an elongated, asymmetrical, basement-cored anticline with a north-northeast axis. It is part of the Salt Creek structural trend, located on the southwestern edge of the Powder River Basin. The anticline is compartmentalized by a series of oblique-slip faults, generally oriented NE-SW, although varying in geometry, displacement, and complexity. In particular, the fault network referred to as S2 shows greater complexity and a range of geometries and azimuths that enhances its potential for leakage. Furthermore, its surface expression shows alkali springs and hydrocarbon samples within the fault veins and gouge. The S2 fault intersects the oil-bearing 2nd Wall Creek at around 650 m, which is close in terms of temperature and pressure to conditions of supercritical CO2 phase injection and oil miscibility. We previously developed a preliminary geomechanical model of the deeper Tensleep Fm., utilizing observations of wellbore breakouts, drilling-induced fractures, leak-off or mini-frac tests, and other available geophysical data. We used this information to evaluate the potential for injection to induce slip on the S1 fault, approximately 2 km south of the present study area. These results are compared and

  5. List-mode likelihood: EM algorithm and image quality estimation demonstrated on 2-D PET.

    PubMed

    Parra, L; Barrett, H H

    1998-04-01

    Using a theory of list-mode maximum-likelihood (ML) source reconstruction presented recently by Barrett et al., this paper formulates a corresponding expectation-maximization (EM) algorithm, as well as a method for estimating noise properties at the ML estimate. List-mode ML is of interest in cases where the dimensionality of the measurement space impedes a binning of the measurement data. It can be advantageous in cases where a better forward model can be obtained by including more measurement coordinates provided by a given detector. Different figures of merit for the detector performance can be computed from the Fisher information matrix (FIM). This paper uses the observed FIM, which requires a single data set, thus, avoiding costly ensemble statistics. The proposed techniques are demonstrated for an idealized two-dimensional (2-D) positron emission tomography (PET) [2-D PET] detector. We compute from simulation data the improved image quality obtained by including the time of flight of the coincident quanta.
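
    The expectation-maximization update that list-mode ML leads to has a compact generic form, sketched below on the assumption that the event-wise system matrix rows a[n, j] (probability that an emission in pixel j produces the attributes recorded for event n) and the pixel sensitivities s[j] are available; this is the standard list-mode ML-EM iteration, not the specific detector model of the paper.

```python
import numpy as np

def list_mode_ml_em(a, s, n_iter=50):
    """a: (n_events, n_pixels) event-wise system matrix; s: (n_pixels,) sensitivities."""
    lam = np.ones(a.shape[1])                    # flat initial emission image
    for _ in range(n_iter):
        expected = a @ lam                       # expected rate for each recorded event
        lam *= (a.T @ (1.0 / expected)) / s      # multiplicative list-mode EM update
    return lam
```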

  6. Likelihood ratio-based integrated personal risk assessment of type 2 diabetes.

    PubMed

    Sato, Noriko; Htun, Nay Chi; Daimon, Makoto; Tamiya, Gen; Kato, Takeo; Kubota, Isao; Ueno, Yoshiyuki; Yamashita, Hidetoshi; Fukao, Akira; Kayama, Takamasa; Muramatsu, Masaaki

    2014-01-01

    To facilitate personalized health care for multifactorial diseases, risks of genetic and clinical/environmental factors should be assessed together for each individual in an integrated fashion. This approach is possible with the likelihood ratio (LR)-based risk assessment system, as this system can incorporate manifold tests. We examined the usefulness of this system for assessing type 2 diabetes (T2D). Our system employed 29 genetic susceptibility variants, body mass index (BMI), and hypertension as risk factors whose LRs can be estimated from openly available T2D association data for the Japanese population. The pretest probability was set at a sex- and age-appropriate population average of diabetes prevalence. The classification performance of our LR-based risk assessment was compared to that of a non-invasive screening test for diabetes called TOPICS (with score based on age, sex, family history, smoking, BMI, and hypertension) using receiver operating characteristic analysis with a community cohort (n = 1263). The area under the receiver operating characteristic curve (AUC) for the LR-based assessment and TOPICS was 0.707 (95% CI 0.665-0.750) and 0.719 (0.675-0.762), respectively. These AUCs were much higher than that of a genetic risk score constructed using the same genetic susceptibility variants, 0.624 (0.574-0.674). The use of ethnically matched LRs is necessary for proper personal risk assessment. In conclusion, although LR-based integrated risk assessment for T2D still requires additional tests that evaluate other factors, such as risks involved in missing heritability, our results indicate the potential usability of the LR-based assessment system and stress the importance of stratified epidemiological investigations in personalized medicine. PMID:25069673
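
    The arithmetic behind this kind of LR-based integration is simple Bayes updating on the odds scale, assuming the test results are conditionally independent. A minimal sketch with placeholder numbers (none of them from the study):

```python
def posttest_probability(pretest_prob, likelihood_ratios):
    """Chain likelihood ratios onto a pretest probability via the odds scale."""
    odds = pretest_prob / (1.0 - pretest_prob)
    for lr in likelihood_ratios:
        odds *= lr                               # assumes conditionally independent tests
    return odds / (1.0 + odds)

# e.g. 8% age/sex-specific prevalence, then LRs for a genotype, a BMI class and
# hypertension status (all illustrative values)
print(posttest_probability(0.08, [1.3, 2.1, 1.8]))
```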

  7. In situ fluid typing and quantification with 1D and 2D NMR logging.

    PubMed

    Sun, Boqin

    2007-05-01

    In situ nuclear magnetic resonance (NMR) fluid typing has recently gained momentum due to data acquisition and inversion algorithm enhancement of NMR logging tools. T2 distributions derived from NMR logging contain information on bulk fluids and pore size distributions. However, the accuracy of fluid typing is greatly overshadowed by the overlap between T2 peaks arising from different fluids with similar apparent T2 relaxation times. Nevertheless, the shapes of T2 distributions from different fluid components are often different and can be predetermined. Inversion with predetermined T2 distributions allows us to perform fluid component decomposition to yield individual fluid volume ratios. Another effective method for in situ fluid typing is two-dimensional (2D) NMR logging, which results in proton population distribution as a function of T2 relaxation time and fluid diffusion coefficient (or T1 relaxation time). Since diffusion coefficients (or T1 relaxation times) for different fluid components can be very different, it is relatively easy to separate oil (especially heavy oil) from water signal in a 2D NMR map and to perform accurate fluid typing. Combining NMR logging with resistivity and/or neutron/density logs provides a third method for in situ fluid typing. We shall describe these techniques with field examples. PMID:17466778

  8. Soy isoflavone intake and the likelihood of ever becoming a mother: the Adventist Health Study-2

    PubMed Central

    Jacobsen, Bjarne K; Jaceldo-Siegl, Karen; Knutsen, Synnøve F; Fan, Jing; Oda, Keiji; Fraser, Gary E

    2014-01-01

    Objectives As little is known about the possible relationship between the intake of phytoestrogens and female fertility, we investigated the relationship between soy isoflavone intake and the risk of nulliparity and nulligravidity. Methods A cross-sectional study of 11,688 North American Adventist women aged 30–50 years old with data regarding childbearing. These women were, as a group, characterized by a high proportion (54%) of vegetarians and a healthy lifestyle with a very low prevalence of smoking and alcohol use. Results The mean isoflavone intake (17.9 mg per day) was very high compared to other Western populations. Only 6% of the women indicated no intake of isoflavones. We found, after adjustment for age, marital status, and educational level, an inverse relationship (P=0.05) between isoflavone intake and the likelihood of ever having become a mother. In women with high (≥40 mg/day) isoflavone intake (12% of this group of women), the adjusted lifetime probability of giving birth to a live child was reduced by approximately 3% (95% CI: 0, 7) compared to women with low (<10 mg/day) intake. No relationships were found between the isoflavone intake and parity or age at first delivery in parous women. A similar inverse relationship (P=0.03) was found between the isoflavone intake and the risk of nulligravidity with a 13% (95% CI: 2, 26) higher risk of never have been pregnant in women with high (≥40 mg/day) isoflavone intake. These relationships were found mainly in women who reported problems becoming pregnant. Conclusion The findings suggest that a high dietary isoflavone intake may have significant impact on fertility. PMID:24741329

  9. Optimization of a Nucleic Acids united-RESidue 2-Point model (NARES-2P) with a maximum-likelihood approach

    SciTech Connect

    He, Yi; Scheraga, Harold A.; Liwo, Adam

    2015-12-28

    Coarse-grained models are useful tools to investigate the structural and thermodynamic properties of biomolecules. They are obtained by merging several atoms into one interaction site. Such simplified models try to capture as much information as possible about the original all-atom representation of the biomolecular system, but the resulting parameters of these coarse-grained force fields still need further optimization. In this paper, a force field optimization method, which is based on maximum-likelihood fitting of the simulated to the experimental conformational ensembles and least-squares fitting of the simulated to the experimental heat-capacity curves, is applied to optimize the Nucleic Acid united-RESidue 2-point (NARES-2P) model for coarse-grained simulations of nucleic acids recently developed in our laboratory. The optimized NARES-2P force field reproduces the structural and thermodynamic data of small DNA molecules much better than the original force field.

  10. Optimization of a Nucleic Acids united-RESidue 2-Point model (NARES-2P) with a maximum-likelihood approach

    NASA Astrophysics Data System (ADS)

    He, Yi; Liwo, Adam; Scheraga, Harold A.

    2015-12-01

    Coarse-grained models are useful tools to investigate the structural and thermodynamic properties of biomolecules. They are obtained by merging several atoms into one interaction site. Such simplified models try to capture as much information as possible about the original all-atom representation of the biomolecular system, but the resulting parameters of these coarse-grained force fields still need further optimization. In this paper, a force field optimization method, which is based on maximum-likelihood fitting of the simulated to the experimental conformational ensembles and least-squares fitting of the simulated to the experimental heat-capacity curves, is applied to optimize the Nucleic Acid united-RESidue 2-point (NARES-2P) model for coarse-grained simulations of nucleic acids recently developed in our laboratory. The optimized NARES-2P force field reproduces the structural and thermodynamic data of small DNA molecules much better than the original force field.

  11. The phylogenetic likelihood library.

    PubMed

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

    We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. By example of two likelihood-based phylogenetic codes we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL).

  12. Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods

    PubMed Central

    McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando

    2010-01-01

    Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper the data is two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown and whether or not the first Lamé parameter, λ, is known the pressure p = λ∇ · u which is present in the plane strain model cannot be measured and is unreliably computed from measured data and can be shown to be an order one quantity in the units kPa. So here we present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20] which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349

  13. Basalt CO2 Sequestration: Using Wireline Logs to Identify Subsurface Continental Flood Basalt Lithofacies

    NASA Astrophysics Data System (ADS)

    Sullivan, E. C.; Finn, S.; Davis, K. N.; Segovia, A. I.

    2010-12-01

    The flows of the Miocene Columbia River Basalt Group (CRBG) of the northwest United States are an important example of reactive flood basalts that are attractive targets for sequestration of anthropogenic carbon dioxide. Brecciated flow tops and dense flow interiors form layered regional aquifer systems in the Columbia Basin that have the potential to sequester gigatons of supercritical CO2 where they contain non-potable water and are at depths of greater than 800 m. The demonstrated chemical reactivity of these continental flood basalts with supercritical CO2 in laboratory experiments suggests that part of the sequestered CO2 will be permanently entombed as carbonate minerals. Here we report on the use of conventional wire-line log data, along with full waveform sonic and resistivity-based image logs, to identify subsurface basalt stratigraphy and lithofacies relevant to CO2 sequestration. We compare borehole data from the 2009 Big Sky Carbon Sequestration Partnership basalt pilot well near Wallula, Washington U.S.A. with regional outcrop analogs to determine patterns for recognizing basalt lithofacies in the subsurface. We examine quick-look techniques recently proposed for hydrocarbon exploration in basalt terranes and show that rescaled shear and compressional sonic log curves, which reflect changes in bulk modulus, appear to provide a robust tool for the identification of subsurface CRBG basalt lithofacies. (Figure: resistivity-based image log of vesicular basalt and fractures from the Wallula basalt pilot well.)

  14. The Likelihood Function and Likelihood Statistics

    NASA Astrophysics Data System (ADS)

    Robinson, Edward L.

    2016-01-01

    The likelihood function is a necessary component of Bayesian statistics but not of frequentist statistics. The likelihood function can, however, serve as the foundation for an attractive variant of frequentist statistics sometimes called likelihood statistics. We will first discuss the definition and meaning of the likelihood function, giving some examples of its use and abuse - most notably in the so-called prosecutor's fallacy. Maximum likelihood estimation is the aspect of likelihood statistics familiar to most people. When data points are known to have Gaussian probability distributions, maximum likelihood parameter estimation leads directly to least-squares estimation. When the data points have non-Gaussian distributions, least-squares estimation is no longer appropriate. We will show how the maximum likelihood principle leads to logical alternatives to least-squares estimation for non-Gaussian distributions, taking the Poisson distribution as an example. The likelihood ratio is the ratio of the likelihoods of, for example, two hypotheses or two parameters. Likelihood ratios can be treated much like un-normalized probability distributions, greatly extending the applicability and utility of likelihood statistics. Likelihood ratios are prone to the same complexities that afflict posterior probability distributions in Bayesian statistics. We will show how meaningful information can be extracted from likelihood ratios by the Laplace approximation, by marginalizing, or by Markov chain Monte Carlo sampling.
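
    The Gaussian-to-least-squares statement in this abstract, and the Poisson alternative it alludes to, can be written down in two lines; this is standard material restated for convenience:

```latex
% Gaussian errors with known sigma: maximizing the likelihood minimizes chi-square,
-\ln L(\theta) = \sum_i \frac{\bigl(y_i - f(x_i;\theta)\bigr)^2}{2\sigma_i^2} + \text{const}
% whereas for Poisson-distributed counts the analogous objective is
-\ln L(\theta) = \sum_i \bigl[\, f(x_i;\theta) - y_i \ln f(x_i;\theta) \,\bigr] + \text{const}
% which replaces least squares when the data are not Gaussian.
```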

  15. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false How soon after I know of the likelihood of... § 3162.2-11 How soon after I know of the likelihood of drainage must I take protective action? (a) You... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...

  16. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false How soon after I know of the likelihood of... § 3162.2-11 How soon after I know of the likelihood of drainage must I take protective action? (a) You... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...

  17. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false How soon after I know of the likelihood of... § 3162.2-11 How soon after I know of the likelihood of drainage must I take protective action? (a) You... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...

  18. 43 CFR 3162.2-11 - How soon after I know of the likelihood of drainage must I take protective action?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false How soon after I know of the likelihood of... § 3162.2-11 How soon after I know of the likelihood of drainage must I take protective action? (a) You... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...

  19. Final Report for Dynamic Models for Causal Analysis of Panel Data. Quality of Maximum Likelihood Estimates of Parameters in a Log-Linear Rate Model. Part III, Chapter 3.

    ERIC Educational Resources Information Center

    Fennell, Mary L.; And Others

    This document is part of a series of chapters described in SO 011 759. This chapter reports the results of Monte Carlo simulations designed to analyze problems of using maximum likelihood estimation (MLE: see SO 011 767) in research models which combine longitudinal and dynamic behavior data in studies of change. Four complications--censoring of…

  20. Comprehensive study of LASL Well C/T-2 Roosevelt Hot Springs KGRA, Utah, and applications to geothermal well logging

    SciTech Connect

    Glenn, W.E.; Hulen, J.B.; Nielson, D.L.

    1981-02-01

    Utah State Geothermal Well 9-1 in the Roosevelt Hot Springs KGRA, Beaver County, Utah, has been donated by Phillips Petroleum Company for calibration and testing of well-logging equipment in the hot, corrosive, geothermal environment. It is the second Calibration/Test Well (C/T-2) in the Geothermal Log Interpretation Program. A study of cuttings and well logs from Well C/T-2 was completed. This synthesis and data presentation contains most of the subsurface geologic information needed to effect the total evaluation of geophysical logs acquired in this geothermal calibration/test well, C/T-2.

  1. A new inversion method for (T2, D) 2D NMR logging and fluid typing

    NASA Astrophysics Data System (ADS)

    Tan, Maojin; Zou, Youlong; Zhou, Cancan

    2013-02-01

    One-dimensional nuclear magnetic resonance (1D NMR) logging technology has some significant limitations in fluid typing. However, not only can two-dimensional nuclear magnetic resonance (2D NMR) provide some accurate porosity parameters, but it can also identify fluids more accurately than 1D NMR. In this paper, based on the relaxation mechanism of (T2, D) 2D NMR in a gradient magnetic field, a hybrid inversion method that combines least-squares-based QR decomposition (LSQR) and truncated singular value decomposition (TSVD) is examined in the 2D NMR inversion of various fluid models. The forward modeling and inversion tests are performed in detail with different acquisition parameters, such as magnetic field gradients (G) and echo spacing (TE) groups. The simulated results are discussed and described in detail, the influence of the above-mentioned observation parameters on the inversion accuracy is investigated and analyzed, and the observation parameters in multi-TE activation are optimized. Furthermore, the hybrid inversion can be applied to quantitatively determine the fluid saturation. To study the effects of noise level on the hybrid method and inversion results, numerical simulation experiments are performed using different signal-to-noise ratios (SNRs), and the effect of different SNRs on fluid typing using three fluid models is discussed and analyzed in detail.
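
    As a rough sketch of how an LSQR/TSVD hybrid can be wired together for a discretized kernel equation K f = d (this is a generic illustration under my own assumptions, not the authors' inversion code), one can seed a damped LSQR solve with a truncated-SVD estimate and enforce nonnegativity of the (T2, D) amplitudes afterwards:

```python
import numpy as np
from scipy.sparse.linalg import lsqr

def tsvd_solve(K, d, k):
    """Truncated-SVD solution keeping the k largest singular values."""
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])

def hybrid_invert(K, d, k=20, damp=1e-2):
    f0 = tsvd_solve(K, d, k)                     # coarse, heavily regularized estimate
    f = lsqr(K, d, damp=damp, x0=f0)[0]          # damped LSQR refinement from that seed
    return np.clip(f, 0.0, None)                 # amplitudes in the (T2, D) map are >= 0
```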

  2. Application of ALOGPS 2.1 to predict log D distribution coefficient for Pfizer proprietary compounds.

    PubMed

    Tetko, Igor V; Poda, Gennadiy I

    2004-11-01

    Evaluation of the ALOGPS, ACD Labs LogD, and PALLAS PrologD suites to calculate the log D distribution coefficient resulted in a high root-mean-squared error (RMSE) of 1.0-1.5 log units for two in-house Pfizer log D data sets of 17,861 and 640 compounds. Inaccuracy in log P prediction was the limiting factor for the overall log D estimation by these algorithms. The self-learning feature of the ALOGPS (LIBRARY mode) remarkably improved the accuracy in log D prediction, and an RMSE of 0.64-0.65 was calculated for both data sets.
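
    For readers outside the field, the quantity being predicted is related to log P through the standard ionization correction; the sketch below shows that textbook relationship for a monoprotic compound (it is background, not the ALOGPS algorithm, and the example numbers are arbitrary):

```python
import math

def log_d(log_p, pka, ph, acid=True):
    """Henderson-Hasselbalch-style correction of logP to logD at a given pH."""
    shift = (ph - pka) if acid else (pka - ph)
    return log_p - math.log10(1.0 + 10.0 ** shift)

# e.g. an acid with logP 3.0 and pKa 4.5, evaluated at pH 7.4
print(log_d(3.0, 4.5, 7.4, acid=True))           # roughly 0.1
```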

  3. Augmented Likelihood Image Reconstruction.

    PubMed

    Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M

    2016-01-01

    The presence of high-density objects remains an open problem in medical CT imaging. Data of projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim to reduce these artifacts by incorporating information about shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The afore-mentioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During iterations, temporally appearing artifacts are reduced with a bilateral filter and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction.

  4. Realtime logging

    SciTech Connect

    Whittaker, A.; Kashuba, M.J.

    1987-01-01

    This article reports that measurement-while-drilling (MWD) logging services have become an important new source of drilling and geological information on wildcats and some development wells. Sensors located within the bottomhole assembly, barely a few feet above the bit, make measurements on the formation, the borehole and the drill string itself. The MWD measurements are electronically processed and stored in the logging tool downhole. Simple MWD logging systems must wait until after tripping out of the hole for the MWD data to be downloaded from the logging tool to a surface computer in order for logs to be produced. This method is acceptable for some formation evaluation problems. But when well control, directional or completion decisions must be made, the benefit of MWD logging data is obtained only if the downhole measurements are available to the engineer in realtime.

  5. Well Log ETL tool

    SciTech Connect

    Good, Jessica

    2013-08-01

    This is an executable python script which offers two different conversions for well log data: 1) Conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file. 2) Conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.

  6. Analysis of geophysical well logs obtained in the State 2-14 borehole, Salton Sea geothermal area, California

    USGS Publications Warehouse

    Paillet, Frederick L.; Morin, R.H.

    1988-01-01

    A complete suite of conventional geophysical well logs was obtained in the upper part of a 3220-m-deep borehole drilled into geothermally altered alluvial sediments on the southeastern edge of the Salton Sea. Geophysical logs obtained in the State 2-14 borehole indicate that neutron porosity, gamma-gamma, and deep-induction logs provide useful information on lithologic trends with depth. The natural gamma log contains almost continuous, high-frequency fluctuations that obscure lithologic trends and that may be related to recent radioisotope redistribution and departure from radiometric equilibrium. Acoustic transit time logs give unrealistically low in situ compressional velocities ranging from 1.8 to 3.0 km/s, whereas acoustic waveform logs indicate that sediment compressional velocities range from less than 3.0 km/s shallower than 1000 m in depth to almost 5.0 km/s at depths greater than 2000 m. Analyses indicate that most log values lie between two lithologic end points: an electrically conductive claystone with moderate neutron porosity, but no effective porosity, and an electrically nonconductive, fully cemented siltstone that has small but finite porosity. -from Authors

  7. Transaction Logging.

    ERIC Educational Resources Information Center

    Jones, S.; And Others

    1997-01-01

    Discusses the use of transaction logging in Okapi-related projects to allow search algorithms and user interfaces to be investigated, evaluated, and compared. A series of examples is presented, illustrating logging software for character-based and graphical user interface systems, and demonstrating the usefulness of relational database management…

  8. Log N-log S is inconclusive

    NASA Technical Reports Server (NTRS)

    Klebesadel, R. W.; Fenimore, E. E.; Laros, J.

    1983-01-01

    The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power law distribution, the results are not adequate at this stage of observations to differentiate between a -3/2 and a -1 power law slope.

  9. Log N-log S is inconclusive

    SciTech Connect

    Klebesadel, R.W.; Fenimore, E.E.; Laros, J.

    1983-01-01

    The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power law distribution, the results are not adequate at this stage of observations to differentiate between a -3/2 and a -1 power law slope.

  10. Condition and fate of logged forests in the Brazilian Amazon.

    PubMed

    Asner, Gregory P; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Knapp, David E; Silva, José N M

    2006-08-22

    The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km2 of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16+/-1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained.

  11. The Tumor-Log Odds of Positive Lymph Nodes-Metastasis Staging System, a Promising New Staging System for Gastric Cancer after D2 Resection in China

    PubMed Central

    Wang, Zhi-qiang; Ren, Chao; Wang, De-shen; Zhang, Dong-sheng; Luo, Hui-yan; Li, Yu-hong; Xu, Rui-hua

    2012-01-01

    Background In this study, we established a hypothetical tumor-lodds-metastasis (TLM) and tumor-ratio-metastasis (TRM) staging system. Moreover, we compared them with the 7th edition of the American Joint Committee on Cancer tumor-nodes-metastasis (AJCC TNM) staging system in gastric cancer patients after D2 resection. Methods A total of 1000 gastric carcinoma patients receiving treatment in our center were selected for the analysis. Finally, 730 patients who received D2 resection were retrospectively studied. Patients were staged using the TLM, TRM and the 7th edition AJCC TNM systems. Survival analysis was performed with a Cox regression model. We used two parameters to compare the TNM, TRM and TLM staging systems: the −2 log likelihood and the hazard ratio. Results The cut points of the lymph node ratio (LNR) were set as 0, 0–0.3, 0.3–0.6, and 0.6–1.0. For the log odds of positive lymph nodes (LODDS), the cut points were established as ≤ −0.5, −0.5 to 0, 0 to 0.5, and > 0.5. There were significant differences in survival among patients in different LODDS classifications within each pN or LNR group. When stratified by the LODDS classifications, the prognosis was highly similar within the corresponding pN or LNR classifications. Multivariate analysis showed that the TLM staging system was better than the TRM or TNM system for prognostic evaluation. Conclusions The TLM system was superior to the TRM or TNM system for prognostic assessment of gastric adenocarcinoma patients after D2 resection. PMID:22348125
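
    To make the LNR and LODDS classifications concrete, the short sketch below computes both quantities from counts of examined and positive lymph nodes and assigns the LODDS cut-point group quoted above. The 0.5 smoothing term is a convention commonly used in the LODDS literature to keep the odds finite; the study may define the quantity slightly differently.

    import math

    def lnr(positive, examined):
        # Lymph node ratio: positive nodes divided by examined nodes.
        return positive / examined

    def lodds(positive, examined):
        # Log odds of positive lymph nodes; 0.5 is added to numerator and denominator
        # to keep the ratio finite when no (or all) nodes are positive. This smoothing
        # is a common convention and may differ from the paper's exact definition.
        negative = examined - positive
        return math.log((positive + 0.5) / (negative + 0.5))

    def lodds_group(value):
        # Cut points quoted in the abstract: <= -0.5, (-0.5, 0], (0, 0.5], > 0.5.
        if value <= -0.5:
            return 1
        if value <= 0.0:
            return 2
        if value <= 0.5:
            return 3
        return 4

    if __name__ == "__main__":
        for pos, exam in [(0, 20), (3, 20), (12, 20), (20, 20)]:
            value = lodds(pos, exam)
            print(pos, exam, round(lnr(pos, exam), 2), round(value, 2), lodds_group(value))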

  12. Ulysses log 1992

    NASA Technical Reports Server (NTRS)

    Perez, Raul Garcia

    1993-01-01

    The Ulysses Log tells the story of some intriguing problems that we (the Spacecraft Team) have encountered. Ulysses was launched on 6 Oct. 1990, and it made the fastest trip to Jupiter (8 Feb. 1992). It is presently traveling out of the ecliptic plane. This paper presents log entries from the following areas: (1) ingenious maneuvers; (2) telecommunication problems; and (3) surprises.

  13. Preliminary observations and logs of BARB 1 and BARB 2: komatiites from the Tjakastad site

    NASA Astrophysics Data System (ADS)

    Coetzee, Grace; Arndt, Nicholas; Wilson, Allan

    2013-04-01

    The BARB 1 and BARB 2 cores intersect a suite of komatiite flows and komatiitic basalts as well as fragmental rocks of the Komati Formation of the Onverwacht Group, Barberton Greenstone Belt. The cores give important and previously unattainable information on the structures, textures and contact relationships between individual komatiite flows and different lithological units within the flows. BARB 1 was drilled at -48° on a 5° azimuth to a depth of 419.9 m. This core contains a unique volcanic tumulus succession in the stratigraphically lower 100 m, and the rest of the core consists of about 59 flows of spinifex-textured komatiite (1-3 m thick), massive komatiite (0.5-10 m thick), komatiitic basalt (1-9 m thick) and a single basalt layer (10 m thick), intruded by gabbro (0.5-2 m thick) and a single dolerite dyke (18 m thick). BARB 2, approximately 50 m from BARB 1 and parallel to it, was drilled at -45° on an 8° azimuth to a depth of 431.5 m. This core contains approximately 39 flows of komatiite (0.5-10 m thick) and komatiitic basalt (2-23 m thick) which contain possible selvages of pillows. Basalt flows are more numerous (0.3-4 m thick) in BARB 2 whilst gabbro (0.6-7 m thick) is less prevalent. The dolerite dyke observed in BARB 1 does not occur in BARB 2. As the Barberton strata young towards the east, the cores intersected the stratigraphy in a reverse sequence. The cores were drilled such that there exists a 141 m overlap in stratigraphy between them. The section 141 m from the base of BARB 1 should theoretically correlate with the top 141 m of BARB 2. However, this overlap is not evident in the core or in the core logs. A single gabbro layer appears to be lithologically correlatable between both holes. There is no apparent correlation between the pattern of the komatiite flows, leading to an initial conclusion that the komatiite flows were not laterally extensive or changed laterally in form over short distances. In both cores the proportion of komatiitic

  14. CORRELATOR 5.2 - A program for interactive lithostratigraphic correlation of wireline logs

    USGS Publications Warehouse

    Olea, R.A.

    2004-01-01

    The limited radius of investigation of petrophysical measurements made in boreholes and the relatively large distances between wells result in an incomplete sensing of the subsurface through well logging. CORRELATOR is a program for estimating geological properties between logged boreholes. An initial and fundamental step is the lithostratigraphic correlation of logs in different wells. The method employed by the program closely emulates the process of visual inspection used by experienced subsurface geologists in manual correlation. Mathematically, the determination of lithostratigraphical equivalence is based on the simultaneous assessment of similarity in shale content, similarity in the patterns of vertical variation in a petrophysical property that is measured with high vertical resolution, and spatial consistency of stratigraphic relationships as determined by an expert system. Multiple additional options for processing log readings allow maximization in the extraction of information from pairs of logs per well and great flexibility in the final display of results in the form of cross sections and dip diagrams. © 2004 Elsevier Ltd. All rights reserved.

  15. Log-Tool

    2012-05-21

    Log files are typically semi- or un-structured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-Tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g., intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input – it will input data from files, standard input, and syslog; (2) Parser – it will parse the log file based on regular expressions into structured data (JSON format); (3) Output – it will output structured data into commonly used formats, including Redis (a database), standard output, and syslog.
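
    As an illustration of the Parser step described above, the fragment below applies a regular expression to syslog-style lines and emits one JSON record per line; the pattern and field names are invented for this sketch and do not reproduce Log-Tool's actual configuration.

    # Hedged sketch of regex-based parsing of semi-structured log lines into JSON records.
    # The pattern matches a simple syslog-like layout and is illustrative only.
    import json
    import re

    LINE_RE = re.compile(
        r"(?P<timestamp>\w{3}\s+\d+\s[\d:]{8})\s+"
        r"(?P<host>\S+)\s+"
        r"(?P<process>[\w./-]+)(?:\[(?P<pid>\d+)\])?:\s+"
        r"(?P<message>.*)"
    )

    def parse_line(line):
        match = LINE_RE.match(line.rstrip("\n"))
        return match.groupdict() if match else None

    if __name__ == "__main__":
        sample = "Mar  1 12:34:56 host01 sshd[2143]: Accepted publickey for alice"
        for line in [sample]:  # in practice: iterate over a file, stdin, or a syslog socket
            record = parse_line(line)
            if record is not None:
                print(json.dumps(record))  # structured output; could also go to Redis or syslog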

  16. Perceiving cancer-risks and heredity-likelihood in genetic-counseling: how counselees recall and interpret BRCA 1/2-test results.

    PubMed

    Vos, J; Oosterwijk, J C; Gómez-García, E; Menko, F H; Jansen, A M; Stoel, R D; van Asperen, C J; Tibben, A; Stiggelbout, A M

    2011-03-01

    Previous studies on counsellees' perception of DNA test results did not clarify whether counsellees were asked about their recollections or their interpretations, and focused only on patients' own risks and not on the likelihood that cancer is heritable in the family. We tested differences and correlations of four perception aspects: recollections and interpretations of both cancer risks and heredity likelihood. In a retrospective study, women tested for BRCA1/2, on average 5 years earlier, completed questionnaires about their perception. Participants had received an unclassified variant (n = 76), uninformative (n = 76) or pathogenic mutation (n = 51) result in BRCA1/2. Analyses included t-tests, correlations and structural equation modelling. The counsellees' perception was shown to consist of four distinct phenomena: recollections and interpretations of cancer risks and of heredity likelihood. This distinctiveness was suggested by significant differences between these perception variables. Moderate to strong correlations were found between these variables, suggesting that the differences between variables were consistent. The relationships between these variables were not influenced by the actually communicated DNA test results, sociodemographics, medical and pedigree information, or framing of cancer risk questions. The largest differences between recollections and interpretations were found in the unclassified variant group and the smallest in the uninformative group. Cancer risks and heredity likelihood correlated least in the pathogenic mutation group. Communication of ambiguous genetic information enlarged the differences. To understand counsellees' perception of genetic counselling, researchers should study recollections and interpretations of cancer risks and heredity likelihood. Genetic counsellors should explicitly address counsellees' recollections and interpretations, and be aware of possible inaccuracies.

  17. Inductive resistivity logging in steel-cased boreholes. SBIR Phase 2 progress report

    SciTech Connect

    Wilt, M.

    1999-07-01

    SBIR Phase 2 project 40145-97-I calls for the design and construction of a prototype inductive logging device to measure formation resistivity from within a steel-cased borehole. The SCIL (Steel Casing Induction Logger) tool is intended for reservoir characterization and process monitoring in an oil field environment. This report summarizes findings from the initial project period. In this phase, bench model measurements were made to test casing compensation schemes, numerical models were calculated to optimize the tool configuration and associated formation sensitivity, and the preliminary design of the tool was completed. The bench tests constitute fundamental research on determining the characteristics of steel well casing and on developing means of separating the effects of the casing and the formation. This technology is crucial to the success of the project, and significant progress has been made towards the goal of recovering the formation resistivity from inside the casing. Next, a series of sensitivity and tool-configuration studies has been completed by partner Dr. David Alumbaugh at Sandia National Laboratories. These numerical results help to optimize the tool configuration and allow one to calculate the expected formation sensitivity. These models are preliminary to the data interpretation software to be developed in the next project period. The initial hardware design of the tool has been completed, and parts ordering has begun for later manufacture and assembly. The tool, which is designed for maximum flexibility of deployment, will have a powerful transmitter, an array of three-component sensors, and sufficient dynamic range to operate in standard oil field steel-cased boreholes.

  18. A new high-precision borehole-temperature logging system used at GISP2, Greenland, and Taylor Dome, Antarctica

    USGS Publications Warehouse

    Clow, G.D.; Saltus, R.W.; Waddington, E.D.

    1996-01-01

    We describe a high-precision (0.1-1.0 mK) borehole-temperature (BT) logging system developed at the United States Geological Survey (USGS) for use in remote polar regions. We discuss calibration, operational and data-processing procedures, and present an analysis of the measurement errors. The system is modular to facilitate calibration procedures and field repairs. By interchanging logging cables and temperature sensors, measurements can be made in either shallow air-filled boreholes or liquid-filled holes up to 7 km deep. Data can be acquired in either incremental or continuous-logging modes. The precision of data collected by the new logging system is high enough to detect and quantify various thermal effects at the milli-Kelvin level. To illustrate this capability, we present sample data from the 3 km deep borehole at GISP2, Greenland, and from a 130 m deep air-filled hole at Taylor Dome, Antarctica. The precision of the processed GISP2 continuous temperature logs is 0.25-0.34 mK, while the accuracy is estimated to be 4.5 mK. The effects of fluid convection and the dissipation of the thermal disturbance caused by drilling the borehole are clearly visible in the data. The precision of the incremental Taylor Dome measurements varies from 0.11 to 0.32 mK, depending on the wind strength during the experiments. With this precision, we found that temperature fluctuations and multi-hour trends in the BT measurements correlate well with atmospheric-pressure changes.

  19. Addendum 2: Logs of monitor wells drilled May 1988 through December 1992

    SciTech Connect

    Stout, J.; Qualheim, B.; McPherrin, R.; Barber, K.; Hedegaard, R.; McConihe, W.; Miller, T.

    1993-11-01

    The logs in this addendum were plotted in a new format by the same software package (LOGGER by Rockware, Denver, CO) that was used in the original publication. The scale remains the same, 1 inch = 15 feet. The header is totally automated with a subheading indexing the well-construction symbols. Geophysical curves are labeled in their respective channels, and percentage core recovery is plotted in a histogram. Lithologic symbols are plotted to scale in a channel similar to previous logs. The lithologic description also has been automated to assure consistency in terminology. Descriptions are more extensive and are referenced by leader lines to the lithologic symbol. Additional figures included for this Addendum are: a plot of all the monitoring well locations at the LLNL Main site and a plot detailing the gasoline spill area well locations in the vicinity of Building 403.

  20. EMPIRICAL LIKELIHOOD INFERENCE FOR THE COX MODEL WITH TIME-DEPENDENT COEFFICIENTS VIA LOCAL PARTIAL LIKELIHOOD

    PubMed Central

    Sun, Yanqing; Sundaram, Rajeshwari; Zhao, Yichuan

    2009-01-01

    The Cox model with time-dependent coefficients has been studied by a number of authors recently. In this paper, we develop empirical likelihood (EL) pointwise confidence regions for the time-dependent regression coefficients via local partial likelihood smoothing. The EL simultaneous confidence bands for a linear combination of the coefficients are also derived based on the strong approximation methods. The empirical likelihood ratio is formulated through the local partial log-likelihood for the regression coefficient functions. Our numerical studies indicate that the EL pointwise/simultaneous confidence regions/bands have satisfactory finite sample performances. Compared with the confidence regions derived directly based on the asymptotic normal distribution of the local constant estimator, the EL confidence regions are overall tighter and can better capture the curvature of the underlying regression coefficient functions. Two data sets, the gastric cancer data and the Mayo Clinic primary biliary cirrhosis data, are analyzed using the proposed method. PMID:19838322

  1. A practical O(n log² n) time algorithm for computing the triplet distance on binary trees.

    PubMed

    Sand, Andreas; Brodal, Gerth Stølting; Fagerberg, Rolf; Pedersen, Christian N S; Mailund, Thomas

    2013-01-01

    The triplet distance is a distance measure that compares two rooted trees on the same set of leaves by enumerating all subsets of three leaves and counting how often the induced topologies of the two trees are equal or different. We present an algorithm that computes the triplet distance between two rooted binary trees in time O(n log² n). The algorithm is related to an algorithm for computing the quartet distance between two unrooted binary trees in time O(n log n). While the quartet distance algorithm has a very severe overhead in the asymptotic time complexity that makes it impractical compared to O(n²) time algorithms, we show through experiments that the triplet distance algorithm can be implemented to give a competitive wall-time running time.
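
    The brute-force sketch below illustrates the definition only: it enumerates all leaf triples and compares, for each triple, which pair forms the cherry in the two trees. It runs in O(n^3) time and is meant to make the distance concrete, not to reproduce the O(n log² n) algorithm of the paper.

    # Hedged sketch: naive O(n^3) triplet distance between two rooted binary trees.
    from itertools import combinations

    class Node:
        def __init__(self, label=None, children=()):
            self.label, self.children, self.parent = label, list(children), None
            for child in self.children:
                child.parent = self

    def leaves(node):
        return [node] if not node.children else [l for c in node.children for l in leaves(c)]

    def depths(root):
        table, stack = {root: 0}, [root]
        while stack:
            node = stack.pop()
            for child in node.children:
                table[child] = table[node] + 1
                stack.append(child)
        return table

    def lca(a, b, depth):
        while a is not b:
            if depth[a] >= depth[b]:
                a = a.parent
            else:
                b = b.parent
        return a

    def cherry_of_triple(by_label, depth, x, y, z):
        # Return the pair among {x, y, z} whose LCA lies strictly below the LCA of the
        # whole triple, i.e. the two leaves that group together in this tree.
        nodes = {name: by_label[name] for name in (x, y, z)}
        top = lca(lca(nodes[x], nodes[y], depth), nodes[z], depth)
        for pair in ((x, y), (x, z), (y, z)):
            if lca(nodes[pair[0]], nodes[pair[1]], depth) is not top:
                return frozenset(pair)
        return None  # unresolved triple (cannot happen in a strictly binary tree)

    def triplet_distance(tree1, tree2):
        labels = sorted(leaf.label for leaf in leaves(tree1))
        info = [({leaf.label: leaf for leaf in leaves(t)}, depths(t)) for t in (tree1, tree2)]
        return sum(
            cherry_of_triple(*info[0], x, y, z) != cherry_of_triple(*info[1], x, y, z)
            for x, y, z in combinations(labels, 3)
        )

    def cherry(x, y):
        return Node(children=(Node(label=x), Node(label=y)))

    if __name__ == "__main__":
        t1 = Node(children=(cherry("a", "b"), Node(children=(cherry("c", "d"), Node(label="e")))))
        t2 = Node(children=(cherry("a", "c"), Node(children=(cherry("b", "d"), Node(label="e")))))
        print(triplet_distance(t1, t2))  # number of leaf triples with differing topologies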

  2. Correlation of lithologic and sonic logs from the COST No. B-2 well with seismic reflection data

    USGS Publications Warehouse

    King, K.C.

    1979-01-01

    The purpose of this study was to correlate events recorded on seismic records with changes in lithology recorded from sample descriptions from the Continental Offshore Stratigraphic Test (COST) No. B-2 well.  The well is located on the U.S. mid-Atlantic Outer Continental Shelf about 146 km east of Atlantic City, N.J. (see location map).  Lithologic data are summarized from the sample descriptions of Smith and others (1976).  Sonic travel times were read at 0.15 m intervals in the well using a long-space sonic logging tool.  Interval velocities, reflection coefficients and a synthetic seismogram were calculated from the sonic log.
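
    For readers unfamiliar with this workflow, the sketch below shows the standard chain from a sonic log to a synthetic seismogram: transit time to interval velocity, velocity (with an assumed density relation) to acoustic impedance, impedance contrasts to reflection coefficients, and convolution with a Ricker wavelet. The transit-time values, the Gardner density relation, and the 30 Hz wavelet are illustrative assumptions, not parameters taken from the report.

    # Hedged sketch of building a synthetic seismogram from a sonic log.
    # Transit times, the Gardner density relation, and the wavelet frequency are assumptions;
    # a real reflectivity series would be much longer than the wavelet.
    import numpy as np

    dt_us_per_ft = np.array([140.0, 120.0, 100.0, 90.0, 80.0, 75.0])  # sonic transit times
    velocity_ft_s = 1.0e6 / dt_us_per_ft            # interval velocity from transit time
    velocity_m_s = velocity_ft_s * 0.3048
    density = 310.0 * velocity_m_s ** 0.25          # Gardner's relation (kg/m^3), an assumption
    impedance = density * velocity_m_s

    # Reflection coefficient at each interface between adjacent intervals.
    refl = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

    def ricker(freq_hz, length_s=0.128, dt_s=0.002):
        t = np.arange(-length_s / 2, length_s / 2, dt_s)
        arg = (np.pi * freq_hz * t) ** 2
        return (1.0 - 2.0 * arg) * np.exp(-arg)

    # Convolve the reflectivity series with a 30 Hz Ricker wavelet.
    synthetic = np.convolve(refl, ricker(30.0), mode="same")
    print("reflection coefficients:", np.round(refl, 3))
    print("synthetic trace samples:", synthetic.size)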

  3. Grid Logging: Best Practices Guide

    SciTech Connect

    Tierney, Brian L; Tierney, Brian L; Gunter, Dan

    2008-04-01

    The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used; other custom logging APIs or even printf could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) Consistently structured, typed log events; (2) A standard high-resolution timestamp; (3) Use of logging levels and categories to separate logs by detail and purpose; (4) Consistent use of global and local identifiers; and (5) Use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
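
    A small Python sketch in the spirit of these recommendations: a high-resolution ISO 8601 timestamp, an explicit level and typed event name, global and local identifiers as key=value pairs, and one newline-delimited ASCII record per event. The field names (event, job.id, and so on) are illustrative, not a prescribed schema.

    # Hedged sketch of a structured, newline-delimited log record following the
    # best practices above; field names are illustrative only.
    import logging
    from datetime import datetime, timezone

    def log_event(logger, level, event, fields):
        # One event per line: ISO 8601 UTC timestamp with microseconds, a level,
        # a typed event name, and key=value pairs for identifiers and details.
        ts = datetime.now(timezone.utc).isoformat(timespec="microseconds")
        payload = " ".join(f"{key}={value}" for key, value in sorted(fields.items()))
        logger.log(level, "ts=%s event=%s %s", ts, event, payload)

    if __name__ == "__main__":
        logging.basicConfig(format="%(levelname)s %(message)s", level=logging.INFO)
        grid_logger = logging.getLogger("grid.transfer")
        log_event(grid_logger, logging.INFO, "transfer.start",
                  {"job.id": "a1b2c3", "host": "node17.example.org", "bytes": 1048576})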

  4. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Walker, H. F.

    1975-01-01

    A general iterative procedure is given for determining consistent maximum-likelihood estimates of the parameters of a mixture of normal distributions. In addition, a local maximum of the log-likelihood function, Newton's method, a method of scoring, and modifications of these procedures are discussed.
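
    As one concrete instance of such an iterative scheme, the sketch below runs the familiar EM-style fixed-point update for a two-component univariate normal mixture on simulated data; it illustrates the kind of procedure discussed, not necessarily the paper's exact iteration or its Newton and scoring variants.

    # Hedged sketch: EM-style iteration for a two-component 1-D normal mixture.
    import numpy as np

    def em_two_normals(x, n_iter=200):
        # Crude initialization: put the component means at the data quartiles.
        mu = np.array([np.quantile(x, 0.25), np.quantile(x, 0.75)])
        sigma = np.array([x.std(), x.std()])
        weight = np.array([0.5, 0.5])
        for _ in range(n_iter):
            # E-step: posterior responsibility of each component for each point.
            dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
            resp = weight * dens
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: weighted maximum-likelihood updates of the parameters.
            n_k = resp.sum(axis=0)
            weight = n_k / len(x)
            mu = (resp * x[:, None]).sum(axis=0) / n_k
            sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
        return weight, mu, sigma

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
        print(em_two_normals(data))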

  5. Microcomputer log analysis system

    SciTech Connect

    Ostrander, C.

    1984-04-01

    A comprehensive, user-friendly log analysis system for use on a microcomputer requires only average log analysis skills; most systems require both a log analyst and a computer professional for operation. This one has many capabilities: (1) data entry is handled by office personnel after minimal training; (2) entered data are filed and cataloged for future retrieval and analysis; (3) the system can handle more than 9,000,000 ft (2700 km) of log data in over 60,000 files; (4) all data can be edited; (5) searches and listings can be made using factors such as formation names; (6) facsimile reproductions can be made of any log on file; (7) a screening program turns the system into a sophisticated hand calculator to quickly determine zones of interest; and (8) up to 1100 ft (335 m) of contiguous data from a well can be analyzed in one run. Innovative features include: (1) a discriminating factor to separate reservoirs for individual attention concerning rock type, fluid content and potential reserves; and (2) a written report of each reservoir using artificial intelligence. The report discusses, among other things, the rock type and its consistency, comparing the system's findings with the geologist's opinion. Differences between the two will elicit alternative analyses.

  6. Quasi-likelihood estimation for relative risk regression models.

    PubMed

    Carter, Rickey E; Lipsitz, Stuart R; Tilley, Barbara C

    2005-01-01

    For a prospective randomized clinical trial with two groups, the relative risk can be used as a measure of treatment effect and is directly interpretable as the ratio of success probabilities in the new treatment group versus the placebo group. For a prospective study with many covariates and a binary outcome (success or failure), relative risk regression may be of interest. If we model the log of the success probability as a linear function of covariates, the regression coefficients are log-relative risks. However, using such a log-linear model with a Bernoulli likelihood can lead to convergence problems in the Newton-Raphson algorithm. This is likely to occur when the success probabilities are close to one. A constrained likelihood method proposed by Wacholder (1986, American Journal of Epidemiology 123, 174-184) also has convergence problems. We propose a quasi-likelihood method-of-moments technique in which we naively assume the Bernoulli outcome is Poisson, with the mean (success probability) following a log-linear model. We use the Poisson maximum likelihood equations to estimate the regression coefficients without constraints. Using method-of-moments ideas, one can show that the estimates using the Poisson likelihood will be consistent and asymptotically normal. We apply these methods to a double-blinded randomized trial in primary biliary cirrhosis of the liver (Markus et al., 1989, New England Journal of Medicine 320, 1709-1713). PMID:15618526
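
    A hedged sketch of the modified-Poisson idea described above, using statsmodels to fit a Poisson GLM with a log link to a binary outcome and report robust (sandwich) standard errors; the simulated treatment and covariate data are invented for illustration and are not the trial data analyzed in the paper.

    # Hedged sketch: relative risk regression via a Poisson working likelihood on a
    # binary outcome, with robust (sandwich) standard errors. Data are simulated.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2000
    treatment = rng.integers(0, 2, n)
    age_centered = rng.normal(0, 10, n)

    # True log-linear model for the success probability (kept below 1).
    prob = np.exp(-1.2 + 0.3 * treatment + 0.01 * age_centered)
    outcome = rng.binomial(1, np.clip(prob, 0, 1))

    X = sm.add_constant(np.column_stack([treatment, age_centered]))
    fit = sm.GLM(outcome, X, family=sm.families.Poisson()).fit(cov_type="HC0")

    print(fit.summary())
    print("Estimated relative risks:", np.exp(fit.params[1:]))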

  7. EthoLog 2.2: a tool for the transcription and timing of behavior observation sessions.

    PubMed

    Ottoni, E B

    2000-08-01

    EthoLog is a tool that aids in the transcription and timing of behavior observation sessions, whether experimental or naturalistic, from video/audio tapes or in real time. It was created with Visual Basic and runs on Windows (3.x/9x). The user types the key codes for the predefined behavioral categories, and EthoLog registers their sequence and timing and saves the resulting data in ASCII output files. A sequential analysis matrix can be generated from the sequential data. The output files may be edited, converted to plain text files for printing, or exported to a spreadsheet program, such as MS Excel, for further analyses.

  8. Detailed evaluation of gas hydrate reservoir properties using JAPEX/JNOC/GSC Mallik 2L-38 gas hydrate research well downhole well-log displays

    USGS Publications Warehouse

    Collett, T.S.

    1999-01-01

    The JAPEX/JNOC/GSC Mallik 2L-38 gas hydrate research well project was designed to investigate the occurrence of in situ natural gas hydrate in the Mallik area of the Mackenzie Delta of Canada. Because gas hydrate is unstable at surface pressure and temperature conditions, a major emphasis was placed on the downhole logging program to determine the in situ physical properties of the gas-hydrate-bearing sediments. Downhole logging tool strings deployed in the Mallik 2L-38 well included the Schlumberger Platform Express with a high resolution laterolog, Array Induction Imager Tool, Dipole Shear Sonic Imager, and a Fullbore Formation MicroImager. The downhole log data obtained from the log- and core-inferred gas-hydrate-bearing sedimentary interval (897.25-1109.5 m log depth) in the Mallik 2L-38 well are depicted in a series of well displays. Also shown are numerous reservoir parameters, including gas hydrate saturation and sediment porosity log traces, calculated from available downhole well-log and core data. The gas hydrate accumulation delineated by the Mallik 2L-38 well has been determined to contain as much as 4.15 × 10^9 m3 of gas in the 1 km2 area surrounding the drill site.

  9. An O(log² N) parallel algorithm for computing the eigenvalues of a symmetric tridiagonal matrix

    NASA Technical Reports Server (NTRS)

    Swarztrauber, Paul N.

    1989-01-01

    An O(log² N) parallel algorithm is presented for computing the eigenvalues of a symmetric tridiagonal matrix using a parallel algorithm for computing the zeros of the characteristic polynomial. The method is based on a quadratic recurrence in which the characteristic polynomial is constructed on a binary tree from polynomials whose degree doubles at each level. Intervals that contain exactly one zero are determined by the zeros of polynomials at the previous level, which ensures that different processors compute different zeros. The exact behavior of the polynomials at the interval endpoints is used to eliminate the usual problems induced by finite precision arithmetic.
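
    For contrast with the parallel construction, the sketch below shows a common sequential route to the same sub-problem: a Sturm-sequence count of eigenvalues below a shift, combined with bisection so that each interval isolates exactly one eigenvalue of a symmetric tridiagonal matrix. It illustrates the interval idea only and is not the O(log² N) parallel algorithm of the paper.

    # Hedged sketch: Sturm-sequence bisection for the eigenvalues of a symmetric
    # tridiagonal matrix with diagonal d and off-diagonal e (sequential illustration only).
    import numpy as np

    def count_below(d, e, x):
        # Number of eigenvalues strictly less than x: count the negative pivots of the
        # LDL^T factorization of (T - x I), computed by the standard scalar recurrence.
        count, pivot = 0, 1.0
        for i in range(len(d)):
            off = e[i - 1] ** 2 if i > 0 else 0.0
            pivot = (d[i] - x) - off / pivot
            if pivot == 0.0:
                pivot = 1e-300  # nudge off an exact zero pivot
            if pivot < 0.0:
                count += 1
        return count

    def kth_eigenvalue(d, e, k, tol=1e-12):
        # Bisect a Gershgorin interval until the k-th (0-indexed) eigenvalue is pinned down.
        radius = np.abs(np.concatenate(([0.0], e))) + np.abs(np.concatenate((e, [0.0])))
        lo, hi = float(np.min(d - radius)), float(np.max(d + radius))
        while hi - lo > tol * max(1.0, abs(lo), abs(hi)):
            mid = 0.5 * (lo + hi)
            if count_below(d, e, mid) >= k + 1:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    if __name__ == "__main__":
        d = np.array([2.0, 2.0, 2.0, 2.0, 2.0])
        e = np.array([-1.0, -1.0, -1.0, -1.0])
        ours = [kth_eigenvalue(d, e, k) for k in range(len(d))]
        dense = np.linalg.eigvalsh(np.diag(d) + np.diag(e, 1) + np.diag(e, -1))
        print(np.round(ours, 6))   # bisection results
        print(np.round(dense, 6))  # reference values from a dense eigensolver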

  10. A Spreadsheet for a 2 x 3 x 2 Log-Linear Analysis. AIR 1991 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Saupe, Joe L.

    This paper describes a personal computer spreadsheet set up to carry out hierarchical log-linear analyses, a type of analysis useful for institutional research into multidimensional frequency tables formed from categorical variables such as faculty rank, student class level, gender, or retention status. The spreadsheet provides a concrete vehicle…

  11. Likelihood and clinical trials.

    PubMed

    Hill, G; Forbes, W; Kozak, J; MacNeill, I

    2000-03-01

    The history of the application of statistical theory to the analysis of clinical trials is reviewed. The current orthodoxy is a somewhat illogical hybrid of the original theory of significance tests of Edgeworth, Karl Pearson, and Fisher, and the subsequent decision theory approach of Neyman, Egon Pearson, and Wald. This hegemony is under threat from Bayesian statisticians. A third approach is that of likelihood, stemming from the work of Fisher and Barnard. This approach is illustrated using hypothetical data from the Lancet articles by Bradford Hill, which introduced clinicians to statistical theory. PMID:10760630

  12. Maximum likelihood solution for inclination-only data in paleomagnetism

    NASA Astrophysics Data System (ADS)

    Arason, P.; Levi, S.

    2010-08-01

    We have developed a new robust maximum likelihood method for estimating the unbiased mean inclination from inclination-only data. In paleomagnetic analysis, the arithmetic mean of inclination-only data is known to introduce a shallowing bias. Several methods have been introduced to estimate the unbiased mean inclination of inclination-only data together with measures of the dispersion. Some inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all the methods require various assumptions and approximations that are often inappropriate. For some steep and dispersed data sets, these methods provide estimates that are significantly displaced from the peak of the likelihood function to systematically shallower inclination. The problem of locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest, because some elements of the likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study, we succeeded in analytically cancelling exponential elements from the log-likelihood function, and we are now able to calculate its value anywhere in the parameter space and for any inclination-only data set. Furthermore, we can now calculate the partial derivatives of the log-likelihood function with desired accuracy, and locate the maximum likelihood without the assumptions required by previous methods. To assess the reliability and accuracy of our method, we generated large numbers of random Fisher-distributed data sets, for which we calculated mean inclinations and precision parameters. The comparisons show that our new robust Arason-Levi maximum likelihood method is the most reliable, and the mean inclination estimates are the least biased towards shallow values.

  13. Reconstruction of ancestral genomic sequences using likelihood.

    PubMed

    Elias, Isaac; Tuller, Tamir

    2007-03-01

    A challenging task in computational biology is the reconstruction of genomic sequences of extinct ancestors, given the phylogenetic tree and the sequences at the leaves. This task is best solved by calculating the most likely estimate of the ancestral sequences, along with the most likely edge lengths. We deal with this problem and also the variant in which the phylogenetic tree, in addition to the ancestral sequences, needs to be estimated. The latter problem is known to be NP-hard, while the computational complexity of the former is unknown. Currently, all algorithms for solving these problems are heuristics without performance guarantees. The biological importance of these problems calls for developing better algorithms with guarantees of finding either optimal or approximate solutions. We develop approximation, fixed-parameter tractable (FPT), and fast heuristic algorithms for two variants of the problem: when the phylogenetic tree is known and when it is unknown. The approximation algorithm guarantees a solution with a log-likelihood ratio of 2 relative to the optimal solution. The FPT has a running time which is polynomial in the length of the sequences and exponential in the number of taxa. This makes it useful for calculating the optimal solution for small trees. Moreover, we combine the approximation algorithm and the FPT into an algorithm with arbitrarily good approximation guarantees (PTAS). We tested our algorithms on both synthetic and biological data. In particular, we used the FPT for computing the most likely ancestral mitochondrial genomes of Hominidae (the great apes), thereby answering an interesting biological question. Moreover, we show how the approximation algorithms find good solutions for reconstructing the ancestral genomes for a set of lentiviruses (relatives of HIV). Supplementary material of this work is available at www.nada.kth.se/~isaac/publications/aml/aml.html.

  14. R^2 log R quantum corrections and the inflationary observables

    SciTech Connect

    Ben-Dayan, Ido; Westphal, Alexander; Jing, Shenglin; Torabian, Mahdi; Zarate, Lucila E-mail: shenglin.jing@utoronto.ca E-mail: alexander.westphal@desy.de

    2014-09-01

    We study a model of inflation with terms quadratic and logarithmic in the Ricci scalar, where the gravitational action is f(R) = R + αR^2 + βR^2 ln R. These terms are expected to arise from one loop corrections involving matter fields in curved space-time. The spectral index n_s and the tensor to scalar ratio yield 4 × 10^-4 ≲ r ≲ 0.03 and 0.94 ≲ n_s ≲ 0.99, i.e. r is an order of magnitude bigger or smaller than the original Starobinsky model which predicted r ∼ 10^-3. Further enhancement of r gives a scale invariant n_s ∼ 1 or higher. Other inflationary observables are dn_s/d ln k ≳ -5.2 × 10^-4, μ ≲ 2.1 × 10^-8, y ≲ 2.6 × 10^-9. Despite the enhancement in r, if the recent BICEP2 measurement stands, this model is disfavoured.

  15. Real Time Mud Gas Logging During Drilling of DFDP-2B

    NASA Astrophysics Data System (ADS)

    Mathewson, L. A.; Toy, V.; Menzies, C. D.; Zimmer, M.; Erzinger, J.; Niedermann, S.; Cox, S.

    2015-12-01

    The Deep Fault Drilling Project (DFDP) aims to improve our understanding of the Alpine Fault Zone, a tectonically active mature fault system in New Zealand known to rupture in large events, by deep scientific drilling. The borehole DFDP-2B approached the Alpine Fault at depth, reaching a final depth of 892 m (820 m true vertical depth). Online gas analysis (OLGA) while drilling tracked changes in the composition of gases extracted from the circulating drill mud. The composition of fluids from fault zones can provide information about their origins, flow rates and paths, fluid-rock interactions along these paths, and the permeability structure of the faulted rock mass. Apart from an atmospheric input, the gases in drilling mud derive from the pore space of rock crushed at the drill bit and from permeable layers intersected by the borehole. The rapid formation of mud wall cake seals the borehole from further fluid inflow; hence, formation-derived gases enter mostly at the depth of the drill bit. OLGA analyses N2, O2, Ar, CO2, CH4, He, and H2 on a mass spectrometer; hydrocarbons CH4, C2H6, C3H8, i-C4H10, and n-C4H10 on a gas chromatograph; and Rn using a Lucas cell detector. Gas was sampled for offline analyses of noble gas and stable isotopes to complement the OLGA dataset. The principal formation-derived gases found in the drilling mud during drilling of DFDP-2 were CO2 and CH4, with smaller components of H2 and He. High radon activity is interpreted to reflect intervals of active fluid flow through highly fractured and faulted rock. 3He/4He values in many samples were extremely air-contaminated, i.e. there was almost no excess of non-atmospheric He. The 3He/4He values measured at 236 m and 610 m, which are the only analyses with uncertainties <100%, are very similar to those measured in hot springs along the Alpine Fault, e.g. Fox River (0.64 Ra), Copland (0.42 Ra), Lower Wanganui (0.81 Ra). We will compare these data to those gathered using OLGA and discuss the

  16. In vitro predictability of drug-drug interaction likelihood of P-glycoprotein-mediated efflux of dabigatran etexilate based on [I]2/IC50 threshold.

    PubMed

    Kishimoto, Wataru; Ishiguro, Naoki; Ludwig-Schwellinger, Eva; Ebner, Thomas; Schaefer, Olaf

    2014-02-01

    Dabigatran etexilate, an oral, reversible, competitive, and direct thrombin inhibitor, is an in vitro and in vivo substrate of P-glycoprotein (P-gp). Dabigatran etexilate was proposed as an in vivo probe substrate for intestinal P-gp inhibition in recent guidance on drug-drug interactions (DDI) from the European Medicines Agency (EMA) and the Food and Drug Administration (FDA). We conducted transcellular transport studies across Caco-2 cell monolayers with dabigatran etexilate in the presence of various P-gp inhibitors to examine how well in vitro IC50 data, in combination with the mathematical equations provided by regulatory guidances, predict DDI likelihood. From a set of potential P-gp inhibitors, clarithromycin, cyclosporin A, itraconazole, ketoconazole, quinidine, and ritonavir inhibited P-gp-mediated transport of dabigatran etexilate over a concentration range that may hypothetically occur in the intestine. IC50 values of P-gp inhibitors for dabigatran etexilate transport were comparable to those of digoxin, a well-established in vitro and in vivo P-gp substrate. However, IC50 values varied depending on whether they were calculated from efflux ratios or permeability coefficients. Prediction of the DDI likelihood of P-gp inhibitors using IC50 values, the hypothetical concentration of P-gp inhibitors, and the cut-off value recommended by both the FDA and EMA was in line with the DDI occurrence in clinical studies with dabigatran etexilate. However, it has to be kept in mind that the validity of the cut-off criteria proposed by the FDA and EMA depends on the in vitro experimental systems and the IC50-calculation methods that are employed, as IC50 values are substantially influenced by these factors.

  17. Global market integration increases likelihood that a future African Green Revolution could increase crop land use and CO2 emissions.

    PubMed

    Hertel, Thomas W; Ramankutty, Navin; Baldos, Uris Lantz C

    2014-09-23

    There has been a resurgence of interest in the impacts of agricultural productivity on land use and the environment. At the center of this debate is the assertion that agricultural innovation is land sparing. However, numerous case studies and global empirical studies have found little evidence of higher yields being accompanied by reduced area. We find that these studies overlook two crucial factors: estimation of a true counterfactual scenario and a tendency to adopt a regional, rather than a global, perspective. This paper introduces a general framework for analyzing the impacts of regional and global innovation on long run crop output, prices, land rents, land use, and associated CO2 emissions. In so doing, it facilitates a reconciliation of the apparently conflicting views of the impacts of agricultural productivity growth on global land use and environmental quality. Our historical analysis demonstrates that the Green Revolution in Asia, Latin America, and the Middle East was unambiguously land and emissions sparing, compared with a counterfactual world without these innovations. In contrast, we find that the environmental impacts of a prospective African Green Revolution are potentially ambiguous. We trace these divergent outcomes to relative differences between the innovating region and the rest of the world in yields, emissions efficiencies, cropland supply response, and intensification potential. Globalization of agriculture raises the potential for adverse environmental consequences. However, if sustained for several decades, an African Green Revolution will eventually become land sparing. PMID:25201962

  19. The Arabidopsis RING E3 ubiquitin ligase AtAIRP3/LOG2 participates in positive regulation of high-salt and drought stress responses.

    PubMed

    Kim, Jong Hum; Kim, Woo Taek

    2013-07-01

    Really Interesting New Gene (RING) E3 ubiquitin ligases have been implicated in cellular responses to the stress hormone abscisic acid (ABA) as well as to environmental stresses in higher plants. Here, an ABA-insensitive RING protein3 (atairp3) loss-of-function mutant line in Arabidopsis (Arabidopsis thaliana) was isolated due to its hyposensitivity to ABA during the germination stage as compared with wild-type plants. AtAIRP3 contains a single C3HC4-type RING motif, a putative myristoylation site, and a domain associated with RING2 (DAR2). Unexpectedly, AtAIRP3 was identified as LOSS OF GDU2 (LOG2), which was recently shown to participate in an amino acid export system via interaction with GLUTAMINE DUMPER1. Thus, AtAIRP3 was renamed AtAIRP3/LOG2. Transcript levels of AtAIRP3/LOG2 were up-regulated by drought, high salinity, and ABA, suggesting a role for this factor in abiotic stress responses. The atairp3/log2-2 knockout mutant and 35S:AtAIRP3-RNAi knockdown transgenic plants displayed impaired ABA-mediated seed germination and stomatal closure. Cosuppression and complementation studies further supported a positive role for AtAIRP3/LOG2 in ABA responses. Suppression of AtAIRP3/LOG2 resulted in markedly hypersensitive phenotypes toward high salinity and water deficit relative to wild-type plants. These results suggest that the Arabidopsis RING E3 AtAIRP3/LOG2 is a positive regulator of the ABA-mediated drought and salt stress tolerance mechanism. Using yeast (Saccharomyces cerevisiae) two-hybrid, in vitro and in vivo immunoprecipitation, cell-free protein degradation, and in vitro ubiquitination assays, RESPONSIVE TO DEHYDRATION21 was identified as a substrate protein of AtAIRP3/LOG2. Collectively, our data suggest that AtAIRP3/LOG2 plays dual functions in ABA-mediated drought stress responses and in an amino acid export pathway in Arabidopsis.

  20. MetaPIGA v2.0: maximum likelihood large phylogeny estimation using the metapopulation genetic algorithm and other stochastic heuristics

    PubMed Central

    2010-01-01

    Background The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of a phylogeny inference software package is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s), but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Results Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA), together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for setting parameters, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs on 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. Conclusions The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementation of the metaGA together with additional stochastic heuristics into a single software package will allow rigorous optimization of each heuristic as well as a meaningful comparison of performances among these algorithms. MetaPIGA v2

  1. Approximate likelihood-ratio test for branches: A fast, accurate, and powerful alternative.

    PubMed

    Anisimova, Maria; Gascuel, Olivier

    2006-08-01

    We revisit statistical tests for branches of evolutionary trees reconstructed from molecular data. A new, fast, approximate likelihood-ratio test (aLRT) for branches is presented here as a competitive alternative to nonparametric bootstrap and Bayesian estimation of branch support. The aLRT is based on the idea of the conventional LRT, with the null hypothesis corresponding to the assumption that the inferred branch has length 0. We show that the LRT statistic is asymptotically distributed as a maximum of three random variables drawn from the ½χ²(0) + ½χ²(1) mixture distribution. The new aLRT of interior branch uses this distribution for significance testing, but the test statistic is approximated in a slightly conservative but practical way as 2(l1 − l2), i.e., double the difference between the maximum log-likelihood values corresponding to the best tree and the second-best topological arrangement around the branch of interest. Such a test is fast because the log-likelihood value l2 is computed by optimizing only over the branch of interest and the four adjacent branches, whereas the other parameters are fixed at their optimal values corresponding to the best ML tree. The performance of the new test was studied on simulated 4-, 12-, and 100-taxon data sets with sequences of different lengths. The aLRT is shown to be accurate, powerful, and robust to certain violations of model assumptions. The aLRT is implemented within the algorithm used by the recent fast maximum likelihood tree estimation program PHYML (Guindon and Gascuel, 2003).
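
    A short sketch of how branch support would be computed from the two optimized log-likelihoods: the statistic 2(l1 − l2) is referred to the boundary mixture above, whose upper tail for a positive statistic is half the χ² tail with one degree of freedom. The log-likelihood values are placeholders, and the correction for taking the maximum over the three topological configurations around the branch is omitted here.

    # Hedged sketch: aLRT-style branch statistic and p-value from two log-likelihoods
    # (placeholders), using the 1/2*chi2(0) + 1/2*chi2(1) boundary mixture.
    from scipy.stats import chi2

    def alrt_p_value(loglik_best, loglik_second_best):
        stat = 2.0 * (loglik_best - loglik_second_best)
        # chi2 with 0 degrees of freedom is a point mass at 0, so for stat > 0 only the
        # chi2(1) component contributes to the upper tail.
        p_value = 0.5 * chi2.sf(stat, df=1) if stat > 0 else 1.0
        return stat, p_value

    if __name__ == "__main__":
        stat, p = alrt_p_value(-12345.6, -12351.9)  # hypothetical log-likelihood values
        print(f"aLRT statistic = {stat:.2f}, p = {p:.4g}")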

  3. Revision to CDC's Zika Travel Notices: Minimal Likelihood for Mosquito-Borne Zika Virus Transmission at Elevations Above 2,000 Meters.

    PubMed

    Cetron, Martin

    2016-03-18

    Since May 2015, when Zika virus, a flavivirus transmitted primarily by Aedes aegypti mosquitoes, was reported in Brazil, the virus has rapidly spread across the Region of the Americas and the Caribbean. The association between maternal Zika virus infection and adverse fetal and reproductive outcomes, including microcephaly, prompted CDC to issue a Level 2 alert travel notice* for the 37 countries and U.S. territories (at the national and territorial level) that have reported recent Zika virus transmission as of March 11, 2016. In addition to mosquito bite precautions for all travelers, CDC advises that pregnant women postpone travel to affected countries and U.S. territories. Within a nation's borders, ecologic characteristics, which determine the distribution of mosquito vectors, can vary considerably. CDC conducted a spatial analysis, focusing on the probability of occurrence of Ae. aegypti, to support the demarcation for subnational travel alerts. Based on results of this analysis, travel that is limited to elevations higher than 2,000 m (6,562 ft) above sea level is considered to have minimal (approximately 1%) likelihood for mosquito-borne Zika virus transmission, even within countries reporting active transmission. Women who are pregnant should avoid travel to elevations <2,000 m in countries with active Zika virus transmission. PMID:26985965

  4. SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping

    NASA Technical Reports Server (NTRS)

    Cowart, Hugh S.; Scott, David W.

    2014-01-01

    A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - A log-keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access, comment and notification features similar to those found in Social Networking Systems (SNS); b) Cross-Log Communication via Social Techniques - A concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees; and c) Communications Dashboard (CommDash) - An MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012, the Cross-Log Communications idea is currently waiting for a champion to carry it forward, and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses the potential for using Cross-Log Communications in MCC-H and/or POIC environments, and considers other ways of synergizing console applications.

  5. Log-burning stove

    SciTech Connect

    Choate, J.R.

    1982-11-23

    A log-burning stove having a stove door with an angled plate element secured thereto, the top portion of the plate element deflecting combustion gases inwardly to the combustion chamber, and the lower portion deflecting draft air inwardly and downwardly into the combustion chamber, the plate element also forming a log-support and log-sliding surface.

  6. Binary Logistic Regression Analysis for Detecting Differential Item Functioning: Effectiveness of R² and Delta Log Odds Ratio Effect Size Measures

    ERIC Educational Resources Information Center

    Hidalgo, Mª Dolores; Gómez-Benito, Juana; Zumbo, Bruno D.

    2014-01-01

    The authors analyze the effectiveness of the R² and delta log odds ratio effect size measures when using logistic regression analysis to detect differential item functioning (DIF) in dichotomous items. A simulation study was carried out, and the Type I error rate and power estimates under conditions in which only statistical testing…

  7. A hybrid likelihood algorithm for risk modelling.

    PubMed

    Kellerer, A M; Kreisheimer, M; Chmelevsky, D; Barclay, D

    1995-03-01

    The risk of radiation-induced cancer is assessed through the follow-up of large cohorts, such as atomic bomb survivors or underground miners who have been occupationally exposed to radon and its decay products. The models relate to the dose, age and time dependence of the excess tumour rates, and they contain parameters that are estimated in terms of maximum likelihood computations. The computations are performed with the software package EPI-CURE, which contains the two main options of person-by-person regression or of Poisson regression with grouped data. The Poisson regression is most frequently employed, but there are certain models that require an excessive number of cells when grouped data are used. One example involves computations that account explicitly for the temporal distribution of continuous exposures, as they occur with underground miners. In past work such models had to be approximated, but it is shown here that they can be treated explicitly in a suitably reformulated person-by-person computation of the likelihood. The algorithm uses the familiar partitioning of the log-likelihood into two terms, L1 and L0. The first term, L1, represents the contribution of the 'events' (tumours). It needs to be evaluated in the usual way, but constitutes no computational problem. The second term, L0, represents the event-free periods of observation. It is, in its usual form, unmanageable for large cohorts. However, it can be reduced to a simple form, in which the number of computational steps is independent of cohort size. The method requires less computing time and computer memory, but more importantly it leads to more stable numerical results by obviating the need for grouping the data. The algorithm may be most relevant to radiation risk modelling, but it can facilitate the modelling of failure-time data in general. PMID:7604154
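
    To make the partition concrete, the display below gives the generic form it takes for an event-time (Poisson-process) log-likelihood with modelled rate λ_i(t) for subject i; this is a hedged sketch of the usual structure, not the exact expressions used in the software described above.

    % Hedged sketch of the usual event-time log-likelihood partition; lambda_i(t) is the
    % modelled rate for subject i, observed from entry_i to exit_i.
    \log L = L_1 + L_0, \qquad
    L_1 = \sum_{\text{events}\ i} \log \lambda_i(t_i), \qquad
    L_0 = -\sum_{\text{subjects}\ i} \int_{\text{entry}_i}^{\text{exit}_i} \lambda_i(t)\, dt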

  8. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).

  9. High-temperature scintillation properties of orthorhombic Gd2Si2O7 aiming at well logging

    NASA Astrophysics Data System (ADS)

    Tsubota, Youichi; Kaneko, Junichi H.; Higuchi, Mikio; Nishiyama, Shusuke; Ishibashi, Hiroyuki

    2015-06-01

    Scintillation and luminescence properties of orthorhombic Gd2Si2O7:Ce (GPS:Ce) single-crystal scintillators were investigated for temperatures ranging from room temperature (RT) to 573 K. Orthorhombic GPS crystals were grown by using a top-seeded solution growth (TSSG) method. The scintillation light yield of the orthorhombic GPS at RT was ∼2.9 times higher than that of Gd2SiO5:Ce (GSO). The light yield values of the orthorhombic GPS (Ce = 2.5%) were almost unchanged for temperatures ranging from RT to 523 K and, at 523 K, were more than twice the light yield of GSO at RT. These GPS scintillators are expected to contribute to oil exploration at greater depths.

  10. The dud-alternative effect in likelihood judgment.

    PubMed

    Windschitl, Paul D; Chambers, John R

    2004-01-01

    The judged likelihood of a focal outcome should generally decrease as the list of alternative possibilities increases. For example, the likelihood that a runner will win a race goes down when 2 new entries are added to the field. However, 6 experiments demonstrate that the presence of implausible alternatives (duds) often increases the judged likelihood of a focal outcome. This dud-alternative effect was detected for judgments involving uncertainty about trivia facts and stochastic events. Nonnumeric likelihood measures and betting measures reliably detected the effect, but numeric likelihood measures did not. Time pressure increased the magnitude of the effect. The results were consistent with a contrast-effect account: The inclusion of duds increases the perceived strength of the evidence for the focal outcome, thereby affecting its judged likelihood.

  11. On the likelihood of forests

    NASA Astrophysics Data System (ADS)

    Shang, Yilun

    2016-08-01

    How complex a network is crucially impacts its function and performance. In many modern applications, the networks involved have a growth property and sparse structures, which pose challenges to physicists and applied mathematicians. In this paper, we introduce the forest likelihood as a plausible measure to gauge how difficult it is to construct a forest in a non-preferential attachment way. Based on the notions of admittable labeling and path construction, we propose algorithms for computing the forest likelihood of a given forest. Concrete examples as well as the distributions of forest likelihoods for all forests with some fixed numbers of nodes are presented. Moreover, we illustrate the ideas on real-life networks, including a benzenoid tree, a mathematical family tree, and a peer-to-peer network.

  12. Porosity Log Prediction Using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Dwi Saputro, Oki; Lazuardi Maulana, Zulfikar; Dzar Eljabbar Latief, Fourier

    2016-08-01

    Well logging is important in oil and gas exploration. Many physical parameters of a reservoir are derived from well logging measurements. Geophysicists often use well logging to obtain reservoir properties such as porosity, water saturation and permeability. Direct measurement of these reservoir properties is often considered expensive. One way to substitute for such measurement is to predict the property using an artificial neural network. In this paper, an artificial neural network is used to predict porosity log data from other log data. Three wells from the 'yy' field are used in the prediction experiment. The log data are sonic, gamma ray, and porosity logs. One of the three wells is used as training data for the artificial neural network, which employs the Levenberg-Marquardt backpropagation algorithm. Through several trials, we found that the optimal training inputs are the sonic log and gamma ray log data with 10 hidden layers. The prediction result in well 1 has a correlation of 0.92 and a mean squared error of 5.67 x 10-4. The trained network was then applied to the other well data. The results show that the correlation in well 2 and well 3 is 0.872 and 0.9077, respectively, and the mean squared error in well 2 and well 3 is 11 x 10-4 and 9.539 x 10-4. From these results we conclude that the sonic log and gamma ray log can be a good combination for predicting porosity with a neural network.
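
    A minimal sketch of the kind of prediction described above. It assumes two input curves (sonic and gamma ray) and a porosity target; scikit-learn has no Levenberg-Marquardt solver, so L-BFGS is substituted here, and the array contents and hidden-layer size are illustrative:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        # Placeholder arrays standing in for the training well's log curves.
        X_train = np.random.rand(500, 2)      # columns: [sonic, gamma ray]
        y_train = np.random.rand(500)         # measured porosity log

        scaler = StandardScaler().fit(X_train)
        net = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                           max_iter=2000, random_state=0)
        net.fit(scaler.transform(X_train), y_train)

        # Apply the trained network to another well's sonic and gamma ray logs.
        X_other_well = np.random.rand(300, 2)
        porosity_pred = net.predict(scaler.transform(X_other_well))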

  13. Measurement of the top quark mass with the dynamical likelihood method using lepton plus jets events with b-tags in p anti-p collisions at s**(1/2) = 1.96-TeV

    SciTech Connect

    Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, Anthony A.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan, Inst. Phys. /Argonne /Barcelona, IFAE /Baylor U. /INFN, Bologna /Bologna U. /Brandeis U. /UC, Davis /UCLA /UC, San Diego /UC, Santa Barbara

    2005-12-01

    This report describes a measurement of the top quark mass, M{sub top}, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top (t{bar t}) pairs in p{bar p} collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb{sup -1}. They use the t{bar t} candidates in the ''lepton+jets'' decay channel, requiring at least one jet identified as a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M{sub top} per unit phase space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the event likelihoods are multiplied together to derive the top quark mass by the maximum likelihood method. Using 63 t{bar t} candidates observed in the data, with 9.2 events expected from background, they measure the top quark mass to be 173.2{sub -2.4}{sup +2.6}(stat.) {+-} 3.2(syst.) GeV/c{sup 2}, or 173.2{sub -4.0}{sup +4.1} GeV/c{sup 2}.
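
    The event-by-event combination amounts to summing per-event log-likelihood curves over a grid of top-mass hypotheses and taking the maximizer. A hedged sketch of that generic step (not the CDF DLM code; the Gaussian toy likelihoods are purely illustrative):

        import numpy as np

        def combine_event_likelihoods(mass_grid, per_event_likelihoods):
            # per_event_likelihoods: (n_events, n_masses) array, each row one event's
            # likelihood evaluated on mass_grid.  Multiplying events = summing logs.
            log_L = np.sum(np.log(per_event_likelihoods), axis=0)
            return mass_grid[np.argmax(log_L)]

        grid = np.linspace(150.0, 200.0, 501)
        toy = np.exp(-0.5 * ((grid - 173.0) / 12.0) ** 2)[None, :].repeat(63, axis=0)
        print(combine_event_likelihoods(grid, toy))   # ~173 for the toy input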

  14. Selective Logging, Fire, and Biomass in Amazonia

    NASA Technical Reports Server (NTRS)

    Houghton, R. A.

    1999-01-01

    Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.

  15. Well log and 2D seismic data character of the Wilcox Group in south-central Louisiana

    USGS Publications Warehouse

    Enomoto, Catherine B.

    2014-01-01

    The Wilcox Group is productive in updip areas of Texas and Louisiana from fluvial, deltaic, and near-shore marine shelf sandstones. The reported presence of porous sandstones at 29,000 feet within the Wilcox Group containing about 200 feet of gas in the Davy Jones 1 discovery well in the offshore Louisiana South Marsh Island area illustrates a sand-rich system developed during the Paleocene and early Eocene. This study describes some of the well log and reflection seismic data characteristics of the slope and basin-floor reservoirs with gas-discovery potential that may be in the area between the producing trend onshore Louisiana and the offshore discovery.

  16. Log Truck-Weighing System

    NASA Technical Reports Server (NTRS)

    1977-01-01

    ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner. ELDEC then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000 but it pays for itself in a single year. It allows operators to use a truck's hauling capacity more efficiently since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.

  17. Approximate maximum likelihood estimation of scanning observer templates

    NASA Astrophysics Data System (ADS)

    Abbey, Craig K.; Samuelson, Frank W.; Wunderlich, Adam; Popescu, Lucretiu M.; Eckstein, Miguel P.; Boone, John M.

    2015-03-01

    In localization tasks, an observer is asked to give the location of some target or feature of interest in an image. Scanning linear observer models incorporate the search implicit in this task through convolution of an observer template with the image being evaluated. Such models are becoming increasingly popular as predictors of human performance for validating medical imaging methodology. In addition to convolution, scanning models may utilize internal noise components to model inconsistencies in human observer responses. In this work, we build a probabilistic mathematical model of this process and show how it can, in principle, be used to obtain estimates of the observer template using maximum likelihood methods. The main difficulty of this approach is that a closed-form probability distribution for a maximal location response is not generally available in the presence of internal noise. However, for a given image we can generate an empirical distribution of maximal locations using Monte Carlo sampling. We show that this probability is well approximated by applying an exponential function to the scanning template output. We also evaluate log-likelihood functions on the basis of this approximate distribution. Using 1,000 trials of simulated data as a validation test set, we find that a plot of the approximate log-likelihood function along a single parameter related to the template profile achieves its maximum value near the true value used in the simulation. This finding holds regardless of whether the trials are correctly localized or not. In a second validation study evaluating a parameter related to the relative magnitude of internal noise, only the incorrectly localized images produce a maximum in the approximate log-likelihood function that is near the true value of the parameter.
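
    A hedged sketch of the approximation described above: the probability that the maximal response falls at a given pixel is taken proportional to an exponential of the scanning-template output there. The array names, sizes and the fit parameter a are illustrative assumptions:

        import numpy as np
        from scipy.signal import fftconvolve

        def approx_loglike(image, template, response_xy, a):
            # Scanning output: cross-correlation of the template with the image.
            score = fftconvolve(image, template[::-1, ::-1], mode="same")
            # Approximate location probability: exp(a * score), normalized over all pixels.
            logp = a * score - np.log(np.sum(np.exp(a * score)))
            return logp[response_xy]

        y, x = np.mgrid[0:64, 0:64]
        template = np.exp(-((x[:15, :15] - 7) ** 2 + (y[:15, :15] - 7) ** 2) / 8.0)
        image = 0.1 * np.random.randn(64, 64)
        image[30:45, 30:45] += 0.5 * template
        print(approx_loglike(image, template, (37, 37), a=4.0))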

  18. The Dud-Alternative Effect in Likelihood Judgment

    ERIC Educational Resources Information Center

    Windschitl, Paul D.; Chambers, John R.

    2004-01-01

    The judged likelihood of a focal outcome should generally decrease as the list of alternative possibilities increases. For example, the likelihood that a runner will win a race goes down when 2 new entries are added to the field. However, 6 experiments demonstrate that the presence of implausible alternatives (duds) often increases the judged…

  19. A Simple 2D Non-Parametric Resampling Statistical Approach to Assess Confidence in Species Identification in DNA Barcoding—An Alternative to Likelihood and Bayesian Approaches

    PubMed Central

    Jin, Qian; He, Li-Jun; Zhang, Ai-Bing

    2012-01-01

    In the recent worldwide campaign for a global biodiversity inventory via DNA barcoding, a simple and easily used measure of confidence for assigning sequences to species in DNA barcoding has not been established so far, although the likelihood ratio test and the Bayesian approach have been proposed to address this issue from a statistical point of view. The TDR (Two Dimensional non-parametric Resampling) measure newly proposed in this study offers users a simple and easy approach to evaluate the confidence of species membership in DNA barcoding projects. We assessed the validity and robustness of the TDR approach using datasets simulated under coalescent models as well as an empirical dataset, and found that the TDR measure is very robust in assessing species membership in DNA barcoding. In contrast to the likelihood ratio test and the Bayesian approach, the TDR method stands out due to its simplicity in both concept and calculation, with little in the way of restrictive population genetic assumptions. To implement this approach we have developed a computer program package (TDR1.0beta) freely available from ftp://202.204.209.200/education/video/TDR1.0beta.rar. PMID:23239988

  20. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions for the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding special records existing in the data base; and (4) ARCHIVE - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.

  1. Arabidopsis lonely guy (LOG) multiple mutants reveal a central role of the LOG-dependent pathway in cytokinin activation.

    PubMed

    Tokunaga, Hiroki; Kojima, Mikiko; Kuroha, Takeshi; Ishida, Takashi; Sugimoto, Keiko; Kiba, Takatoshi; Sakakibara, Hitoshi

    2012-01-01

    Cytokinins are phytohormones that play key roles in the maintenance of stem cell activity in plants. Although alternative single-step and two-step activation pathways for cytokinin have been proposed, the significance of the single-step pathway, which is catalyzed by LONELY GUY (LOG), is not fully understood. We analyzed the metabolic flow of cytokinin activation in Arabidopsis log multiple mutants using stable isotope-labeled tracers and characterized the mutants' morphological and developmental phenotypes. In tracer experiments, cytokinin activation was inhibited most pronouncedly by log7, while the other log mutations had cumulative effects. Although sextuple or lower-order mutants did not show drastic phenotypes in vegetative growth, the log1log2log3log4log5log7log8 septuple T-DNA insertion mutant, in which the LOG-dependent pathway is impaired, displayed severe retardation of shoot and root growth with defects in the maintenance of the apical meristems. Detailed observation of the mutants showed that LOG7 was required for the maintenance of shoot apical meristem size. LOG7 was also suggested to play a role in normal primary root growth together with LOG3 and LOG4. These results suggest a dominant role of the single-step activation pathway mediated by LOGs in cytokinin production, and overlapping but differentiated functions of the members of the LOG gene family in growth and development.

  2. Assessing the quality of proton PBS treatment delivery using machine log files: comprehensive analysis of clinical treatments delivered at PSI Gantry 2.

    PubMed

    Scandurra, D; Albertini, F; van der Meer, R; Meier, G; Weber, D C; Bolsi, A; Lomax, A

    2016-02-01

    Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within ±1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field.
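
    A minimal sketch of the per-field quality metric described above (percentage of voxels whose reconstructed dose lies within ±1% of the nominal planned dose); the array names and the exclusion of zero-dose voxels are illustrative assumptions:

        import numpy as np

        def dose_pass_rate(log_file_dose, nominal_dose, tolerance=0.01):
            nominal = np.asarray(nominal_dose, dtype=float)
            recon = np.asarray(log_file_dose, dtype=float)
            mask = nominal > 0                              # ignore voxels with no planned dose
            rel_diff = np.abs(recon[mask] - nominal[mask]) / nominal[mask]
            return 100.0 * np.mean(rel_diff <= tolerance)   # pass rate in percent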

  3. Assessing the quality of proton PBS treatment delivery using machine log files: comprehensive analysis of clinical treatments delivered at PSI Gantry 2

    NASA Astrophysics Data System (ADS)

    Scandurra, D.; Albertini, F.; van der Meer, R.; Meier, G.; Weber, D. C.; Bolsi, A.; Lomax, A.

    2016-02-01

    Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within ±1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field.

  4. Assessing the quality of proton PBS treatment delivery using machine log files: comprehensive analysis of clinical treatments delivered at PSI Gantry 2.

    PubMed

    Scandurra, D; Albertini, F; van der Meer, R; Meier, G; Weber, D C; Bolsi, A; Lomax, A

    2016-02-01

    Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within ±1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field. PMID:26767316

  5. The Alpine Fault Hanging Wall Viewed from Within: Structural and Lithological Analysis of Acoustic Televiewer Logs in the DFDP-2B Borehole, New Zealand

    NASA Astrophysics Data System (ADS)

    Massiot, C.; Doan, M. L.; Celerier, B. P.; Mcnamara, D. D.; Little, T. A.; Schmitt, D. R.; Pezard, P. A.; Remaud, L.; Townend, J.; Sutherland, R.; Toy, V.

    2015-12-01

    The Alpine Fault is the major dextral transpressive fault forming the Australia-Pacific plate boundary in the South Island of New Zealand. The DFDP-2B borehole, drilled in late 2014, reached a depth of 893 m (measured depth) and provides an opportunity to describe the protolith and structure of ductilely deformed schists in the hangingwall of this active fault. A total of 19 km of wireline logging data were acquired in the borehole, including 4.8 km of borehole televiewer (BHTV) acoustic image logs. The high resolution of the BHTV logs and the borehole's inclination, which reached a maximum of 46° from vertical near the base of the hole, provide good conditions in which to intersect the predominant foliation, which dips SE at a high angle to the hole, as well as other feature sets. Here, we present a detailed structural analysis and lithological interpretation of the BHTV logs from the 264-886 m interval. The borehole's inclination and the attenuation caused by the drilling mud reduce the image quality in places. However, some zones of poor image quality also coincide with indications of permeable fractures and are related to fractured zones with localized borehole wall damage. Foliation can be regularly identified to a depth of 808 m (n=406). The mean orientation of foliation planes (dip of 57° towards 145°) remains quite consistent throughout the drilled interval and is similar to that inferred for the structurally underlying Alpine Fault. Larger-scale changes in image amplitude suggest lithological or petrophysical variations not yet identified in cuttings. The majority of the 1650 planar features identified as fractures are subparallel to the foliation and may represent deformed veins or other mineral segregations that contribute to the foliation's internal structure. Other planar features have dominantly NW-SE strikes, approximately orthogonal to the foliation, and are inferred to be fractures. Truncations or offsets are observed on ~20 features suggesting

  6. CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS

    EPA Science Inventory

    Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...

  7. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    47 CFR Part 73, Rules Applicable to All Broadcast Stations, § 73.1840 Retention of logs: (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...

  8. Postfire logging in riparian areas.

    PubMed

    Reeves, Gordon H; Bisson, Peter A; Rieman, Bruce E; Benda, Lee E

    2006-08-01

    We reviewed the behavior of wildfire in riparian zones, primarily in the western United States, and the potential ecological consequences of postfire logging. Fire behavior in riparian zones is complex, but many aquatic and riparian organisms exhibit a suite of adaptations that allow relatively rapid recovery after fire. Unless constrained by other factors, fish tend to rebound relatively quickly, usually within a decade after a wildfire. Additionally, fire and subsequent erosion events contribute wood and coarse sediment that can create and maintain productive aquatic habitats over time. The potential effects of postfire logging in riparian areas depend on the landscape context and disturbance history of a site; however, available evidence suggests two key management implications: (1) fire in riparian areas creates conditions that may not require intervention to sustain the long-term productivity of the aquatic network and (2) protection of burned riparian areas gives priority to what is left rather than what is removed. Research is needed to determine how postfire logging in riparian areas has affected the spread of invasive species and the vulnerability of upland forests to insect and disease outbreaks, and how postfire logging will affect the frequency and behavior of future fires. The effectiveness of using postfire logging to restore desired riparian structure and function is therefore unproven, but such projects are gaining interest with the departure of forest conditions from those that existed prior to timber harvest, fire suppression, and climate change. In the absence of reliable information about the potential consequences of postfire timber harvest, we conclude that providing postfire riparian zones with the same environmental protections they received before they burned is justified ecologically. Without a commitment to monitor management experiments, the effects of postfire riparian logging will remain unknown and highly contentious.

  9. Digital Photography as an Educational Food Logging Tool in Obese Patients with Type 2 Diabetes: Lessons Learned from A Randomized, Crossover Pilot Trial

    PubMed Central

    Ehrmann, Brett J.; Anderson, Robert M.; Piatt, Gretchen A.; Funnell, Martha M.; Rashid, Hira; Shedden, Kerby; Douyon, Liselle

    2014-01-01

    Purpose The purpose of this pilot study is to investigate the utility of, and areas of refinement for, digital photography as an educational tool for food logging in obese patients with type 2 diabetes (T2DM). Methods Thirty-three patients aged 18-70 with T2DM, BMI at least 30 kg/m2, and A1C 7.5-9% were recruited from an endocrinology clinic and randomized to a week of food logging using a digital camera (DC) or paper diary (PD), crossing over for week two. Patients then viewed a presentation about dietary effects on blood glucose, using patient DC and blood glucose entries. Outcomes of adherence (based on number of weekly entries), changes in mean blood glucose and frequency of blood glucose checks, and patient satisfaction were compared between methods. Patient feedback on the DC intervention and presentation was also analyzed. Results Thirty patients completed the study. Adherence was identical across methods. The mean difference in number of entries was not significant between methods. This difference increased and neared statistical significance (favoring DC) among patients who were adherent for at least one week (21 entries, with 2 entries per day for 5 of 7 days, n=25). Mean blood glucose did not significantly decrease in either method. Patient satisfaction was similar between interventions. Feedback indicated concerns over photograph accuracy, forgetting to use the cameras, and embarrassment using them in public. Conclusion Though comparable to PD in adherence, blood glucose changes, and patient satisfaction in this pilot trial, patient feedback suggested specific areas of refinement to maximize utility of DC-based food logging as an educational tool in T2DM. PMID:24168836

  10. Oracle Log Buffer Queueing

    SciTech Connect

    Rivenes, A S

    2004-12-08

    The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked, can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document details the maximum observed throughputs using the same test suite that was used for the Sun tests, along with a detailed description of the testing performed and an analysis of the bottlenecks encountered. Issues related to Oracle and Linux are also detailed and some recommendations based on the findings are made.

  11. 6. Log calving barn. Interior view showing log postandbeam support ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

  12. EE-3A Logging Report

    SciTech Connect

    Anderson, David W.

    1993-12-15

    Two logs of EE-3A were performed during the last couple of weeks. The first, a Temperature/Casing-Collar Locator (CCL) log, took place on Friday, December 10th, 1993. The second, a caliper log, was done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13th, 1993.

  13. The atherogenic dyslipidemia ratio [log(TG)/HDL-C] is associated with residual vascular risk, beta-cell function loss and microangiopathy in type 2 diabetes females

    PubMed Central

    2012-01-01

    Background Atherogenic dyslipidemia (AD), defined as low HDL-C plus elevated triglycerides (TG), comorbid to T2DM, increases cardiometabolic risk for CAD even when LDL-C is at target. In T2DM males, AD was shown to correlate with β-cell function loss, yet it is not established whether this applies across gender. Aim To establish the prevalence and severity of AD in T2DM females, and to determine how it relates to cardiometabolic phenotype, glucose homeostasis, micro- and macrovascular complications, and 10-year absolute CV risk (UKPDS Risk Engine). Methods 340 T2DM females were ranked according to quintiles (Q) of the continuous variable log(TG)/HDL-C, with AD prevalence defined as HDL-C <50 mg.dL-1 plus TG ≥150 mg.dL-1, and β-cell function assessed with HOMA. Results AD prevalence was 35%; mean HDL-C and TG were 52 (15) and 160 (105) mg.dL-1. AD was significantly related to central fat, metabolic syndrome, sedentarity and skeletal sarcopenia, as well as to hsCRP, fibrinogen, uric acid, cystatin-C, Big ET-1, and 10-year UKPDS CV risk. AD correlated stepwise with lower β-cell function and hyperbolic product, and with accelerated loss of residual insulin secretion, higher HbA1c and prevalent microangiopathy. Conclusions log(TG)/HDL-C is a simple means to grade AD and residual macrovascular risk in T2DM females. This ratio associates with major non-LDL cardiometabolic variables and ranks predicted CAD risk. In addition, log(TG)/HDL-C identifies worsening glucose homeostasis, poorer glycemic control, and prevalent microangiopathy. PMID:23046637
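
    A minimal sketch of grading patients by the ratio used above, assuming a base-10 logarithm, mg/dL units, illustrative column names and the female AD cut-offs quoted in the abstract:

        import numpy as np
        import pandas as pd

        df = pd.DataFrame({"tg": [90, 160, 210, 320, 140],      # triglycerides, mg/dL
                           "hdl": [62, 48, 41, 35, 55]})        # HDL-C, mg/dL
        df["ratio"] = np.log10(df["tg"]) / df["hdl"]
        df["quintile"] = pd.qcut(df["ratio"], 5, labels=[1, 2, 3, 4, 5])
        df["AD"] = (df["hdl"] < 50) & (df["tg"] >= 150)          # HDL-C <50 plus TG >=150
        print(df)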

  14. log(TG)/HDL-C is related to both residual cardiometabolic risk and β-cell function loss in type 2 diabetes males

    PubMed Central

    2010-01-01

    Background T2DM is associated with atherogenic dyslipidemia (AD), defined as decreased HDL-C plus raised triglycerides (TG). AD confers increased risk for CAD, even when LDL-C is at target. AD is rarely assessed due to lack of screening methods consensus. Aim To establish the prevalence and severity of AD from log(TG)/HDL-C in T2DM males, and to determine how it relates to cardiometabolic phenotype, glucose homeostasis, micro- and macrovascular complications, and 10-year UKPDS CV risk. Methods 585 T2DM males divided according to quintiles (Q) of log(TG)/HDL-C. AD prevalence defined as HDL-C <40 mg.dL-1 plus TG ≥150 mg.dL-1. β-cell function assessed with HOMA. Results Mean HDL-C and TG were 44 (13) and 204 (155) mg.dL-1. AD prevalence was 35%. AD correlated with lower β-cell function, with accelerated loss of insulin secretion, and with poorer HbA1c levels. AD was related to a high prevalence of CAD, and also to 10-year absolute CAD risk. Conclusions log(TG)/HDL-C is a simple means to estimate AD and the residual CV risk it confers in T2DM. AD closely associates with major cardiometabolic and glucose homeostasis determinants and poorer metabolic control. The ratio also relates to macroangiopathy prevalence and ranks future CAD risk, and is well-suited to capture non-LDL-related macrovascular residual risk and major glycemic determinants. PMID:21156040

  15. Experimental study of main rotor tip geometry and tail rotor interactions in hover. Volume 2: Run log and tabulated data

    NASA Technical Reports Server (NTRS)

    Balch, D. T.; Lombardi, J.

    1985-01-01

    A model scale hover test was conducted in the Sikorsky Aircraft Model Rotor Hover Facility to identify and quantify the impact of the tail rotor on the demonstrated advantages of advanced geometry tip configurations. The existence of mutual interference between a hovering main rotor and a tail rotor was acknowledged in the test. The test was conducted using the Basic Model Test Rig and two scaled main rotor systems, one representing a 1/5.727 scale UH-60A BLACK HAWK and the other a 1/4.71 scale S-76. Eight alternate rotor tip configurations were tested, 3 on the BLACK HAWK rotor and 6 on the S-76 rotor. Four of these tips were then selected for testing in close proximity to an operating tail rotor (operating in both tractor and pusher modes) to determine if the performance advantages that could be obtained from the use of advanced geometry tips in a main rotor only environment would still exist in the more complex flow field involving a tail rotor. This volume contains the test run log and tabulated data.

  16. Acoustic paramagnetic logging tool

    DOEpatents

    Vail, III, William B.

    1988-01-01

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which in general makes it possible to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.

  17. Likelihood estimation in image warping

    NASA Astrophysics Data System (ADS)

    Machado, Alexei M. C.; Campos, Mario F.; Gee, James C.

    1999-05-01

    The problem of matching two images can be posed as the search for a displacement field which assigns each point of one image to a point in the second image in such a way that a likelihood function is maximized, subject to topological constraints. Since the images may be acquired by different scanners, the relationship between their intensity levels is generally unknown. The matching problem is usually solved iteratively by optimization methods. The evaluation of each candidate solution is based on an objective function which favors smooth displacements that yield likely intensity matches. This paper is concerned with the construction of a likelihood function that is derived from the information contained in the data and is thus applicable to data acquired from an arbitrary scanner. The basic assumption of the method is that the pair of images to be matched contains roughly the same proportion of tissues, which will be reflected in their gray-level histograms. Experiments with MRI images corrupted with strong non-linear intensity shading show the method's effectiveness for modeling intensity artifacts. Image matching can thus be made robust to a wide range of intensity degradations.
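
    A hedged sketch of one way to build such a data-driven intensity likelihood: use the joint grey-level histogram of the two images as an empirical estimate of the probability of intensity pairings, so that displacements producing consistent pairings score higher. This illustrates the general idea rather than the authors' exact objective:

        import numpy as np

        def histogram_log_likelihood(fixed, warped, bins=64):
            # Joint grey-level histogram of corresponding pixels.
            joint, _, _ = np.histogram2d(fixed.ravel(), warped.ravel(), bins=bins)
            p = (joint + 1e-6) / (joint + 1e-6).sum()   # smoothed joint probability
            # Sum over pixels of the log-probability of their intensity pairing.
            return float(np.sum(joint * np.log(p)))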

  18. Chernoff's density is log-concave.

    PubMed

    Balabdaoui, Fadoua; Wellner, Jon A

    2014-02-01

    We show that the density of Z = argmax_t {W(t) - t^2}, sometimes known as Chernoff's density, is log-concave. We conjecture that Chernoff's density is strongly log-concave or "super-Gaussian", and provide evidence in support of the conjecture.

  19. NMR logging apparatus

    DOEpatents

    Walsh, David O; Turner, Peter

    2014-05-27

    Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.

  20. 4. Log chicken house (far left foreground), log bunkhouse (far ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Log chicken house (far left foreground), log bunkhouse (far left background), one-room log cabin (left of center background), log root cellar (center), post-and-beam center in foreground, and blacksmith shop (far right foreground). View to southeast. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  1. Structural basis for cytokinin production by LOG from Corynebacterium glutamicum.

    PubMed

    Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin

    2016-08-10

    "Lonely guy" (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a "PGGXGTXXE" motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms.

  2. Structural basis for cytokinin production by LOG from Corynebacterium glutamicum

    PubMed Central

    Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin

    2016-01-01

    “Lonely guy” (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a “PGGXGTXXE” motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms. PMID:27507425

  3. Interactive Reflective Logs

    ERIC Educational Resources Information Center

    Deaton, Cynthia Minchew; Deaton, Benjamin E.; Leland, Katina

    2010-01-01

    The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…

  4. Logs Perl Module

    2007-04-04

    A perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.

  5. Logging on to Learn

    ERIC Educational Resources Information Center

    Butler, Kevin

    2010-01-01

    A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…

  6. Log of Apollo 11.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

  7. Coal-log pipeline system development

    SciTech Connect

    Liu, H.

    1991-12-01

    Project tasks include: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, the grantee shall perform an information search and make a full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal logs, and test the logs produced. (3) Conduct an economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. (VC)

  8. Efficient computations with the likelihood ratio distribution.

    PubMed

    Kruijver, Maarten

    2015-01-01

    What is the probability that the likelihood ratio exceeds a threshold t, if a specified hypothesis is true? This question is asked, for instance, when performing power calculations for kinship testing, when computing true and false positive rates for familial searching and when computing the power of discrimination of a complex mixture. Answering this question is not straightforward, since there is a huge number of possible genotypic combinations to consider. Different solutions are found in the literature. Several authors estimate the threshold exceedance probability using simulation. Corradi and Ricciardi [1] propose a discrete approximation to the likelihood ratio distribution which yields a lower and upper bound on the probability. Nothnagel et al. [2] use the normal distribution as an approximation to the likelihood ratio distribution. Dørum et al. [3] introduce an algorithm that can be used for exact computation, but this algorithm is computationally intensive unless the threshold t is very large. We present three new approaches to the problem. Firstly, we show how importance sampling can be used to make the simulation approach significantly more efficient. Importance sampling is a statistical technique that turns out to work well in the current context. Secondly, we present a novel algorithm for computing exceedance probabilities. The algorithm is exact, fast and can handle relatively large problems. Thirdly, we introduce an approach that combines the novel algorithm with the discrete approximation of Corradi and Ricciardi. This last approach can be applied to very large problems and yields a lower and upper bound on the exceedance probability. The use of the different approaches is illustrated with examples from forensic genetics, such as kinship testing, familial searching and mixture interpretation. The algorithms are implemented in an R-package called DNAprofiles, which is freely available from CRAN.
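
    A minimal sketch of one standard form of the importance-sampling idea: to estimate P(LR > t | H2), draw genotype data under H1 and weight each draw by 1/LR, since P_H2(x) = P_H1(x)/LR(x). The callables and their names are illustrative, not the DNAprofiles API:

        import numpy as np

        def exceedance_prob(sample_under_h1, likelihood_ratio, t, n=100_000, seed=0):
            rng = np.random.default_rng(seed)
            lrs = np.array([likelihood_ratio(sample_under_h1(rng)) for _ in range(n)])
            # E_H1[(1/LR) * 1{LR > t}] = P_H2(LR > t); weighting by 1/LR is the change of measure.
            return float(np.mean((1.0 / lrs) * (lrs > t)))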

  9. Maximum likelihood density modification by pattern recognition of structural motifs

    DOEpatents

    Terwilliger, Thomas C.

    2004-04-13

    An electron density for a crystallographic structure having protein regions and solvent regions is improved by maximizing the log-likelihood of a set of structure factors {F_h} using a local log-likelihood function of the form ln[p(ρ(x)|PROT)p_PROT(x) + p(ρ(x)|SOLV)p_SOLV(x) + p(ρ(x)|H)p_H(x)], where p_PROT(x) is the probability that x is in the protein region, p(ρ(x)|PROT) is the conditional probability for ρ(x) given that x is in the protein region, and p_SOLV(x) and p(ρ(x)|SOLV) are the corresponding quantities for the solvent region; p_H(x) refers to the probability that there is a structural motif at a known location, with a known orientation, in the vicinity of the point x, and p(ρ(x)|H) is the probability distribution for the electron density at this point given that the structural motif actually is present. One appropriate structural motif is a helical structure within the crystallographic structure.

  10. Maximum Likelihood Analysis in the PEN Experiment

    NASA Astrophysics Data System (ADS)

    Lehman, Martin

    2013-10-01

    The experimental determination of the π+ --> e+ ν (γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world average experimental precision of 3.3 ×10^-3 to 5 ×10^-4 using a stopped beam approach. During runs in 2008-10, PEN acquired over 2 ×10^7 π_e2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π+ --> e+ ν, π+ --> μ+ ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc.). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.

  11. A forward modeling approach for interpreting impeller flow logs.

    PubMed

    Parker, Alison H; West, L Jared; Odling, Noelle E; Bown, Richard T

    2010-01-01

    A rigorous and practical approach for interpretation of impeller flow log data to determine vertical variations in hydraulic conductivity is presented and applied to two well logs from a Chalk aquifer in England. Impeller flow logging involves measuring vertical flow speed in a pumped well and using changes in flow with depth to infer the locations and magnitudes of inflows into the well. However, the measured flow logs are typically noisy, which leads to spurious hydraulic conductivity values where simplistic interpretation approaches are applied. In this study, a new method for interpretation is presented, which first defines a series of physical models for hydraulic conductivity variation with depth and then fits the models to the data, using a regression technique. Some of the models will be rejected as they are physically unrealistic. The best model is then selected from the remaining models using a maximum likelihood approach. This balances model complexity against fit, for example, using Akaike's Information Criterion.
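
    A hedged sketch of the selection step described above, assuming each physically admissible candidate model exposes a fit routine that returns its predicted flow profile and parameter count (the interface and names are illustrative):

        import numpy as np

        def aic(rss, n_obs, n_params):
            # AIC for a least-squares fit under a Gaussian error assumption (up to a constant).
            return n_obs * np.log(rss / n_obs) + 2 * n_params

        def select_flow_model(depths, flow_speed, candidate_models):
            scores = {}
            for name, fit in candidate_models:          # candidate_models: [(name, fit_fn), ...]
                predicted, n_params = fit(depths, flow_speed)
                rss = float(np.sum((flow_speed - predicted) ** 2))
                scores[name] = aic(rss, len(flow_speed), n_params)
            return min(scores, key=scores.get), scores  # best model and all AIC scores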

  12. Energy saving and endurance log for a log building

    SciTech Connect

    Dolata, G.

    1987-03-17

    A log is described for construction of a log building which comprises: an elongated peeled log of substantially uniform diameter along its length with parallel end faces, a bottom surface of the log having a concave surface configuration centered on a diametrical line of the log, a rounded top surface directly opposite from the concave bottom surface which mates with a concave surface of another log when placed upon the rounded top surface, a vertically extending longitudinal slot in the top surface of the log that extends the length of the log, a vertically extending longitudinal slot along at least one side of the log with the slot extending vertically substantially parallel with the diametrical line with the slot being formed outwardly of the concave surface, the log including at least one butt end, the butt end including an end slot along the diametrical line which extends from a top of the log down through the butt end to the concave surface; and the butt includes at least one short, longitudinally extending arcuate groove near an outer surface of the log which extends from a line juxtaposed the end slot down to at least one longitudinal slot in the log.

  13. Behaviours of log phase cultures of eight strains of Escherichia coli incubated at temperatures of 2, 6, 8 and 10 degrees C.

    PubMed

    Gill, C O; Badoni, M; Jones, T H

    2007-11-01

    The behaviours of cold-adapted, log-phase cultures of eight strains of Escherichia coli incubated at 2, 6, 8 and 10 degrees C for 10 days were examined by determining absorbance at 600 nm (A(600)), viable counts and cell size distribution as indicated by forward angle light scattering (FALS) values, obtained for samples collected each day from each culture. Cell lengths were determined from photomicrographs of samples for which the flow cytometry data indicated the mean cell lengths were maximal or minimal for each culture. At 2 degrees C, A(600) values for all strains and viable counts for some changed little, while viable counts for other strains declined progressively by >1 log unit. At 6 degrees C, A(600) values for most strains increased at progressively declining rates and then remained constant while viable counts increased to reach maximum values before maximum A(600) values were attained, and then declined. At 8 degrees C, the behaviours of most strains were similar to the behaviour at 6 degrees C. At 10 degrees C, seven of the strains grew exponentially, but for most of these the growth rate determined from A(600) values differed from that determined from viable count data. Mean FALS values for cultures incubated at 6, 8, or 10 degrees C showed various patterns of increase and decrease, indicating fluctuations in cell lengths. For all strains, the minimum cell length was <3 microm, but the maximum cell lengths ranged from <20 to >140 microm. The findings suggest that the formation of elongated cells or filaments is usual behaviour for E. coli growing at temperatures approaching or below the minimum for sustained growth.

  14. 3D Reservoir Modeling of Semutang Gas Field: A lonely Gas field in Chittagong-Tripura Fold Belt, with Integrated Well Log, 2D Seismic Reflectivity and Attributes.

    NASA Astrophysics Data System (ADS)

    Salehin, Z.; Woobaidullah, A. S. M.; Snigdha, S. S.

    2015-12-01

    The Bengal Basin, with its prolific gas-rich province, provides much-needed energy to Bangladesh, and the present energy situation demands more hydrocarbon exploration. Only 'Semutang' has been discovered in the high-amplitude structures, while the rest lie in the gentle to moderate structures of the western part of the Chittagong-Tripura Fold Belt. However, it has some major thrust faults that have strongly breached the reservoir zone. The major objectives of this research are to interpret gas horizons and faults and then to perform velocity, structural and property modeling to obtain reservoir properties. Proper identification of faults and reservoir heterogeneities is needed. 3D modeling is widely used to reveal subsurface structure in faulted zones where planning and development drilling are a major challenge. Thirteen 2D seismic lines and six well logs have been used to identify six gas-bearing horizons and a network of faults and to map the structure at reservoir level. Variance attributes were used to identify faults. A velocity model was built for domain conversion. Synthetics were prepared from two wells where sonic and density logs are available. The well-to-seismic tie at the reservoir zone shows a good match with the direct hydrocarbon indicator on the seismic section. Vsh, porosity, water saturation and permeability have been calculated, and various cross plots among the porosity logs have been shown. Structural modeling was used to define zones and layering in accordance with the minimum sand thickness. The fault model shows the possible fault network, which is likely responsible for several dry wells. Facies models have been constrained with the Sequential Indicator Simulation method to show the facies distribution along the depth surfaces. Petrophysical models have been prepared with Sequential Gaussian Simulation to estimate petrophysical parameters away from the existing wells to other parts of the field and to observe heterogeneities in the reservoir. Average porosity maps for each gas zone were constructed. The outcomes of the research

  15. 78 FR 69541 - Labor Certification Process for Logging Employment and Non-H-2A Agricultural Employment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... rulemaking (NPRM). 74 FR 45906 (Sept. 4, 2009). After considering comments from the public on the subject... Agricultural Employment of H-2A Aliens in the United States, 75 FR 6884 (Feb. 12, 2010). The effect of..., Health professions, Immigration, Labor, Longshore and harbor work, Migrant workers, Nonimmigrant...

  16. Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation

    NASA Astrophysics Data System (ADS)

    Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.

    2015-11-01

    We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
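
    A minimal numerical sketch of the kind of linear compression described above, using synthetic covariances rather than the WMAP products: the signal-to-noise eigenmodes come from a generalized eigenproblem, and the Gaussian log-likelihood is then evaluated in the reduced mode space. All matrix sizes, thresholds and data below are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): compress a Gaussian data vector onto
# its highest signal-to-noise eigenmodes and evaluate the likelihood there.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
npix = 200                                   # stand-in for the number of unmasked pixels
A = rng.standard_normal((npix, npix))
S = A @ A.T / npix                           # toy signal covariance
N = np.diag(rng.uniform(0.5, 1.5, npix))     # toy (diagonal) noise covariance

# Generalized eigenproblem S v = lambda N v gives signal-to-noise eigenmodes.
evals, evecs = eigh(S, N)                    # eigenvalues in ascending order
V = evecs[:, evals > 0.1]                    # keep modes with appreciable S/N

def gaussian_loglike(data, V, S, N):
    """Gaussian log-likelihood evaluated in the compressed mode space."""
    x = V.T @ data                           # compressed data vector
    C = V.T @ (S + N) @ V                    # compressed covariance
    _, logdet = np.linalg.slogdet(C)
    chi2 = x @ np.linalg.solve(C, x)
    return -0.5 * (chi2 + logdet + x.size * np.log(2.0 * np.pi))

d = rng.multivariate_normal(np.zeros(npix), S + N)
print(f"kept {V.shape[1]} of {npix} modes, logL = {gaussian_loglike(d, V, S, N):.2f}")
```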

  17. Field project to obtain pressure core, wireline log, and production test data for evaluation of CO₂ flooding potential, Conoco MCA unit well No. 358, Maljamar Field, Lea County, New Mexico

    SciTech Connect

    Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.; Goodrich, J.H.; Kumar, R.M.

    1981-11-01

    This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO₂ injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturations and computed log porosities agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet, and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log-derived versus core-derived water saturations. However, both core and log analysis indicated that the ninth zone had the highest residual hydrocarbon saturations, and production data confirmed the validity of the oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval from 4035 to 4055 feet and no oil was produced from the interval from 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 feet and the lack of production from 3692 to 3718 feet indicated these zones to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for a carbon dioxide flood.

  18. Results of Phase 2 postburn drilling, coring, and logging: Rocky Mountain 1 Underground Coal Gasification Test, Hanna, Wyoming

    SciTech Connect

    Oliver, R.L.; Lindblom, S.R.; Covell, J.R.

    1991-02-01

    The Rocky Mountain 1 (RM1) Underground Coal Gasification (UCG) site consisted of two different module configurations: the controlled retracting injection point (CRIP) and elongated linked well (ELW) configurations. The postburn coring of the RM1 UCG site was designed in two phases to fulfill seven objectives outlined in Western Research Institute's Annual Project Plan for 1989 (Western Research Institute 1989). The seven objectives were to (1) delineate the areal extent of the cavities, (2) identify the extent of roof collapse, (3) obtain samples of all major cavity rock types, (4) characterize outflow channels and cavity stratigraphy, (5) characterize the area near CRIP points and ignition points, (6) further define the structural geology of the site, and (7) identify the vertical positioning of the horizontal process wells within the coal seam. Phase 1 of the coring was completed during the summer of 1989 and served to partially accomplish all seven objectives. A detailed description of Phase 1 results was presented in a separate report (Lindblom et al. 1990). Phase 2, completed during the summer of 1990, was designed to complete the seven objectives; more specifically, to further define the areal extent and location of the cavities, to evaluate the outflow channels for both modules, and to further characterize the structural geology in the ELW module area.

  20. Integrated well log and 2-D seismic data interpretation to image the subsurface stratigraphy and structure in north-eastern Bornu (Chad) basin

    NASA Astrophysics Data System (ADS)

    Isyaku, Aminu A.; Rust, Derek; Teeuw, Richard; Whitworth, Malcolm

    2016-09-01

    Structural and stratigraphic mapping within the Bornu Basin in north-eastern Nigeria has commonly been carried out using traditional field geological methods. However, such traditional approaches remain inadequate in this semi-arid region, characterised by topographically flat terrain and a lack of continuous bedrock outcrops, which are mostly concealed beneath sand cover. Previous studies in the north-eastern part of the basin, carried out using ditch cuttings from a few wells and disconnected seismic data, were largely inadequate, and the resulting stratigraphic analyses were more often generalised. This paper presents an integrated structural and stratigraphic study of the basin using combined subsurface geophysical datasets. The Combined Log Pattern (CLP) method is a well log analysis approach which utilises various well log data, including gamma ray, resistivity, bulk density and sonic logs, to identify lithology and stratigraphic boundaries of subsurface formations. This method is applied to constrain the subsurface stratigraphy of the north-eastern part of the Bornu Basin bordering Lake Chad. In addition to qualitative combined well log analysis, the time-depth relationship of the sonic log and seismic data was quantitatively determined by tying a well with an intersecting seismic section to validate the stratigraphic facies horizons identified. Four well log facies and their environments of deposition were characterised from the combined well log analysis of the different log types. It was found that Cretaceous basement structural features controlled the deposition of the overlying formations in the basin. Without intact core data, the shallower wells were found to have bottomed over subsurface horst features, while deeper wells penetrated into the basal facies contained mainly within the grabens. The main subsurface structural lineaments in the area include NW-SE, NE-SW and NNW-SSE trending faults, which mainly formed the horst and graben features. Some stratigraphic formations

  1. On the log-normality of historical magnetic-storm intensity statistics: implications for extreme-event probabilities

    USGS Publications Warehouse

    Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete

    2015-01-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst ≥ 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a −Dst ≥ 880 nT (greater than Carrington) but with a wide 95% confidence interval of [490, 1187] nT.

  2. On the Log-Normality of Historical Magnetic-Storm Intensity Statistics: Implications for Extreme-Event Probabilities

    NASA Astrophysics Data System (ADS)

    Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.

    2015-12-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a -Dst > 880 nT (greater than Carrington) but with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
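
    Both records above describe the same procedure: a maximum-likelihood log-normal fit followed by a bootstrap for forecast uncertainty. The sketch below illustrates that workflow with SciPy on synthetic -Dst maxima; the sample, record span and threshold are placeholders, not the USGS catalogue.

```python
# Hedged sketch: MLE log-normal fit plus a bootstrap for the rate of storms
# exceeding a Carrington-class threshold. All data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
span_years = 56                               # 1957-2012 span, for illustration
dst_maxima = stats.lognorm.rvs(s=0.8, scale=120.0, size=180, random_state=rng)

def exceed_rate_per_century(sample, threshold=850.0):
    """MLE log-normal fit, then expected exceedances of `threshold` per 100 yr."""
    shape, loc, scale = stats.lognorm.fit(sample, floc=0.0)    # MLE with loc fixed at 0
    p_exceed = stats.lognorm.sf(threshold, shape, loc=loc, scale=scale)
    return (len(sample) / span_years) * 100.0 * p_exceed

point = exceed_rate_per_century(dst_maxima)
boot = [exceed_rate_per_century(rng.choice(dst_maxima, size=len(dst_maxima)))
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"storms > 850 nT per century: {point:.2f} (95% bootstrap CI [{lo:.2f}, {hi:.2f}])")
```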

  3. My Journey with Learning Logs

    ERIC Educational Resources Information Center

    Hurst, Beth

    2005-01-01

    Learning logs, or reading response logs, have long been established as an effective reading strategy that helps students learn from text (Atwell, 1987; Blough & Berman, 1991; Calkins, 1986; Commander & Smith, 1996; Kuhrt & Farris, 1990; Reed, 1988; Sanders, 1985). In this paper, the author describes her experiences using learning logs as a…

  4. Memory logging tools

    SciTech Connect

    Lysne, P. )

    1993-01-01

    A suite of memory logging tools is being developed for slim-hole, high-temperature applications. The first tool will measure pressure and temperature, and it is especially designed to be a low-cost, but very precise, device. This tool will be smart in the sense that it can make "decisions" downhole. A second tool will measure the gamma-ray emissions from formation material, and it will be an indicator of lithology. A fluid sampling tool will complement the spectral gamma tool in that it will further constrain the lithology by identifying dissolved formation materials. Plans are being made to integrate the tool responses into formation parameters of interest through cooperative efforts between geothermal operators and scientific drilling programs.

  5. Salvage logging, ecosystem processes, and biodiversity conservation.

    PubMed

    Lindenmayer, D B; Noss, R F

    2006-08-01

    We summarize the documented and potential impacts of salvage logging--a form of logging that removes trees and other biological material from sites after natural disturbance. Such operations may reduce or eliminate biological legacies, modify rare postdisturbance habitats, influence populations, alter community composition, impair natural vegetation recovery, facilitate the colonization of invasive species, alter soil properties and nutrient levels, increase erosion, modify hydrological regimes and aquatic ecosystems, and alter patterns of landscape heterogeneity. These impacts can be assigned to three broad and interrelated effects: (1) altered stand structural complexity; (2) altered ecosystem processes and functions; and (3) altered populations of species and community composition. Some impacts may be different from or additional to the effects of traditional logging that is not preceded by a large natural disturbance because the conditions before, during, and after salvage logging may differ from those that characterize traditional timber harvesting. The potential impacts of salvage logging often have been overlooked, partly because the processes of ecosystem recovery after natural disturbance are still poorly understood and partly because potential cumulative effects of natural and human disturbance have not been well documented. Ecologically informed policies regarding salvage logging are needed prior to major natural disturbances so that, when they occur, ad hoc and crisis-mode decision making can be avoided. These policies should lead to salvage-exemption zones and limits on the amounts of disturbance-derived biological legacies (e.g., burned trees, logs) that are removed where salvage logging takes place. Finally, we believe new terminology is needed. The word salvage implies that something is being saved or recovered, whereas from an ecological perspective this is rarely the case.

  6. Likelihood methods for cluster dark energy surveys

    SciTech Connect

    Hu, Wayne; Cohn, J. D.

    2006-03-15

    Galaxy cluster counts at high redshift, binned into spatial pixels and binned into ranges in an observable proxy for mass, contain a wealth of information on both the dark energy equation of state and the mass selection function required to extract it. The likelihood of the number counts follows a Poisson distribution whose mean fluctuates with the large-scale structure of the universe. We develop a joint likelihood method that accounts for these distributions. Maximization of the likelihood over a theoretical model that includes both the cosmology and the observable-mass relations allows for a joint extraction of dark energy and cluster structural parameters.
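
    The core of such an analysis is a Poisson log-likelihood summed over sky pixels and observable-mass bins. The sketch below shows only that Poisson piece with a hypothetical model prediction; the paper's full treatment additionally accounts for the large-scale-structure fluctuations of the mean counts, which is omitted here.

```python
# Minimal sketch, not the authors' full method: Poisson log-likelihood for binned
# cluster counts, with the predicted mean counts supplied by an assumed model.
import numpy as np
from scipy.special import gammaln

def poisson_loglike(observed_counts, model_counts):
    """Sum of Poisson log-probabilities over all (pixel, mass-bin) cells."""
    n = np.asarray(observed_counts, dtype=float)
    mu = np.asarray(model_counts, dtype=float)
    return np.sum(n * np.log(mu) - mu - gammaln(n + 1.0))

# Toy example: 10 sky pixels x 4 mass bins.
rng = np.random.default_rng(2)
mu_true = rng.uniform(0.5, 8.0, size=(10, 4))   # hypothetical model prediction
n_obs = rng.poisson(mu_true)                    # synthetic observed counts
print("logL at truth:         ", poisson_loglike(n_obs, mu_true))
print("logL, mean scaled 1.5x:", poisson_loglike(n_obs, 1.5 * mu_true))
```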

  7. Model Fit after Pairwise Maximum Likelihood

    PubMed Central

    Barendse, M. T.; Ligtvoet, R.; Timmerman, M. E.; Oort, F. J.

    2016-01-01

    Maximum likelihood factor analysis of discrete data within the structural equation modeling framework rests on the assumption that the observed discrete responses are manifestations of underlying continuous scores that are normally distributed. As maximizing the likelihood of multivariate response patterns is computationally very intensive, the sum of the log-likelihoods of the bivariate response patterns is maximized instead. Little is yet known about how to assess model fit when the analysis is based on such a pairwise maximum likelihood (PML) of two-way contingency tables. We propose new fit criteria for the PML method and conduct a simulation study to evaluate their performance in model selection. With large sample sizes (500 or more), PML performs as well as the robust weighted least squares analysis of polychoric correlations. PMID:27148136

  8. Maximum-Likelihood Detection Of Noncoherent CPM

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in an alphabet of size M transmitted by uncoded, full-response continuous phase modulation over a radio channel with additive white Gaussian noise. Structures of receivers derived from a particular interpretation of maximum-likelihood metrics. Receivers include front ends, the structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following the front ends have structures whose complexity depends on N.

  9. Quasi-likelihood for Spatial Point Processes

    PubMed Central

    Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus

    2014-01-01

    Fitting regression models for intensity functions of spatial point processes is of great interest in ecological and epidemiological studies of association between spatially referenced events and geographical or environmental covariates. When Cox or cluster process models are used to accommodate clustering not accounted for by the available covariates, likelihood-based inference becomes computationally cumbersome due to the complicated nature of the likelihood function and the associated score function. It is therefore of interest to consider alternative, more easily computable estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation which in practice is solved numerically. The derivation of the optimal estimating function has close similarities to the derivation of quasi-likelihood for standard data sets. The approximate solution is further equivalent to a quasi-likelihood score for binary spatial data. We therefore use the term quasi-likelihood for our optimal estimating function approach. We demonstrate in a simulation study and a data example that our quasi-likelihood method for spatial point processes is both statistically and computationally efficient. PMID:26041970

  10. Terminology Extraction from Log Files

    NASA Astrophysics Data System (ADS)

    Saneifar, Hassan; Bonniol, Stéphane; Laurent, Anne; Poncelet, Pascal; Roche, Mathieu

    The log files generated by digital systems can be used in management information systems as a source of important information on the condition of those systems. However, log files are not exhaustively exploited to extract this information. Classical information extraction methods, such as terminology extraction, are poorly suited to this context because of the specific characteristics of log files: their heterogeneous structure, their special vocabulary, and the fact that they do not follow natural language grammar. In this paper, we introduce our approach, Exterlog, to extract terminology from log files, and we detail how it deals with the particularities of such textual data.

  11. 3. Log bunkhouse (far left), log chicken house (left of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Log bunkhouse (far left), log chicken house (left of center), equipment shed (center), and workshop (far right). View to northwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT

  12. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    NASA Astrophysics Data System (ADS)

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-04-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  13. Well Logging with Californium-252

    SciTech Connect

    Boulogne, A.R.

    2003-01-06

    Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.

  14. Logs Wanted - Dead or Alive

    NASA Astrophysics Data System (ADS)

    Schuchardt, A.; Morche, D.

    2015-12-01

    Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel-blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design to decipher the short- to long-term (c. 10⁻² to 10² years) sediment (dis)connectivity effects of large wood. The study areas are two basins in mountain ranges in Germany and Austria. In Austria, the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are: • to identify important triggers for LW transport from slopes into the channels; • to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements; • to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts; • to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and to measure bed load up- and downstream of log jams using a portable Helley

  15. A complete X-ray sample of the high-latitude (|b| > 20°) sky from HEAO 1 A-2 - Log N-log S and luminosity functions

    NASA Technical Reports Server (NTRS)

    Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.

    1982-01-01

    An all-sky survey of X-ray sources was performed, complete to a limiting sensitivity of 3.1 × 10⁻¹¹ erg cm⁻² s⁻¹ in the 2-10 keV band. The complete sample has allowed construction of luminosity functions based on a flux-limited sample for clusters of galaxies and active galactic nuclei. Integration of the best-fit luminosity functions indicates that clusters of galaxies contribute about 4% of the 2-10 keV DXRB, and active galactic nuclei about 20%. It is predicted that many of the objects seen in the deep survey should be local, relatively low luminosity active galactic nuclei and clusters of galaxies.

  16. Likelihood-based population independent component analysis

    PubMed Central

    Eloyan, Ani; Crainiceanu, Ciprian M.; Caffo, Brian S.

    2013-01-01

    Independent component analysis (ICA) is a widely used technique for blind source separation, used heavily in several scientific research areas including acoustics, electrophysiology, and functional neuroimaging. We propose a scalable two-stage iterative true group ICA methodology for analyzing population level functional magnetic resonance imaging (fMRI) data where the number of subjects is very large. The method is based on likelihood estimators of the underlying source densities and the mixing matrix. As opposed to many commonly used group ICA algorithms, the proposed method does not require significant data reduction by a 2-fold singular value decomposition. In addition, the method can be applied to a large group of subjects since the memory requirements are not restrictive. The performance of our approach is compared with a commonly used group ICA algorithm via simulation studies. Furthermore, the proposed method is applied to a large collection of resting state fMRI datasets. The results show that established brain networks are well recovered by the proposed algorithm. PMID:23314416

  17. Maintained Individual Data Distributed Likelihood Estimation (MIDDLE).

    PubMed

    Boker, Steven M; Brick, Timothy R; Pritikin, Joshua N; Wang, Yang; von Oertzen, Timo; Brown, Donald; Lach, John; Estabrook, Ryne; Hunter, Michael D; Maes, Hermine H; Neale, Michael C

    2015-01-01

    Maintained Individual Data Distributed Likelihood Estimation (MIDDLE) is a novel paradigm for research in the behavioral, social, and health sciences. The MIDDLE approach is based on the seemingly impossible idea that data can be privately maintained by participants and never revealed to researchers, while still enabling statistical models to be fit and scientific hypotheses tested. MIDDLE rests on the assumption that participant data should belong to, be controlled by, and remain in the possession of the participants themselves. Distributed likelihood estimation refers to fitting statistical models by sending an objective function and vector of parameters to each participant's personal device (e.g., smartphone, tablet, computer), where the likelihood of that individual's data is calculated locally. Only the likelihood value is returned to the central optimizer. The optimizer aggregates likelihood values from responding participants and chooses new vectors of parameters until the model converges. A MIDDLE study provides significantly greater privacy for participants, automatic management of opt-in and opt-out consent, lower cost for the researcher and funding institute, and faster determination of results. Furthermore, if a participant opts into several studies simultaneously and opts into data sharing, these studies automatically have access to individual-level longitudinal data linked across all studies. PMID:26717128
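
    The sketch below illustrates the distributed-likelihood idea in its simplest form, not the MIDDLE software itself: each "device" evaluates the log-likelihood of its own never-shared data under a simple normal model, and only those scalar values reach the central optimizer. Data, model and sizes are illustrative.

```python
# Illustrative sketch of distributed likelihood estimation (not the MIDDLE API).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
# Private data that, in a real MIDDLE study, would never leave each device.
private_data = [rng.normal(loc=4.0, scale=1.5, size=rng.integers(20, 60))
                for _ in range(25)]

def local_negloglike(params, data):
    """Runs on the participant's device; only the scalar result leaves it."""
    mu, log_sigma = params
    return -np.sum(norm.logpdf(data, loc=mu, scale=np.exp(log_sigma)))

def aggregate_negloglike(params):
    # The central optimizer only ever sees these per-device scalars.
    return sum(local_negloglike(params, d) for d in private_data)

fit = minimize(aggregate_negloglike, x0=[0.0, 0.0])
print("estimated mu, sigma:", fit.x[0], np.exp(fit.x[1]))
```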

  19. Maintained Individual Data Distributed Likelihood Estimation (MIDDLE)

    PubMed Central

    Boker, Steven M.; Brick, Timothy R.; Pritikin, Joshua N.; Wang, Yang; von Oertzen, Timo; Brown, Donald; Lach, John; Estabrook, Ryne; Hunter, Michael D.; Maes, Hermine H.; Neale, Michael C.

    2015-01-01

    Maintained Individual Data Distributed Likelihood Estimation (MIDDLE) is a novel paradigm for research in the behavioral, social, and health sciences. The MIDDLE approach is based on the seemingly-impossible idea that data can be privately maintained by participants and never revealed to researchers, while still enabling statistical models to be fit and scientific hypotheses tested. MIDDLE rests on the assumption that participant data should belong to, be controlled by, and remain in the possession of the participants themselves. Distributed likelihood estimation refers to fitting statistical models by sending an objective function and vector of parameters to each participants’ personal device (e.g., smartphone, tablet, computer), where the likelihood of that individual’s data is calculated locally. Only the likelihood value is returned to the central optimizer. The optimizer aggregates likelihood values from responding participants and chooses new vectors of parameters until the model converges. A MIDDLE study provides significantly greater privacy for participants, automatic management of opt-in and opt-out consent, lower cost for the researcher and funding institute, and faster determination of results. Furthermore, if a participant opts into several studies simultaneously and opts into data sharing, these studies automatically have access to individual-level longitudinal data linked across all studies. PMID:26717128

  20. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, T.S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters; one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The "standard" and "quick look" Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in all of the gas hydrate accumulations assessed in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.
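
    A short sketch of the "standard" Archie-type calculation referred to above, in which gas-hydrate saturation is taken as the pore space not occupied by water. The constants a, m, n and the input values are generic illustrations, not parameters from the study.

```python
# Hedged sketch of an Archie-style water/hydrate saturation estimate from
# resistivity and porosity; all numbers below are illustrative.
def archie_water_saturation(rt, rw, porosity, a=1.0, m=2.0, n=2.0):
    """Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)."""
    return ((a * rw) / (porosity ** m * rt)) ** (1.0 / n)

def hydrate_saturation(rt, rw, porosity, **kwargs):
    """Gas-hydrate saturation as the remaining pore fraction, Sh = 1 - Sw."""
    return 1.0 - archie_water_saturation(rt, rw, porosity, **kwargs)

# Example: a hydrate-bearing sand with elevated deep resistivity.
print(f"Sh = {hydrate_saturation(rt=50.0, rw=0.4, porosity=0.35):.2f}")
```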

  1. Well log evaluation of gas hydrate saturations

    USGS Publications Warehouse

    Collett, Timothy S.

    1998-01-01

    The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters; one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The 'standard' and 'quick look' Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.

  2. A Strategy for Interpretation of Microearthquake Tomography Results in the Salton Sea Geothermal Field Based upon Rock Physics Interpretations of State 2-14 Borehole Logs

    SciTech Connect

    Bonner, B; Hutchings, L; Kasameyer, P

    2006-06-14

    We devise a strategy for analysis of Vp and Vs microearthquake tomography results in the Salton Sea geothermal field to identify important features of the geothermal reservoir. We first interpret rock properties in the State 2-14 borehole based upon logged core through the reservoir. Then, we interpret seismic recordings in the well (Daley et al., 1988) to develop the strategy. We hypothesize that mapping Poisson's ratio has two applications for the Salton Sea geothermal reservoir: (1) to map the top of the reservoir, and (2) as a diagnostic for permeable zones. Poisson's ratio can be obtained from Vp and Vs. In the State 2-14 borehole, Poisson's ratio calculated from large-scale averages (approximately 150 m) shows a monotonic decrease with depth to about 1300 m, at which point it increases with depth. Our model is that the monotonic decrease is due to compaction, and the increase below 1300 m is due to the rocks being hydrothermally altered. We hypothesize we can map the depth to alteration by identifying the transition from decreasing to increasing values and thus map the top of the reservoir, which is associated with a known increase in sulfite, chlorite, and epidote alteration that may be indicative of hydrothermal activity. We also observe (from the Daley et al. plots) an anomalous drop in Poisson's ratio at a depth of about 900 m, within a sandstone formation. The sandstone has a P-wave velocity significantly higher than the siltstone above it, but a lower velocity in the lower half of the formation relative to the upper half. We interpret the relative decrease in velocity to be due to fracturing and chemical alteration caused by permeability. We conclude that using Vp and Vs tomography results to obtain images of Poisson's ratio has the potential to identify significant features in the geothermal reservoir in this geologic setting. Seismic attenuation tomography results (mapped as Qp and Qs) should also be useful for evaluating geothermal reservoirs, but that is not
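
    For reference, the quantity mapped above can be computed from the two velocities with the standard isotropic relation; the sketch below is a generic helper with illustrative velocities, not values taken from the State 2-14 logs.

```python
# Poisson's ratio from P- and S-wave velocities (standard isotropic elasticity).
def poissons_ratio(vp, vs):
    """nu = (Vp**2 - 2*Vs**2) / (2*(Vp**2 - Vs**2))."""
    return (vp**2 - 2.0 * vs**2) / (2.0 * (vp**2 - vs**2))

# Softer sediments typically show higher ratios than stiffer, altered rock.
print(poissons_ratio(vp=2500.0, vs=1000.0))   # ~0.40
print(poissons_ratio(vp=3500.0, vs=2000.0))   # ~0.26
```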

  3. MAIL LOG, program summary and specifications

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The summary and specifications needed to obtain the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding specific records in the data base; and (4) archive - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.

  4. Enhanced carbon-oxygen log interpretations using supplemental log curves

    SciTech Connect

    Wyatt, D.F. Jr.; Jacobson, L.A.; Fox, P.

    1994-12-31

    Supplemental measurements from induced nuclear spectrometry tools are examined to demonstrate what additional information they provide about the well and reservoir conditions. Logs in shut-in wells from Indonesia provide examples of oxygen activation measurements showing cross-flow from one reservoir to another via open perforations. Leaking squeezed perforations were also observed. An example from Alaska shows radioactive scale build-up in the casing which spectral analysis identifies as a mixture of uranium and thorium salts. Another log, where the casing fluid was replaced with crude oil, demonstrates a technique for identifying cement channels. Logs from Nigeria comparing oil saturation estimates before and after a squeeze operation illustrate the effect of casing fluid flushing of the formation through open perforations. Understanding the diagnostic character of these curves leads to higher confidence in the overall log interpretation process.

  5. Maximum likelihood clustering with dependent feature trees

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B. (Principal Investigator)

    1981-01-01

    The decomposition of the mixture density of the data into its normal component densities is considered. The densities are approximated with first-order dependent feature trees using criteria of mutual information and distance measures. Expressions are presented for the criteria when the densities are Gaussian. By defining different types of nodes in a general dependent feature tree, maximum likelihood equations are developed for the estimation of parameters using fixed point iterations. The field structure of the data is also taken into account in developing the maximum likelihood equations. Experimental results from the processing of remotely sensed multispectral scanner imagery data are included.

  6. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...

  7. Growing local likelihood network: Emergence of communities

    NASA Astrophysics Data System (ADS)

    Chen, S.; Small, M.

    2015-10-01

    In many real situations, networks grow only via local interactions. New nodes are added to the growing network with information only pertaining to a small subset of existing nodes. Multilevel marketing, social networks, and disease models can all be depicted as growing networks based on local (network path-length) distance information. In these examples, all nodes whose distance from a chosen center is less than d form a subgraph. Hence, we grow networks with information only from these subgraphs. Moreover, we use a likelihood-based method, where at each step we modify the networks by changing their likelihood to be closer to the expected degree distribution. Combining the local information and the likelihood method, we grow networks that exhibit novel features. We discover that the likelihood method, over certain parameter ranges, can generate networks with highly modulated communities, even when global information is not available. Communities and clusters are abundant in real-life networks, and the method proposed here provides a natural mechanism for the emergence of communities in scale-free networks. In addition, the algorithmic implementation of network growth via local information is substantially faster than global methods and allows for the exploration of much larger networks.

  8. Numerical likelihood analysis of cosmic ray anisotropies

    SciTech Connect

    Carlos Hojvat et al.

    2003-07-02

    A numerical likelihood approach to the determination of cosmic ray anisotropies is presented which offers many advantages over other approaches. It allows a wide range of statistically meaningful hypotheses to be compared even when full sky coverage is unavailable, can be readily extended in order to include measurement errors, and makes maximum unbiased use of all available information.

  9. New materials for fireplace logs

    NASA Technical Reports Server (NTRS)

    Kieselback, D. J.; Smock, A. W.

    1971-01-01

    Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.

  10. Gaussian Mixture Models of Between-Source Variation for Likelihood Ratio Computation from Multivariate Data

    PubMed Central

    Franco-Pedroso, Javier; Ramos, Daniel; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    In forensic science, trace evidence found at a crime scene and on a suspect has to be evaluated from the measurements performed on it, usually in the form of multivariate data (for example, several chemical compounds or physical characteristics). In order to assess the strength of that evidence, the likelihood ratio framework is being increasingly adopted. Several methods have been derived in order to obtain likelihood ratios directly from univariate or multivariate data by modelling both the variation appearing between observations (or features) coming from the same source (within-source variation) and that appearing between observations coming from different sources (between-source variation). In the widely used multivariate kernel likelihood-ratio approach, the within-source distribution is assumed to be normally distributed and constant among different sources, and the between-source variation is modelled through a kernel density function (KDF). In order to better fit the observed distribution of the between-source variation, this paper presents a different approach in which a Gaussian mixture model (GMM) is used instead of a KDF. As will be shown, this approach provides better-calibrated likelihood ratios as measured by the log-likelihood-ratio cost (Cllr) in experiments performed on freely available forensic datasets involving different types of trace evidence: inks, glass fragments and car paints. PMID:26901680
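
    A heavily simplified illustration of the idea is sketched below: each hypothesis is modelled directly with a Gaussian mixture fitted to synthetic comparison features, and the likelihood ratio is the density ratio at the observed point. This collapses the paper's two-level (within-/between-source) construction into a single density per hypothesis, so it should be read as a toy, not the published method.

```python
# Toy GMM-based likelihood ratio on synthetic data (not the paper's two-level model).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
same_source = rng.normal(0.0, 0.5, size=(500, 3))
diff_source = np.concatenate([rng.normal(-2.0, 1.0, size=(250, 3)),
                              rng.normal(2.5, 1.2, size=(250, 3))])

gmm_hp = GaussianMixture(n_components=2, random_state=0).fit(same_source)
gmm_hd = GaussianMixture(n_components=2, random_state=0).fit(diff_source)

def log10_likelihood_ratio(x):
    """log10 LR = log10 p(x | Hp) - log10 p(x | Hd) from the fitted mixtures."""
    x = np.atleast_2d(x)
    return (gmm_hp.score_samples(x) - gmm_hd.score_samples(x)) / np.log(10.0)

print(log10_likelihood_ratio([0.1, -0.2, 0.0]))   # supports the same-source hypothesis
print(log10_likelihood_ratio([2.4, 2.6, 2.2]))    # supports the different-source hypothesis
```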

  11. Intra-Die Spatial Correlation Extraction with Maximum Likelihood Estimation Method for Multiple Test Chips

    NASA Astrophysics Data System (ADS)

    Fu, Qiang; Luk, Wai-Shing; Tao, Jun; Zeng, Xuan; Cai, Wei

    In this paper, a novel intra-die spatial correlation extraction method referred to as MLEMTC (Maximum Likelihood Estimation for Multiple Test Chips) is presented. In the MLEMTC method, a joint likelihood function is formulated by multiplying the set of individual likelihood functions for all test chips. This joint likelihood function is then maximized to extract a unique group of parameter values of a single spatial correlation function, which can be used for statistical circuit analysis and design. Moreover, to deal with the purely random component and measurement error contained in measurement data, the spatial correlation function combined with the correlation of white noise is used in the extraction, which significantly improves the accuracy of the extraction results. Furthermore, an LU decomposition based technique is developed to calculate the log-determinant of the positive definite matrix within the likelihood function, which solves the numerical stability problem encountered in the direct calculation. Experimental results have shown that the proposed method is efficient and practical.
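
    The numerical point about the log-determinant can be seen in a few lines: the determinant of a large covariance matrix under- or overflows, while its logarithm can be accumulated stably from the diagonal of an LU or Cholesky factor. The matrix below is a generic exponential correlation matrix, not the paper's spatial model.

```python
# Log-determinant of a positive-definite matrix via LU and Cholesky factors.
import numpy as np
from scipy.linalg import lu_factor, cho_factor

n = 400
x = np.linspace(0.0, 10.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]))      # exponential correlation matrix

print(np.linalg.det(C))                           # underflows to 0.0 at this size

lu, piv = lu_factor(C)
print(np.sum(np.log(np.abs(np.diag(lu)))))        # sum of log |U_ii| from the LU factors

c, low = cho_factor(C)                            # Cholesky alternative for PD matrices
print(2.0 * np.sum(np.log(np.diag(c))))           # agrees with the LU-based value
```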

  12. Use of historical information in a maximum-likelihood framework

    USGS Publications Warehouse

    Cohn, T.A.; Stedinger, J.R.

    1987-01-01

    This paper discusses flood-quantile estimators which can employ historical and paleoflood information, both when the magnitudes of historical flood peaks are known, and when only threshold-exceedance information is available. Maximum likelihood, quasi-maximum likelihood and curve fitting methods for simultaneous estimation of 1, 2 and 3 unknown parameters are examined. The information contained in a 100 yr record of historical observations, during which the flood perception threshold was near the 10 yr flood level (i.e., on average, one flood in ten is above the threshold and hence is recorded), is equivalent to roughly 43, 64 and 78 years of systematic record in terms of the improvement of the precision of 100 yr flood estimators when estimating 1, 2 and 3 parameters, respectively. With the perception threshold at the 100 yr flood level, the historical data was worth 13, 20 and 46 years of systematic data when estimating 1, 2 and 3 parameters, respectively. © 1987.
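
    One common way to write such a likelihood treats historical years in which the perception threshold was not exceeded as censored observations. The sketch below fits a two-parameter log-normal this way; the distribution choice, the threshold and all numbers are illustrative assumptions, not the paper's data.

```python
# Hedged sketch: maximum-likelihood flood-frequency fit combining a systematic
# record with threshold-exceedance historical information (censored years).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(5)
systematic = stats.lognorm.rvs(s=0.6, scale=300.0, size=40, random_state=rng)  # gauged peaks
historical_peaks = np.array([1500.0, 2100.0])   # known historical exceedances of T
threshold, hist_years = 1200.0, 100             # perception threshold and historical span

def negloglike(params):
    mu, log_sigma = params
    dist = stats.lognorm(s=np.exp(log_sigma), scale=np.exp(mu))
    ll = np.sum(dist.logpdf(systematic))                                  # systematic record
    ll += np.sum(dist.logpdf(historical_peaks))                           # recorded historical floods
    ll += (hist_years - len(historical_peaks)) * dist.logcdf(threshold)   # censored historical years
    return -ll

fit = optimize.minimize(negloglike, x0=[np.log(300.0), np.log(0.6)], method="Nelder-Mead")
q100 = stats.lognorm(s=np.exp(fit.x[1]), scale=np.exp(fit.x[0])).ppf(0.99)
print(f"estimated 100-yr flood: {q100:.0f}")
```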

  13. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station logs. 80.409 Section 80.409... MARITIME SERVICES Station Documents § 80.409 Station logs. (a) General requirements. Logs must be established and properly maintained as follows: (1) The log must be kept in an orderly manner. The log may...

  14. MEMLET: An Easy-to-Use Tool for Data Fitting and Model Comparison Using Maximum-Likelihood Estimation.

    PubMed

    Woody, Michael S; Lewis, John H; Greenberg, Michael J; Goldman, Yale E; Ostap, E Michael

    2016-07-26

    We present MEMLET (MATLAB-enabled maximum-likelihood estimation tool), a simple-to-use and powerful program for utilizing maximum-likelihood estimation (MLE) for parameter estimation from data produced by single-molecule and other biophysical experiments. The program is written in MATLAB and includes a graphical user interface, making it simple to integrate into the existing workflows of many users without requiring programming knowledge. We give a comparison of MLE and other fitting techniques (e.g., histograms and cumulative frequency distributions), showing how MLE often outperforms other fitting methods. The program includes a variety of features. 1) MEMLET fits probability density functions (PDFs) for many common distributions (exponential, multiexponential, Gaussian, etc.), as well as user-specified PDFs without the need for binning. 2) It can take into account experimental limits on the size of the shortest or longest detectable event (i.e., instrument "dead time") when fitting to PDFs. The proper modification of the PDFs occurs automatically in the program and greatly increases the accuracy of fitting the rates and relative amplitudes in multicomponent exponential fits. 3) MEMLET offers model testing (i.e., single-exponential versus double-exponential) using the log-likelihood ratio technique, which shows whether additional fitting parameters are statistically justifiable. 4) Global fitting can be used to fit data sets from multiple experiments to a common model. 5) Confidence intervals can be determined via bootstrapping utilizing parallel computation to increase performance. Easy-to-follow tutorials show how these features can be used. This program packages all of these techniques into a simple-to-use and well-documented interface to increase the accessibility of MLE fitting. PMID:27463130
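
    Feature 3, the log-likelihood ratio test, can be reproduced directly with SciPy; the sketch below compares single- and double-exponential fits to synthetic dwell times and is a generic illustration rather than a use of MEMLET itself.

```python
# Log-likelihood ratio test: single vs. double exponential on synthetic dwell times.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(6)
dwell = np.concatenate([rng.exponential(0.2, 700), rng.exponential(2.0, 300)])

def nll_single(params):
    return -np.sum(stats.expon.logpdf(dwell, scale=np.exp(params[0])))

def nll_double(params):
    a = 1.0 / (1.0 + np.exp(-params[0]))          # mixture weight constrained to (0, 1)
    tau1, tau2 = np.exp(params[1]), np.exp(params[2])
    pdf = a * stats.expon.pdf(dwell, scale=tau1) + (1 - a) * stats.expon.pdf(dwell, scale=tau2)
    return -np.sum(np.log(pdf))

fit1 = optimize.minimize(nll_single, x0=[0.0], method="Nelder-Mead")
fit2 = optimize.minimize(nll_double, x0=[0.0, -1.0, 1.0], method="Nelder-Mead")

lr_stat = 2.0 * (fit1.fun - fit2.fun)             # 2 * (logL_double - logL_single)
p_value = stats.chi2.sf(lr_stat, df=2)            # two extra fitted parameters
print(f"LRT statistic = {lr_stat:.1f}, p = {p_value:.2g}")
```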

  15. Predicting reservoir wettability via well logs

    NASA Astrophysics Data System (ADS)

    Feng, Cheng; Fu, Jinhua; Shi, Yujiang; Li, Gaoren; Mao, Zhiqiang

    2016-06-01

    Wettability is an important factor in controlling the distribution of oil and water. However, its evaluation has so far been a difficult problem because no log data can directly indicate it. In this paper, a new method is proposed for quantitatively predicting reservoir wettability via well log analysis. Specifically, based on the J function, diagenetic facies classification and piecewise power functions, capillary pressure curves are constructed from conventional logs and a nuclear magnetic resonance (NMR) log, respectively. Under the influence of wettability, the latter is distorted while the former remains unaffected. Therefore, the ratio of the median radius obtained from the two kinds of capillary pressure curve is calculated to reflect wettability, and a quantitative relationship between the ratio and reservoir wettability is then established. According to low-permeability core sample capillary pressure curves, NMR T₂ spectra and contact angle experimental data from the bottom of the Upper Triassic reservoirs in the western Ordos Basin, China, the two capillary pressure curve construction models and a predictive wettability model are calibrated. The wettability model is verified through the Amott wettability index and the saturation exponent from resistivity measurements, and their determined wettability levels are comparable, indicating that the proposed model is quite reliable. In addition, the model performs well in field application. Thus, the quantitative reservoir wettability prediction model proposed in this paper provides an effective tool for formation evaluation, field development and the improvement of oil recovery.

  16. cosmoabc: Likelihood-free inference for cosmology

    NASA Astrophysics Data System (ADS)

    Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.

    2015-05-01

    Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python Approximate Bayesian Computation (ABC) sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy clusters number counts without computing the likelihood function.
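
    A bare-bones rejection-ABC loop conveys the underlying idea; it is a generic sketch, not the cosmoabc interface, which additionally wraps a Population Monte Carlo scheme with adaptive importance sampling. The simulator, summary statistic and tolerance are all illustrative.

```python
# Generic rejection ABC on a toy Poisson "catalog" (not the cosmoabc API).
import numpy as np

rng = np.random.default_rng(7)
observed = rng.poisson(lam=4.2, size=200)          # data drawn from an unknown rate

def simulator(lam):
    return rng.poisson(lam=lam, size=observed.size)

def distance(sim, obs):
    return abs(sim.mean() - obs.mean())             # summary-statistic distance

accepted = []
while len(accepted) < 1000:
    lam = rng.uniform(0.0, 10.0)                    # draw from the prior
    if distance(simulator(lam), observed) < 0.1:    # keep draws whose mocks resemble the data
        accepted.append(lam)

posterior = np.array(accepted)
print(f"posterior mean {posterior.mean():.2f}, 95% interval "
      f"[{np.percentile(posterior, 2.5):.2f}, {np.percentile(posterior, 97.5):.2f}]")
```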

  17. MAXIMUM LIKELIHOOD ESTIMATION FOR SOCIAL NETWORK DYNAMICS

    PubMed Central

    Snijders, Tom A.B.; Koskinen, Johan; Schweinberger, Michael

    2014-01-01

    A model for network panel data is discussed, based on the assumption that the observed data are discrete observations of a continuous-time Markov process on the space of all directed graphs on a given node set, in which changes in tie variables are independent conditional on the current graph. The model for tie changes is parametric and designed for applications to social network analysis, where the network dynamics can be interpreted as being generated by choices made by the social actors represented by the nodes of the graph. An algorithm for calculating the Maximum Likelihood estimator is presented, based on data augmentation and stochastic approximation. An application to an evolving friendship network is given and a small simulation study is presented which suggests that for small data sets the Maximum Likelihood estimator is more efficient than the earlier proposed Method of Moments estimator. PMID:25419259

  18. Sampling variability in forensic likelihood-ratio computation: A simulation study.

    PubMed

    Ali, Tauseef; Spreeuwers, Luuk; Veldhuis, Raymond; Meuwly, Didier

    2015-12-01

    Recently, in the forensic biometric community, there has been growing interest in computing a metric called the "likelihood ratio" when a pair of biometric specimens is compared using a biometric recognition system. Generally, a biometric recognition system outputs a score, and a likelihood-ratio computation method is therefore used to convert the score to a likelihood ratio. The likelihood ratio is the probability of the score given the hypothesis of the prosecution, Hp (the two biometric specimens arose from the same source), divided by the probability of the score given the hypothesis of the defense, Hd (the two biometric specimens arose from different sources). Given a set of training scores under Hp and a set of training scores under Hd, several methods exist to convert a score to a likelihood ratio. In this work, we focus on the issue of sampling variability in the training sets and carry out a detailed empirical study to quantify its effect on commonly proposed likelihood-ratio computation methods. We study the effect of sampling variability by varying: 1) the shapes of the probability density functions which model the distributions of scores in the two training sets; 2) the sizes of the training sets; and 3) the score for which a likelihood ratio is computed. For this purpose, we introduce a simulation framework which can be used to study several properties of a likelihood-ratio computation method and to quantify the effect of sampling variability in the likelihood-ratio computation. It is empirically shown that the sampling variability can be considerable, particularly when the training sets are small. Furthermore, a given method of likelihood-ratio computation can behave very differently for different shapes of the probability density functions of the scores in the training sets and different scores for which likelihood ratios are computed.
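
    A much-reduced version of such a simulation is sketched below: training scores are drawn under each hypothesis, Gaussian score models are fitted, and the likelihood ratio at a fixed score is recomputed over many replicates to expose the sampling variability. The score distributions, sizes and the Gaussian modelling choice are illustrative assumptions.

```python
# Sampling variability of a score-based likelihood ratio (synthetic scores).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

def one_replicate(n_train, score=1.0):
    hp_scores = rng.normal(2.0, 1.0, n_train)       # same-source training scores
    hd_scores = rng.normal(-1.0, 1.0, n_train)      # different-source training scores
    lr = (norm.pdf(score, hp_scores.mean(), hp_scores.std(ddof=1)) /
          norm.pdf(score, hd_scores.mean(), hd_scores.std(ddof=1)))
    return np.log10(lr)

for n_train in (20, 200, 2000):
    log10_lrs = np.array([one_replicate(n_train) for _ in range(1000)])
    print(f"n_train={n_train:5d}: spread of log10(LR) (std) = {log10_lrs.std():.3f}")
```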

  19. Depth dependent multiple logging system

    SciTech Connect

    Howells, A. P. S.; Angehrn, J. A.; Dienglewicz, A. M.; Viswanathan, R.

    1985-12-03

    An improved well logging technique is provided for more accurately deriving and correlating a plurality of measurements made during a single traversal of a logging instrument through subsurface formations. In one exemplary embodiment, methods and apparatus are provided for deriving a more accurate and precise measurement of depth at which real-time logging measurements are made, and in particular for correcting anomalies occurring in the depth indication from cable stretch, yo-yo of the sonde in the borehole and the like. The more accurate and precise depth measurement is then utilized for generating well logging measurements on a depth-dependent basis, deriving at least some of such measurements in digital form and alternately transmitting to the surface digital and analog representations of such measurements. Furthermore, methods and apparatus are provided for deriving measurements of subsurface earth formation from a plurality of logging instruments combined in a single tool, wherein such measurements are made during a single pass through a borehole with the resultant measurement data correlatively merged, recorded and displayed.

  20. APS logDaemon and client library

    SciTech Connect

    Saunders, C.; Kowalkowski, J.

    1995-12-13

    This document serves as a User's Manual and Reference for the logDaemon and client library. This package provides a general distributed message logging system. A logDaemon may be started anywhere on a subnet. A client that has linked in the client library is provided with functions to open a connection to the logDaemon, log messages, and close the connection. The logDaemon maintains one or more log files (in simple ASCII or SDDS format) and an e-mail list based on specifications in a configuration file. Incoming messages are logged to the appropriate file and/or result in e-mail being sent.

  1. Maximum likelihood continuity mapping for fraud detection

    SciTech Connect

    Hogden, J.

    1997-05-01

    The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints. CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction--important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real-valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.
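
    The abstract does not give MALCOM's internals, so the sketch below illustrates only the general idea it relies on: fit a simple sequence model to typical categorical sequences and flag sequences whose log-likelihood under the model is unusually low. A first-order Markov chain with add-one smoothing stands in here for the continuity map, and the sequences are hypothetical.

```python
from collections import Counter, defaultdict
import math

# Hypothetical training sequences of categorical events (e.g., procedure codes).
train = [["A", "B", "C"], ["A", "B", "B", "C"], ["A", "C"], ["A", "B", "C", "C"]]

# Fit a first-order Markov chain with add-one smoothing.
states = sorted({s for seq in train for s in seq})
counts = defaultdict(Counter)
for seq in train:
    for prev, cur in zip(seq, seq[1:]):
        counts[prev][cur] += 1

def avg_log_likelihood(seq):
    """Per-transition log-likelihood of a sequence under the fitted chain."""
    ll = 0.0
    for prev, cur in zip(seq, seq[1:]):
        total = sum(counts[prev].values()) + len(states)
        ll += math.log((counts[prev][cur] + 1) / total)
    return ll / max(len(seq) - 1, 1)

# Sequences with unusually low likelihood are flagged as anomalous.
for seq in (["A", "B", "C"], ["C", "A", "A", "A"]):
    print(seq, round(avg_log_likelihood(seq), 3))
```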

  2. Likelihood reinstates Archaeopteryx as a primitive bird

    PubMed Central

    Lee, Michael S. Y.; Worthy, Trevor H.

    2012-01-01

    The widespread view that Archaeopteryx was a primitive (basal) bird has been recently challenged by a comprehensive phylogenetic analysis that placed Archaeopteryx with deinonychosaurian theropods. The new phylogeny suggested that typical bird flight (powered by the front limbs only) either evolved at least twice, or was lost/modified in some deinonychosaurs. However, this parsimony-based result was acknowledged to be weakly supported. Maximum-likelihood and related Bayesian methods applied to the same dataset yield a different and more orthodox result: Archaeopteryx is restored as a basal bird with bootstrap frequency of 73 per cent and posterior probability of 1. These results are consistent with a single origin of typical (forelimb-powered) bird flight. The Archaeopteryx–deinonychosaur clade retrieved by parsimony is supported by more characters (which are on average more homoplasious), whereas the Archaeopteryx–bird clade retrieved by likelihood-based methods is supported by fewer characters (but on average less homoplasious). Both positions for Archaeopteryx remain plausible, highlighting the hazy boundary between birds and advanced theropods. These results also suggest that likelihood-based methods (in addition to parsimony) can be useful in morphological phylogenetics. PMID:22031726

  3. Log of Trench 04A Across the Hayward Fault at Tyson's Lagoon (Tule Pond), Fremont, Alameda County, California

    USGS Publications Warehouse

    Lienkaemper, James J.; Williams, Patrick L.; Sickler, Robert R.; Fumal, Thomas E.

    2005-01-01

    This publication makes available a detailed trench log (sheets 1 and 2) of a 110-m trench we excavated in 2004 across a tectonic sag pond in the Hayward fault zone. Also included are revised stratigraphic unit descriptions from this fifth field season of subsurface investigation of the Hayward fault at Tyson's Lagoon (Tule Pond). Preliminary findings based on fieldwork done in 2000 have been published (Lienkaemper and others: data archive, 2002a; report, 2002b), as were the logs and data for 2001-2003 (Lienkaemper and others, 2003, L03). A continuous exposure of the geologic section across the entire pond made in 2004 (Fig. 1, 04A) has revealed some critical miscorrelations of units made in the original on-line version of L03, hence users of these earlier trench data should only use the 2005 revised version 2.0 of L03 for correlation purposes. Lienkaemper, Williams, and Sickler interpreted the geology and logged the trenches. Fumal did most of the trench photography. The Hayward fault is recognized to be among the most hazardous in the United States (Working Group on California Earthquake Probabilities, 2003). Establishing a chronology of prehistoric or paleoearthquakes is of immediate use in resolving the likelihood of future large earthquakes on the Hayward fault. This document makes available geologic evidence for historical and prehistoric surface-rupturing earthquakes preserved at the site. A second, formal report on our conclusions based on these data is in preparation.

  4. Method for induced polarization logging

    SciTech Connect

    Vinegar, H.J.; Waxman, M.H.

    1987-04-14

    A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation at measurement points equally spaced in time of the magnitude and phase of the induced voltage and the magnitude and phase of the current supplied by a circuit through a reference resistance R₀ to a survey current electrode associated with the tool.

  5. Likelihood-based modification of experimental crystal structure electron density maps

    DOEpatents

    Terwilliger, Thomas C.

    2005-04-16

    A maximum-likelihood method for improving an electron density map of an experimental crystal structure. A likelihood of a set of structure factors {F_h} is formed for the experimental crystal structure as (1) the likelihood of having obtained an observed set of structure factors {F_h^OBS} if structure factor set {F_h} was correct, and (2) the likelihood that an electron density map resulting from {F_h} is consistent with selected prior knowledge about the experimental crystal structure. The set of structure factors {F_h} is then adjusted to maximize the likelihood of {F_h} for the experimental crystal structure. An improved electron density map is constructed with the maximized structure factors.

  6. Applying exclusion likelihoods from LHC searches to extended Higgs sectors

    NASA Astrophysics Data System (ADS)

    Bechtle, Philip; Heinemeyer, Sven; Stål, Oscar; Stefaniak, Tim; Weiglein, Georg

    2015-09-01

    LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full dataset. In addition to publishing an exclusion limit, the full likelihood information for the narrow resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the search and the rate measurements of the SM-like Higgs in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org.

  7. Maximal likelihood correspondence estimation for face recognition across pose.

    PubMed

    Li, Shaoxin; Liu, Xin; Chai, Xiujuan; Zhang, Haihong; Lao, Shihong; Shan, Shiguang

    2014-10-01

    Due to the misalignment of image features, the performance of many conventional face recognition methods degrades considerably in the across-pose scenario. To address this problem, many image matching-based methods have been proposed to estimate semantic correspondence between faces in different poses. In this paper, we aim to solve two critical problems in previous image matching-based correspondence learning methods: 1) failure to fully exploit face-specific structure information in correspondence estimation and 2) failure to learn personalized correspondence for each probe image. To this end, we first build a model, termed morphable displacement field (MDF), to encode face-specific structure information of semantic correspondence from a set of real samples of correspondences calculated from 3D face models. Then, we propose a maximal likelihood correspondence estimation (MLCE) method to learn personalized correspondence based on the maximal likelihood frontal face assumption. After obtaining the semantic correspondence encoded in the learned displacement, we can synthesize virtual frontal images of the profile faces for subsequent recognition. Using the linear discriminant analysis method with pixel-intensity features, state-of-the-art performance is achieved on three multipose benchmarks, i.e., the CMU-PIE, FERET, and MultiPIE databases. Owing to the rational MDF regularization and the usage of the novel maximal likelihood objective, the proposed MLCE method can reliably learn correspondence between faces in different poses even in complex wild environments, i.e., the Labeled Faces in the Wild database. PMID:25163062

  8. A New Approach to Logging.

    ERIC Educational Resources Information Center

    Miles, Donna

    2001-01-01

    In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…

  9. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J. )

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  10. Using Web Logs in the Science Classroom

    ERIC Educational Resources Information Center

    Duplichan, Staycle C.

    2009-01-01

    As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…

  11. 47 CFR 73.781 - Logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...

  12. 29 CFR 1918.88 - Log operations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...

  13. Likelihood analysis of spatial capture-recapture models for stratified or class structured populations

    USGS Publications Warehouse

    Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.

    2015-01-01

    We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
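
    The central computational idea, handling individuals whose class membership (here, sex) is missing by summing the likelihood over the possible classes, can be illustrated outside the full SCR machinery. The sketch below is a toy marginal likelihood with a simple binomial detection model and hypothetical data, not the authors' model: individuals of known sex contribute a sex-specific term, while individuals of unknown sex contribute a mixture weighted by the estimated sex ratio.

```python
import numpy as np
from scipy.stats import binom
from scipy.optimize import minimize

# Toy data: detections out of K occasions; sex is 'M', 'F', or None (missing).
K = 10
dets = np.array([3, 5, 1, 4, 2, 6, 0, 3])
sex = ['M', 'F', None, 'M', None, 'F', None, 'M']

def neg_log_lik(params):
    # params on the logit scale: detection prob. for M and F, sex ratio psi = Pr(F)
    p_m, p_f, psi = 1.0 / (1.0 + np.exp(-np.asarray(params)))
    ll = 0.0
    for y, s in zip(dets, sex):
        if s == 'M':
            ll += binom.logpmf(y, K, p_m) + np.log(1 - psi)
        elif s == 'F':
            ll += binom.logpmf(y, K, p_f) + np.log(psi)
        else:
            # Missing class: marginalize by summing the likelihood over both sexes.
            ll += np.log((1 - psi) * binom.pmf(y, K, p_m) + psi * binom.pmf(y, K, p_f))
    return -ll

fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
print("MLEs (p_M, p_F, psi):", 1.0 / (1.0 + np.exp(-fit.x)))
```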

  14. [Log-linear model used in the hybrid design of case-parents triad/control-mother dyad].

    PubMed

    Peng, W J; Zhang, H; Li, Y X; Li, C Y; Yan, W R

    2016-06-01

    This study introduced the application of a log-linear model in the hybrid design of case-parents triad/control-mother dyad. Data related to the association between cleft lip with palate (CLP) and methylenetetrahydrofolate reductase (MTHFR) gene A1298C diversity were analyzed. A log-linear model based on likelihood ratio tests (LRTs) was used to analyze the relationships between maternal genotypes, offspring genotypes and CLP. Our data showed that children of mothers carrying the CC genotype presented a lower risk of CLP compared with children of mothers carrying the AA genotype, with S2=0.45 (95%CI: 0.26-0.79). Offspring carrying the AC genotype presented a lower risk of CLP compared with offspring carrying the AA genotype, with R1=0.69 (95%CI: 0.48-0.97). No other associations were found. The power of the hybrid design was greater than that of the case-parents study (0.86>0.78). The MTHFR A1298C polymorphism appears to play an important role in the etiology of cleft lip with palate. Data from the hybrid design and the log-linear model can help researchers explore the effects of both maternal and offspring genotypes. This study design provides greater power than regular case-parents studies and is thus suitable for studies on the etiology of diseases arising early in life, such as birth defects. PMID:27346122
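
    The analysis above rests on likelihood ratio tests between nested log-linear (Poisson) models. The snippet below is a generic, self-contained sketch of that machinery with hypothetical counts, not the study's case-parents/control-mother data: it fits a Poisson log-linear model with and without an interaction term by maximum likelihood and compares them with a chi-squared likelihood ratio test.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

# Hypothetical 2x2 counts: rows = maternal genotype (AA, CC), cols = case/control.
counts = np.array([[40.0, 60.0],
                   [15.0, 55.0]]).ravel()
# Design matrix columns: intercept, row effect, column effect, interaction.
X = np.array([[1, 0, 0, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 1, 1, 1]], dtype=float)

def neg_log_lik(beta, design):
    mu = np.exp(design @ beta)
    return np.sum(mu - counts * np.log(mu))   # Poisson NLL up to a constant

full = minimize(neg_log_lik, np.zeros(4), args=(X,), method="BFGS")
reduced = minimize(neg_log_lik, np.zeros(3), args=(X[:, :3],), method="BFGS")

lrt = 2 * (reduced.fun - full.fun)            # -2 * (logL_reduced - logL_full)
print("LRT statistic:", round(lrt, 3), " p =", round(chi2.sf(lrt, df=1), 4))
```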

  15. Intelligence's likelihood and evolutionary time frame

    NASA Astrophysics Data System (ADS)

    Bogonovich, Marc

    2011-04-01

    This paper outlines hypotheses relevant to the evolution of intelligent life and encephalization in the Phanerozoic. If general principles are inferable from patterns of Earth life, implications could be drawn for astrobiology. Many of the outlined hypotheses, relevant data, and associated evolutionary and ecological theory are not frequently cited in astrobiological journals. Thus opportunity exists to evaluate reviewed hypotheses with an astrobiological perspective. A quantitative method is presented for testing one of the reviewed hypotheses (hypothesis i; the diffusion hypothesis). Questions are presented throughout, which illustrate that the question of intelligent life's likelihood can be expressed as multiple, broadly ranging, more tractable questions.

  16. Score-based likelihood ratios for handwriting evidence.

    PubMed

    Hepler, Amanda B; Saunders, Christopher P; Davis, Linda J; Buscaglia, JoAnn

    2012-06-10

    Score-based approaches for computing forensic likelihood ratios are becoming more prevalent in the forensic literature. When two items of evidential value are entangled via a score function, several nuances arise when attempting to model the score behavior under the competing source-level propositions. Specific assumptions must be made in order to appropriately model the numerator and denominator probability distributions. This process is fairly straightforward for the numerator of the score-based likelihood ratio, entailing the generation of a database of scores obtained by pairing items of evidence from the same source. However, this process presents ambiguities for the denominator database generation - in particular, how best to generate a database of scores between two items of different sources. Many alternatives have appeared in the literature, three of which we will consider in detail. They differ in their approach to generating denominator databases, by pairing (1) the item of known source with randomly selected items from a relevant database; (2) the item of unknown source with randomly generated items from a relevant database; or (3) two randomly generated items. When the two items differ in type, perhaps one having higher information content, these three alternatives can produce very different denominator databases. While each of these alternatives has appeared in the literature, the decision of how to generate the denominator database is often made without calling attention to the subjective nature of this process. In this paper, we compare each of the three methods (and the resulting score-based likelihood ratios), which can be thought of as three distinct interpretations of the denominator proposition. Our goal in performing these comparisons is to illustrate the effect that subtle modifications of these propositions can have on inferences drawn from the evidence evaluation procedure. The study was performed using a data set composed of cursive writing

  17. Synthesizing regression results: a factored likelihood method.

    PubMed

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-06-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported in the regression studies to calculate synthesized standardized slopes. It uses available correlations to estimate missing ones through a series of regressions, allowing us to synthesize correlations among variables as if each included study contained all the same variables. Great accuracy and stability of this method under fixed-effects models were found through Monte Carlo simulation. An example was provided to demonstrate the steps for calculating the synthesized slopes through sweep operators. By rearranging the predictors in the included regression models or omitting a relatively small number of correlations from those models, we can easily apply the factored likelihood method to many situations involving synthesis of linear models. Limitations and other possible methods for synthesizing more complicated models are discussed. Copyright © 2012 John Wiley & Sons, Ltd. PMID:26053653

  18. CORA: Emission Line Fitting with Maximum Likelihood

    NASA Astrophysics Data System (ADS)

    Ness, Jan-Uwe; Wichmann, Rainer

    2011-12-01

    The advent of pipeline-processed data both from space- and ground-based observatories often disposes of the need of full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.

  19. CORA - emission line fitting with Maximum Likelihood

    NASA Astrophysics Data System (ADS)

    Ness, J.-U.; Wichmann, R.

    2002-07-01

    The advent of pipeline-processed data both from space- and ground-based observatories often disposes of the need of full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.
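
    As a concrete illustration of the Poisson maximum-likelihood line fitting that both CORA records describe, the sketch below fits a single Gaussian emission line plus a constant background to simulated low-count data by minimizing the Poisson negative log-likelihood. It is a generic stand-in under assumed parameter values, not the CORA implementation or its fixed-point update for line fluxes.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulated low-count spectrum: constant background plus one Gaussian emission line.
wav = np.linspace(13.0, 14.0, 200)

def model(p, wav):
    amp, center, sigma, bkg = p
    return bkg + amp * np.exp(-0.5 * ((wav - center) / sigma) ** 2)

truth = (6.0, 13.45, 0.03, 0.5)               # hypothetical "true" parameters
counts = rng.poisson(model(truth, wav))

def neg_log_lik(p):
    mu = model(p, wav)
    if np.any(mu <= 0):
        return np.inf
    # Poisson log-likelihood up to the constant log(counts!)
    return np.sum(mu - counts * np.log(mu))

fit = minimize(neg_log_lik, x0=(3.0, 13.5, 0.05, 1.0), method="Nelder-Mead")
print("ML estimates (amp, center, sigma, bkg):", np.round(fit.x, 3))
```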

  20. Diffusion Tensor Estimation by Maximizing Rician Likelihood

    PubMed Central

    Landman, Bennett; Bazin, Pierre-Louis; Prince, Jerry

    2012-01-01

    Diffusion tensor imaging (DTI) is widely used to characterize white matter in health and disease. Previous approaches to the estimation of diffusion tensors have either been statistically suboptimal or have used Gaussian approximations of the underlying noise structure, which is Rician in reality. This can cause quantities derived from these tensors — e.g., fractional anisotropy and apparent diffusion coefficient — to diverge from their true values, potentially leading to artifactual changes that confound clinically significant ones. This paper presents a novel maximum likelihood approach to tensor estimation, denoted Diffusion Tensor Estimation by Maximizing Rician Likelihood (DTEMRL). In contrast to previous approaches, DTEMRL considers the joint distribution of all observed data in the context of an augmented tensor model to account for variable levels of Rician noise. To improve numeric stability and prevent non-physical solutions, DTEMRL incorporates a robust characterization of positive definite tensors and a new estimator of underlying noise variance. In simulated and clinical data, mean squared error metrics show consistent and significant improvements from low clinical SNR to high SNR. DTEMRL may be readily supplemented with spatial regularization or a priori tensor distributions for Bayesian tensor estimation. PMID:23132746
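
    The key ingredient named in the abstract is the Rician likelihood of magnitude data. The sketch below is a toy one-parameter example rather than the DTEMRL tensor estimator: it writes down the Rician negative log-likelihood (using the exponentially scaled Bessel function for numerical stability, an implementation choice assumed here) and maximizes it for a single signal amplitude with a known noise level.

```python
import numpy as np
from scipy.special import i0e
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Simulated magnitude MR signal: true amplitude A corrupted by complex Gaussian noise.
A_true, sigma, n = 3.0, 1.0, 500
m = np.abs(A_true + sigma * rng.standard_normal(n) + 1j * sigma * rng.standard_normal(n))

def rician_neg_log_lik(A, m, sigma):
    """Negative log-likelihood of magnitude data under the Rician model."""
    z = m * A / sigma**2
    # log I0(z) computed stably as log(i0e(z)) + z
    log_pdf = (np.log(m / sigma**2) - (m**2 + A**2) / (2 * sigma**2)
               + np.log(i0e(z)) + z)
    return -np.sum(log_pdf)

fit = minimize_scalar(lambda A: rician_neg_log_lik(A, m, sigma),
                      bounds=(0.01, 10.0), method="bounded")
print("ML amplitude estimate:", round(fit.x, 3), "(true value:", A_true, ")")
```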

  1. Developmental Changes in Children's Understanding of Future Likelihood and Uncertainty

    ERIC Educational Resources Information Center

    Lagattuta, Kristin Hansen; Sayfan, Liat

    2011-01-01

    Two measures assessed 4-10-year-olds' and adults' (N = 201) understanding of future likelihood and uncertainty. In one task, participants sequenced sets of event pictures varying by one physical dimension according to increasing future likelihood. In a separate task, participants rated characters' thoughts about the likelihood of future events,…

  2. Maximum Likelihood Estimation of Multivariate Polyserial and Polychoric Correlation Coefficients.

    ERIC Educational Resources Information Center

    Poon, Wai-Yin; Lee, Sik-Yum

    1987-01-01

    Reparameterization is used to find the maximum likelihood estimates of parameters in a multivariate model having some component variable observable only in polychotomous form. Maximum likelihood estimates are found by a Fletcher Powell algorithm. In addition, the partition maximum likelihood method is proposed and illustrated. (Author/GDC)

  3. Chemical logging of geothermal wells

    DOEpatents

    Allen, Charles A.; McAtee, Richard E.

    1981-01-01

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

  4. Chemical logging of geothermal wells

    DOEpatents

    Allen, C.A.; McAtee, R.E.

    The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.

  5. Precision pressure/temperature logging tool

    SciTech Connect

    Henfling, J.A.; Normann, R.A.

    1998-01-01

    Past memory logging tools have provided excellent pressure/temperature data when used in a geothermal environment, and they are easier to maintain and deploy than tools requiring an electric wireline connection to the surface. However, they are deficient since the tool operator is unaware of downhole conditions that could require changes in the logging program. Tools that make "decisions" based on preprogrammed scenarios can partially overcome this difficulty, and a suite of such memory tools has been developed at Sandia National Laboratories. The first tool, which forms the basis for future instruments, measures pressure and temperature. Design considerations include a minimization of cost while insuring quality data, size compatibility with diamond-cored holes, operation in holes to 425 C (800 F), transportability by ordinary passenger air service, and ease of operation. This report documents the development and construction of the pressure/temperature tool. It includes: (1) description of the major components; (2) calibration; (3) typical logging scenario; (4) tool data examples; and (5) conclusions. The mechanical and electrical drawings, along with the tool's software, will be furnished upon request.

  6. A composite lithology log while drilling

    SciTech Connect

    Tannenbaum, E.; Sutcliffe, B.; Franks, A.

    1988-02-01

    A new method for producing a computerized composite lithology log (CLL) while drilling by integrating MWD (measurement while drilling) and surface data is described. The CLL integrates three types of data (MWD mechanical, MWD geophysical, and surface cuttings) acquired during drilling, in three time stages: (1) Real Time. MWD drilling mechanical data including the rate of penetration and the downhole torque. This stage would provide bed boundaries and some inferred lithology. This would assist the driller with immediate drilling decisions and determine formation tops for coring, casing point, and correlation. (2) MWD Time. Recomputation of the above by adding MWD geophysical data (gamma-ray, resistivity, neutron-density). This stage would upgrade the lithology inference, and give higher resolution of bed boundaries. (3) Lag Time. Detailed analysis of surface cuttings to confirm the inferred lithologies. This last input will result in a high-quality CLL with accurate lithologies and bed boundaries. The log will serve the geologist as well as the driller, petrophysicist, and reservoir engineer. It will form the basis for more comprehensive formation evaluation while drilling by adding hydrocarbon and MWD log data.

  7. Multimodal Likelihoods in Educational Assessment: Will the Real Maximum Likelihood Score Please Stand up?

    ERIC Educational Resources Information Center

    Wothke, Werner; Burket, George; Chen, Li-Sue; Gao, Furong; Shu, Lianghua; Chia, Mike

    2011-01-01

    It has been known for some time that item response theory (IRT) models may exhibit a likelihood function of a respondent's ability which may have multiple modes, flat modes, or both. These conditions, often associated with guessing of multiple-choice (MC) questions, can introduce uncertainty and bias to ability estimation by maximum likelihood…

  8. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  9. Groups, information theory, and Einstein's likelihood principle

    NASA Astrophysics Data System (ADS)

    Sicuro, Gabriele; Tempesta, Piergiulio

    2016-04-01

    We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.

  10. Groups, information theory, and Einstein's likelihood principle.

    PubMed

    Sicuro, Gabriele; Tempesta, Piergiulio

    2016-04-01

    We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.

  12. A Maximum-Likelihood Approach to Force-Field Calibration.

    PubMed

    Zaborowski, Bartłomiej; Jagieła, Dawid; Czaplewski, Cezary; Hałabis, Anna; Lewandowska, Agnieszka; Żmudzińska, Wioletta; Ołdziej, Stanisław; Karczyńska, Agnieszka; Omieczynski, Christian; Wirecki, Tomasz; Liwo, Adam

    2015-09-28

    A new approach to the calibration of the force fields is proposed, in which the force-field parameters are obtained by maximum-likelihood fitting of the calculated conformational ensembles to the experimental ensembles of training system(s). The maximum-likelihood function is composed of logarithms of the Boltzmann probabilities of the experimental conformations, calculated with the current energy function. Because the theoretical distribution is given in the form of the simulated conformations only, the contributions from all of the simulated conformations, with Gaussian weights in the distances from a given experimental conformation, are added to give the contribution to the target function from this conformation. In contrast to earlier methods for force-field calibration, the approach does not suffer from the arbitrariness of dividing the decoy set into native-like and non-native structures; however, if such a division is made instead of using Gaussian weights, application of the maximum-likelihood method results in the well-known energy-gap maximization. The computational procedure consists of cycles of decoy generation and maximum-likelihood-function optimization, which are iterated until convergence is reached. The method was tested with Gaussian distributions and then applied to the physics-based coarse-grained UNRES force field for proteins. The NMR structures of the tryptophan cage, a small α-helical protein, determined at three temperatures (T = 280, 305, and 313 K) by Hałabis et al. ( J. Phys. Chem. B 2012 , 116 , 6898 - 6907 ), were used. Multiplexed replica-exchange molecular dynamics was used to generate the decoys. The iterative procedure exhibited steady convergence. Three variants of optimization were tried: optimization of the energy-term weights alone and use of the experimental ensemble of the folded protein only at T = 280 K (run 1); optimization of the energy-term weights and use of experimental ensembles at all three temperatures (run 2

  13. Self-consistent Modeling of the logN-logS in the Poisson Limit

    NASA Astrophysics Data System (ADS)

    Sourlas, E.; Kashyap, V.; Zezas, A.; van Dyk, D.

    2004-08-01

    logN-logS curves are a fundamental tool in the study of source populations, luminosity functions, and cosmological parameters. However, their determination is hampered by statistical effects such as the Eddington bias, incompleteness due to detection efficiency, faint source flux fluctuations, etc. Here we present a new and powerful method using the full Poisson machinery that allows us to model the logN-logS distribution of X-ray sources in a self-consistent manner. Because we properly account for all the above statistical effects, our modeling is valid over the full range of the data. We use a Bayesian approach, modeling the fluxes with known functional forms such as simple or broken power-laws. The expected photon counts are conditioned on the fluxes, the background contamination, effective area, detector vignetting, and detection probability. The built-in flexibility of the algorithm also allows a simultaneous analysis of multiple datasets. We demonstrate the power of our algorithm by applying it to a set of Chandra observations. This project is part of the California-Harvard/CXC AstroStatistics Collaboration. The authors gratefully acknowledge funding for this project partially provided by NSF grant DMS-01-04129 and by NASA Contract NAS8-39073, and NASA grants NCC2-1350 and NAG5-13056.
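
    To make the modeling idea concrete, the sketch below fits a simple cumulative power-law logN-logS, N(>S) = K * S^(-alpha), to binned source counts by maximizing a Poisson likelihood. It is a deliberately stripped-down, hypothetical example: it ignores the detection efficiency, background contamination, effective area, and Eddington-bias corrections that the full Bayesian treatment described above builds in.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Flux bin edges and a "true" cumulative power law N(>S) = K * S**(-alpha).
edges = np.logspace(-14, -12, 11)        # hypothetical flux bin edges
K_true, alpha_true = 1e-12, 1.0          # hypothetical normalization and slope

def expected_counts(K, alpha):
    n_gt = K * edges ** (-alpha)         # cumulative counts above each bin edge
    return n_gt[:-1] - n_gt[1:]          # expected number of sources in each bin

observed = rng.poisson(expected_counts(K_true, alpha_true))

def neg_log_lik(params):
    log_K, alpha = params
    mu = expected_counts(np.exp(log_K), alpha)
    if np.any(mu <= 0):
        return np.inf
    return np.sum(mu - observed * np.log(mu))   # Poisson NLL up to a constant

fit = minimize(neg_log_lik, x0=[np.log(1e-12), 1.5], method="Nelder-Mead")
print("ML estimates: K =", float(np.exp(fit.x[0])), ", alpha =", round(float(fit.x[1]), 3))
```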

  14. TIOGA LAKE, HALL NATURAL AREA, LOG CABIN-SADDLEBAG, AND HORSE MEADOWS ROADLESS AREAS, CALIFORNIA.

    USGS Publications Warehouse

    Seitz, J.F.; Federspiel, F.E.

    1984-01-01

    Studies of the geology and mineral resources of the Tioga Lake, Hall Natural Area, Log Cabin-Saddlebag, and Horse Meadows Roadless Areas in California indicate that parts of the Log Cabin-Saddlebag and Hall Natural Roadless Areas have a substantiated resource potential for gold and (or) silver resources, and a probable potential for tungsten and molybdenum resources. Tioga Lake Roadless Area has little likelihood for the occurrence of mineral resources and the Horse Meadows Roadless Area has a probable potential for low-grade tungsten, gold, and (or) silver resources. The geologic terrane in the roadless areas precludes the occurrence of organic fuel resources.

  15. Avian responses to selective logging shaped by species traits and logging practices.

    PubMed

    Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S; Koh, Lian Pin

    2015-06-01

    Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673

  17. Data Mining of Network Logs

    NASA Technical Reports Server (NTRS)

    Collazo, Carlimar

    2011-01-01

    The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically, gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedures used for analysis and data reduction will be a computer program which would analyze the URL and identify advertisement links among the actual content links.
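
    A minimal version of the URL breakdown and advertisement-identification steps described above can be done with Python's standard library. The snippet below uses a hypothetical URL and a hand-picked keyword list; a production incident-analysis tool would rely on curated ad-domain lists rather than these assumed markers.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical log entry; a real monitoring log would supply many of these per line.
url = "https://ads.example.com:8080/banner/click?campaign=123&size=300x250"

parts = urlparse(url)
print("protocol:", parts.scheme)
print("host:    ", parts.hostname)
print("domain:  ", ".".join(parts.hostname.split(".")[-2:]))  # naive registered domain
print("port:    ", parts.port)
print("path:    ", parts.path)
print("query:   ", parse_qs(parts.query))

# A crude data-reduction rule: flag likely advertisement URLs by host/path keywords.
ad_markers = ("ads.", "doubleclick", "/banner/", "adserver")
print("looks like an ad:", any(m in url for m in ad_markers))
```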

  18. Limits on Log Odds Ratios for Unidimensional Item Response Theory Models

    ERIC Educational Resources Information Center

    Haberman, Shelby J.; Holland, Paul W.; Sinharay, Sandip

    2007-01-01

    Bounds are established for log odds ratios (log cross-product ratios) involving pairs of items for item response models. First, expressions for bounds on log odds ratios are provided for one-dimensional item response models in general. Then, explicit bounds are obtained for the Rasch model and the two-parameter logistic (2PL) model. Results are…

  19. 40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...

  20. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...

  1. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...

  2. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...

  3. 40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...

  4. 40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...

  5. 40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...

  6. 40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...

  7. 29 CFR 1917.18 - Log handling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations...) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure that structures (bunks) used to contain logs have rounded corners and rounded structural parts to avoid...

  8. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain utilization logs showing for each sealed source the following information: (1) A description, including...

  9. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Station log. 73.1820 Section 73.1820... Rules Applicable to All Broadcast Stations § 73.1820 Station log. (a) Entries must be made in the station log either manually by a person designated by the licensee who is in actual charge of...

  10. 47 CFR 87.109 - Station logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station logs. 87.109 Section 87.109... Operating Requirements and Procedures Operating Procedures § 87.109 Station logs. (a) A station at a fixed location in the international aeronautical mobile service must maintain a log in accordance with Annex...

  11. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system needs to be developed before aerial logging will become effective and accepted in the logging industry. This paper presents such a system, designed on simple principles with realistic cost and ecological benefits.

  12. Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions

    PubMed Central

    Barrett, Harrison H.; Dainty, Christopher; Lara, David

    2008-01-01

    Maximum-likelihood (ML) estimation in wavefront sensing requires careful attention to all noise sources and all factors that influence the sensor data. We present detailed probability density functions for the output of the image detector in a wavefront sensor, conditional not only on wavefront parameters but also on various nuisance parameters. Practical ways of dealing with nuisance parameters are described, and final expressions for likelihoods and Fisher information matrices are derived. The theory is illustrated by discussing Shack–Hartmann sensors, and computational requirements are discussed. Simulation results show that ML estimation can significantly increase the dynamic range of a Shack–Hartmann sensor with four detectors and that it can reduce the residual wavefront error when compared with traditional methods. PMID:17206255

  13. Effect of radiance-to-reflectance transformation and atmosphere removal on maximum likelihood classification accuracy of high-dimensional remote sensing data

    NASA Technical Reports Server (NTRS)

    Hoffbeck, Joseph P.; Landgrebe, David A.

    1994-01-01

    Many analysis algorithms for high-dimensional remote sensing data require that the remotely sensed radiance spectra be transformed to approximate reflectance to allow comparison with a library of laboratory reflectance spectra. In maximum likelihood classification, however, the remotely sensed spectra are compared to training samples, thus a transformation to reflectance may or may not be helpful. The effect of several radiance-to-reflectance transformations on maximum likelihood classification accuracy is investigated in this paper. We show that the empirical line approach, LOWTRAN7, flat-field correction, single spectrum method, and internal average reflectance are all non-singular affine transformations, and that non-singular affine transformations have no effect on discriminant analysis feature extraction and maximum likelihood classification accuracy. (An affine transformation is a linear transformation with an optional offset.) Since the Atmosphere Removal Program (ATREM) and the log residue method are not affine transformations, experiments with Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were conducted to determine the effect of these transformations on maximum likelihood classification accuracy. The average classification accuracy of the data transformed by ATREM and the log residue method was slightly less than the accuracy of the original radiance data. Since the radiance-to-reflectance transformations allow direct comparison of remotely sensed spectra with laboratory reflectance spectra, they can be quite useful in labeling the training samples required by maximum likelihood classification, but these transformations have only a slight effect or no effect at all on discriminant analysis and maximum likelihood classification accuracy.
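
    The invariance claim above, that a non-singular affine transformation of the data cannot change Gaussian maximum likelihood class assignments, is easy to verify numerically. The sketch below uses synthetic multi-band data and a random, assumed affine map (standing in for an empirical-line style correction); the decisions before and after the transformation come out identical because the log-determinant term introduced by the transformation is common to every class.

```python
import numpy as np

rng = np.random.default_rng(4)

def gaussian_ml_classify(train_a, train_b, test):
    """Two-class maximum likelihood (quadratic) classifier with full covariances."""
    def log_lik(x, data):
        mu = data.mean(axis=0)
        cov = np.cov(data, rowvar=False)
        diff = x - mu
        return -0.5 * (np.log(np.linalg.det(cov)) + diff @ np.linalg.solve(cov, diff))
    return np.array([0 if log_lik(x, train_a) > log_lik(x, train_b) else 1
                     for x in test])

# Toy "radiance" data in 5 bands for two classes, plus test pixels.
a = rng.normal(0.0, 1.0, (60, 5))
b = rng.normal(1.0, 1.5, (60, 5))
test = rng.normal(0.5, 1.5, (30, 5))

# A non-singular affine transformation (linear map plus offset).
A = rng.normal(size=(5, 5)) + 5 * np.eye(5)   # well-conditioned, non-singular
offset = rng.normal(size=5)
to_refl = lambda X: X @ A.T + offset

labels_radiance = gaussian_ml_classify(a, b, test)
labels_reflect = gaussian_ml_classify(to_refl(a), to_refl(b), to_refl(test))
print("identical decisions:", np.array_equal(labels_radiance, labels_reflect))
```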

  14. Maximum likelihood versus likelihood-free quantum system identification in the atom maser

    NASA Astrophysics Data System (ADS)

    Catana, Catalin; Kypraios, Theodore; Guţă, Mădălin

    2014-10-01

    We consider the problem of estimating a dynamical parameter of a Markovian quantum open system (the atom maser), by performing continuous time measurements in the system's output (outgoing atoms). Two estimation methods are investigated and compared. Firstly, the maximum likelihood estimator (MLE) takes into account the full measurement data and is asymptotically optimal in terms of its mean square error. Secondly, the ‘likelihood-free’ method of approximate Bayesian computation (ABC) produces an approximation of the posterior distribution for a given set of summary statistics, by sampling trajectories at different parameter values and comparing them with the measurement data via chosen statistics. Building on previous results which showed that atom counts are poor statistics for certain values of the Rabi angle, we apply MLE to the full measurement data and estimate its Fisher information. We then select several correlation statistics such as waiting times, distribution of successive identical detections, and use them as input of the ABC algorithm. The resulting posterior distribution follows closely the data likelihood, showing that the selected statistics capture ‘most’ statistical information about the Rabi angle.
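
    The contrast drawn above, full-data maximum likelihood versus summary-statistic ABC, can be reproduced on a toy problem. The sketch below uses a Poisson count model with a single rate parameter as an assumed stand-in for the measurement record: the MLE uses the full data (for Poisson counts it is simply the sample mean), while rejection ABC keeps prior draws whose simulated summary statistic lands close to the observed one.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy data: counts generated with an unknown rate parameter theta.
theta_true = 4.0
data = rng.poisson(theta_true, size=200)

# Maximum likelihood: for i.i.d. Poisson counts the MLE is the sample mean.
theta_mle = data.mean()

# Likelihood-free ABC rejection: simulate at parameter values drawn from the prior
# and keep those whose chosen summary statistic (here the mean) is close to the data's.
prior_draws = rng.uniform(0.0, 10.0, size=20000)
summaries = np.array([rng.poisson(t, size=200).mean() for t in prior_draws])
accepted = prior_draws[np.abs(summaries - data.mean()) < 0.05]

print("MLE:", round(theta_mle, 3))
print("ABC posterior mean:", round(accepted.mean(), 3), "(", accepted.size, "draws accepted )")
```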

  15. Maximum likelihood inference of reticulate evolutionary histories.

    PubMed

    Yu, Yun; Dong, Jianrong; Liu, Kevin J; Nakhleh, Luay

    2014-11-18

    Hybridization plays an important role in the evolution of certain groups of organisms, adaptation to their environments, and diversification of their genomes. The evolutionary histories of such groups are reticulate, and methods for reconstructing them are still in their infancy and have limited applicability. We present a maximum likelihood method for inferring reticulate evolutionary histories while accounting simultaneously for incomplete lineage sorting. Additionally, we propose methods for assessing confidence in the amount of reticulation and the topology of the inferred evolutionary history. Our method obtains accurate estimates of reticulate evolutionary histories on simulated datasets. Furthermore, our method provides support for a hypothesis of a reticulate evolutionary history inferred from a set of house mouse (Mus musculus) genomes. As evidence of hybridization in eukaryotic groups accumulates, it is essential to have methods that infer reticulate evolutionary histories. The work we present here allows for such inference and provides a significant step toward putting phylogenetic networks on par with phylogenetic trees as a model of capturing evolutionary relationships. PMID:25368173

  16. The maximum likelihood dating of magnetostratigraphic sections

    NASA Astrophysics Data System (ADS)

    Man, Otakar

    2011-04-01

    In general, stratigraphic sections are dated by biostratigraphy and magnetic polarity stratigraphy (MPS) is subsequently used to improve the dating of specific section horizons or to correlate these horizons in different sections of similar age. This paper shows, however, that the identification of a record of a sufficient number of geomagnetic polarity reversals against a reference scale often does not require any complementary information. The deposition and possible subsequent erosion of the section is herein regarded as a stochastic process, whose discrete time increments are independent and normally distributed. This model enables the expression of the time dependence of the magnetic record of section increments in terms of probability. To date samples bracketing the geomagnetic polarity reversal horizons, their levels are combined with various sequences of successive polarity reversals drawn from the reference scale. Each particular combination gives rise to specific constraints on the unknown ages of the primary remanent magnetization of samples. The problem is solved by the constrained maximization of the likelihood function with respect to these ages and parameters of the model, and by subsequent maximization of this function over the set of possible combinations. A statistical test of the significance of this solution is given. The application of this algorithm to various published magnetostratigraphic sections that included nine or more polarity reversals gave satisfactory results. This possible self-sufficiency makes MPS less dependent on other dating techniques.

  17. PAML 4: phylogenetic analysis by maximum likelihood.

    PubMed

    Yang, Ziheng

    2007-08-01

    PAML, currently in version 4, is a package of programs for phylogenetic analyses of DNA and protein sequences using maximum likelihood (ML). The programs may be used to compare and test phylogenetic trees, but their main strengths lie in the rich repertoire of evolutionary models implemented, which can be used to estimate parameters in models of sequence evolution and to test interesting biological hypotheses. Uses of the programs include estimation of synonymous and nonsynonymous rates (d(N) and d(S)) between two protein-coding DNA sequences, inference of positive Darwinian selection through phylogenetic comparison of protein-coding genes, reconstruction of ancestral genes and proteins for molecular restoration studies of extinct life forms, combined analysis of heterogeneous data sets from multiple gene loci, and estimation of species divergence times incorporating uncertainties in fossil calibrations. This note discusses some of the major applications of the package, which includes example data sets to demonstrate their use. The package is written in ANSI C, and runs under Windows, Mac OSX, and UNIX systems. It is available at -- (http://abacus.gene.ucl.ac.uk/software/paml.html).

  18. Physically constrained maximum likelihood mode filtering.

    PubMed

    Papp, Joseph C; Preisig, James C; Morozov, Andrey K

    2010-04-01

    Mode filtering is most commonly implemented using the sampled mode shapes or pseudoinverse algorithms. Buck et al. [J. Acoust. Soc. Am. 103, 1813-1824 (1998)] placed these techniques in the context of a broader maximum a posteriori (MAP) framework. However, the MAP algorithm requires that the signal and noise statistics be known a priori. Adaptive array processing algorithms are candidates for improving performance without the need for a priori signal and noise statistics. A variant of the physically constrained, maximum likelihood (PCML) algorithm [A. L. Kraay and A. B. Baggeroer, IEEE Trans. Signal Process. 55, 4048-4063 (2007)] is developed for mode filtering that achieves the same performance as the MAP mode filter yet does not need a priori knowledge of the signal and noise statistics. The central innovation of this adaptive mode filter is that the received signal's sample covariance matrix, as estimated by the algorithm, is constrained to be that which can be physically realized given a modal propagation model and an appropriate noise model. Shallow water simulation results are presented showing the benefit of using the PCML method in adaptive mode filtering.
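
    For context, the sketch below implements the conventional pseudoinverse mode filter that the abstract contrasts with PCML; the adaptive PCML algorithm itself is not reproduced here. The sinusoidal "mode shapes" and synthetic pressure data are placeholders.

import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_modes = 16, 4
z = np.linspace(0.0, 1.0, n_sensors)

# Toy "mode shapes": sinusoids in depth, standing in for the normal modes of a waveguide.
E = np.stack([np.sin((m + 1) * np.pi * z) for m in range(n_modes)], axis=1)

a_true = rng.normal(size=n_modes)                    # true modal amplitudes
p = E @ a_true + 0.05 * rng.normal(size=n_sensors)   # received pressure across the array, plus noise

a_hat = np.linalg.pinv(E) @ p                        # pseudoinverse mode filter
print(np.round(a_true, 3), np.round(a_hat, 3))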

  19. Disequilibrium mapping: Composite likelihood for pairwise disequilibrium

    SciTech Connect

    Devlin, B.; Roeder, K.; Risch, N.

    1996-08-15

    The pattern of linkage disequilibrium between a disease locus and a set of marker loci has been shown to be a useful tool for geneticists searching for disease genes. Several methods have been advanced to utilize the pairwise disequilibrium between the disease locus and each of a set of marker loci. However, none of the methods take into account the information from all pairs simultaneously while also modeling the variability in the disequilibrium values due to the evolutionary dynamics of the population. We propose a Composite Likelihood (CL) model that has these features when the physical distances between the marker loci are known or can be approximated. In this instance, and assuming that there is a single disease mutation, the CL model depends on only three parameters, the recombination fraction between the disease locus and an arbitrary marker locus, θ, the age of the mutation, and a variance parameter. When the CL is maximized over a grid of θ, it provides a graph that can direct the search for the disease locus. We also show how the CL model can be generalized to account for multiple disease mutations. Evolutionary simulations demonstrate the power of the analyses, as well as their potential weaknesses. Finally, we analyze the data from two mapped diseases, cystic fibrosis and diastrophic dysplasia, finding that the CL method performs well in both cases. 28 refs., 6 figs., 4 tabs.
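
    A toy illustration of maximizing a composite likelihood over a grid of candidate locus positions, profiling out the nuisance parameters on a crude inner grid. The exponential decay model and all numbers are placeholders, not the model proposed in the paper.

import numpy as np
from scipy.stats import norm

marker_pos = np.array([0.0, 0.2, 0.5, 0.9, 1.3, 2.0])        # marker positions (cM, placeholder)
D_obs      = np.array([0.30, 0.34, 0.25, 0.18, 0.10, 0.05])  # observed pairwise disequilibrium (placeholder)

def composite_loglik(x, age, sd):
    # Placeholder decay model: expected LD falls off exponentially with distance from x.
    expected = 0.4 * np.exp(-age * np.abs(marker_pos - x))
    return norm.logpdf(D_obs, loc=expected, scale=sd).sum()

positions = np.linspace(0.0, 2.0, 81)
profile = [max(composite_loglik(x, age, sd)
               for age in np.linspace(0.5, 5.0, 10)      # crude inner grid over nuisance parameters
               for sd in np.linspace(0.02, 0.2, 10))
           for x in positions]
print("composite-likelihood peak near %.2f cM" % positions[int(np.argmax(profile))])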

  20. Dimension-independent likelihood-informed MCMC

    SciTech Connect

    Cui, Tiangang; Law, Kody J. H.; Marzouk, Youssef M.

    2015-10-08

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. Our work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. There are two distinct lines of research that intersect in the methods we develop here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Finally, we use two nonlinear inverse problems in order to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
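
    The sketch below shows the simpler preconditioned Crank-Nicolson (pCN) move, the standard example of a proposal that remains well defined on function space so that acceptance rates do not collapse as the discretization is refined; DILI's likelihood-informed, operator-weighted directions build on this idea but are not reproduced here. The Gaussian prior and single-observation likelihood are placeholders.

import numpy as np

rng = np.random.default_rng(2)

def pcn_mcmc(log_likelihood, sample_prior, n_iter=5000, beta=0.2):
    u = sample_prior()
    ll = log_likelihood(u)
    samples = []
    for _ in range(n_iter):
        v = np.sqrt(1.0 - beta**2) * u + beta * sample_prior()  # prior-preserving pCN proposal
        ll_v = log_likelihood(v)
        if np.log(rng.uniform()) < ll_v - ll:                   # accept with prob min(1, exp(ll_v - ll))
            u, ll = v, ll_v
        samples.append(u.copy())
    return np.array(samples)

# Toy setup: Gaussian prior N(0, I) on a 200-dimensional "function", single observation y = u[0] + noise.
d, y, sigma = 200, 1.3, 0.1
chain = pcn_mcmc(lambda u: -0.5 * (y - u[0])**2 / sigma**2,
                 lambda: rng.normal(size=d))
print("posterior mean of u[0]: %.3f" % chain[:, 0].mean())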

  1. Dimension-independent likelihood-informed MCMC

    DOE PAGES

    Cui, Tiangang; Law, Kody J. H.; Marzouk, Youssef M.

    2015-10-08

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. Our work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. There are two distinct lines of research that intersect in the methods we develop here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Finally, we use two nonlinear inverse problems in order to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.

  2. A case study of Membrane Testing, Integrity and Estimations of Log Removal Value per the Long Term 2 Enhanced Surface Water Treatment Rule.

    EPA Science Inventory

    The SWTR-LT2 Rule addresses the use of membranes that may be used to achieve credits for the reduction of Cryptosporidium. The LT2 Rule requires product-specific testing of membranes, considered an alternative filtration technology by the rule. LT2 Rule allows the product speci...
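
    For reference, the log removal value (LRV) used for such credits is the base-10 logarithm of the feed-to-filtrate concentration ratio; the sketch below simply evaluates that definition with illustrative numbers.

import math

def log_removal_value(c_feed, c_filtrate):
    # LRV = log10(feed concentration / filtrate concentration)
    return math.log10(c_feed / c_filtrate)

print(log_removal_value(1.0e6, 1.0e2))   # 4.0, i.e. a "4-log" reduction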

  3. The Equivalence of Information-Theoretic and Likelihood-Based Methods for Neural Dimensionality Reduction

    PubMed Central

    Williamson, Ross S.; Sahani, Maneesh; Pillow, Jonathan W.

    2015-01-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron’s probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as “single-spike information” to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex. PMID:25831448
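
    A minimal sketch of the Poisson log-likelihood for an LNP model, the quantity the abstract relates to single-spike information. The exponential nonlinearity, the filter, and the simulated data are placeholder assumptions.

import numpy as np

rng = np.random.default_rng(3)
T, D = 5000, 20
X = rng.normal(size=(T, D))            # stimulus matrix (T time bins, D stimulus dimensions)
k_true = rng.normal(size=D) / np.sqrt(D)
rate = np.exp(X @ k_true - 1.0)        # exponential nonlinearity -> spikes per bin
y = rng.poisson(rate)                  # simulated spike counts

def lnp_loglik(k, X, y):
    lam = np.exp(X @ k - 1.0)
    # Poisson log-likelihood, dropping the k-independent log(y!) term
    return np.sum(y * np.log(lam) - lam)

print("true filter: %.1f, shuffled filter: %.1f"
      % (lnp_loglik(k_true, X, y), lnp_loglik(rng.permutation(k_true), X, y)))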

  4. Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET

    SciTech Connect

    Gopich, Irina V.

    2015-01-21

    Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a “chevron” shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated.
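
    A sketch of a photon-by-photon likelihood for a two-state model, in the simplified form that conditions on the photon arrival times: L = 1^T prod_k [F(c_k) exp(K*dt_k)] F(c_1) p_eq, with K the 2x2 rate matrix and F(c) a diagonal matrix of per-state color probabilities. The rates, efficiencies and photon sequence are placeholders, and this form omits refinements (background, state-dependent count rates) that a full analysis would include.

import numpy as np
from scipy.linalg import expm

k12, k21 = 0.5, 0.8              # transition rates between the two states (1/ms, placeholder)
E = np.array([0.2, 0.9])         # probability that a photon is acceptor-colored in each state
K = np.array([[-k12,  k21],
              [ k12, -k21]])     # rate matrix, column convention dp/dt = K p
p_eq = np.array([k21, k12]) / (k12 + k21)

def F(color):                    # color: 1 = acceptor photon, 0 = donor photon
    return np.diag(E if color == 1 else 1.0 - E)

def log_likelihood(colors, dts):
    v = F(colors[0]) @ p_eq
    logL = 0.0
    for c, dt in zip(colors[1:], dts):
        v = F(c) @ expm(K * dt) @ v
        s = v.sum()
        logL += np.log(s)
        v = v / s                # rescale to avoid underflow on long sequences
    return logL

colors = np.array([1, 1, 0, 1, 0, 0, 1, 1])
dts    = np.array([0.3, 0.1, 0.5, 0.2, 0.4, 0.1, 0.6])   # inter-photon times (ms)
print(log_likelihood(colors, dts))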

  5. The logN-logS relationship of normal X-ray emitting galaxies

    NASA Astrophysics Data System (ADS)

    Tajer, M.; Trinchieri, G.; Wolter, A.; Campana, S.; Moretti, A.; Tagliaferri, G.

    We have selected a flux-limited serendipitous sample of galaxies from the cross-correlation of the BMW (Brera Multiscale Wavelet) ROSAT HRI and the LEDA (Lyon - Meudon Extragalactic Database) Catalogues. This sample is used to study the X-ray properties of normal galaxies in the local universe. We also find that the logN-logS distribution we derived for a serendipitous subsample, optically and X-ray flux limited, is consistent with the Euclidean slope in the flux range FX(0.5 - 2) ˜ 1.1 - 110 × 10-14 erg cm-2 s-1. We further show that the same law is valid over 4 decades, from the bright sample derived from the RASS data to the very faint detections in deep XMM-Newton fields.
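
    A quick numerical sketch of what a Euclidean logN-logS means in practice: draw fluxes with N(>S) proportional to S^-1.5, build the cumulative counts, and fit the slope in log-log space. The flux limit and sample size are arbitrary.

import numpy as np

rng = np.random.default_rng(4)
S_min = 1e-14                                        # flux limit (erg cm^-2 s^-1, placeholder)
S = S_min * rng.uniform(size=2000) ** (-1.0 / 1.5)   # draw fluxes with N(>S) ~ S^-1.5

S_sorted = np.sort(S)
N_gt = np.arange(len(S), 0, -1)                      # cumulative counts N(>S) at each sorted flux
slope = np.polyfit(np.log10(S_sorted), np.log10(N_gt), 1)[0]
print("fitted logN-logS slope: %.2f (Euclidean value: -1.5)" % slope)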

  6. Logging while fishing technique results in substantial savings

    SciTech Connect

    Tollefsen, E.; Everett, M.

    1996-12-01

    During wireline logging operations, tools occasionally become stuck in the borehole and require fishing. A typical fishing job can take anywhere from 1 1/2 to 4 days. In the Gulf of Mexico, a fishing job can easily cost between $100,000 and $500,000. These costs result from nonproductive time during the fishing trip, associated wiper trip and relogging the well. Logging while fishing (LWF) technology is a patented system capable of retrieving a stuck fish and completing the logging run during the same pipe descent. Completing logging operations using the LWF method saves time and money. The technique also provides well information where data may not otherwise have been obtained. Other benefits include reduced fishing time and an increased level of safety.

  7. Estimating the population density of the Asian tapir (Tapirus indicus) in a selectively logged forest in Peninsular Malaysia.

    PubMed

    Rayan, D Mark; Mohamad, Shariff Wan; Dorward, Leejiah; Aziz, Sheema Abdul; Clements, Gopalasamy Reuben; Christopher, Wong Chai Thiam; Traeholt, Carl; Magintan, David

    2012-12-01

    The endangered Asian tapir (Tapirus indicus) is threatened by large-scale habitat loss, forest fragmentation and increased hunting pressure. Conservation planning for this species, however, is hampered by a severe paucity of information on its ecology and population status. We present the first Asian tapir population density estimate from a camera trapping study targeting tigers in a selectively logged forest within Peninsular Malaysia using a spatially explicit capture-recapture maximum likelihood based framework. With a trap effort of 2496 nights, 17 individuals were identified corresponding to a density (standard error) estimate of 9.49 (2.55) adult tapirs/100 km(2) . Although our results include several caveats, we believe that our density estimate still serves as an important baseline to facilitate the monitoring of tapir population trends in Peninsular Malaysia. Our study also highlights the potential of extracting vital ecological and population information for other cryptic individually identifiable animals from tiger-centric studies, especially with the use of a spatially explicit capture-recapture maximum likelihood based framework. PMID:23253368

  8. Estimating the population density of the Asian tapir (Tapirus indicus) in a selectively logged forest in Peninsular Malaysia.

    PubMed

    Rayan, D Mark; Mohamad, Shariff Wan; Dorward, Leejiah; Aziz, Sheema Abdul; Clements, Gopalasamy Reuben; Christopher, Wong Chai Thiam; Traeholt, Carl; Magintan, David

    2012-12-01

    The endangered Asian tapir (Tapirus indicus) is threatened by large-scale habitat loss, forest fragmentation and increased hunting pressure. Conservation planning for this species, however, is hampered by a severe paucity of information on its ecology and population status. We present the first Asian tapir population density estimate from a camera trapping study targeting tigers in a selectively logged forest within Peninsular Malaysia using a spatially explicit capture-recapture maximum likelihood based framework. With a trap effort of 2496 nights, 17 individuals were identified corresponding to a density (standard error) estimate of 9.49 (2.55) adult tapirs/100 km(2) . Although our results include several caveats, we believe that our density estimate still serves as an important baseline to facilitate the monitoring of tapir population trends in Peninsular Malaysia. Our study also highlights the potential of extracting vital ecological and population information for other cryptic individually identifiable animals from tiger-centric studies, especially with the use of a spatially explicit capture-recapture maximum likelihood based framework.

  9. CMBFIT: Rapid WMAP likelihood calculations with normal parameters

    NASA Astrophysics Data System (ADS)

    Sandvik, Håvard B.; Tegmark, Max; Wang, Xiaomin; Zaldarriaga, Matias

    2004-03-01

    We present a method for ultrafast confrontation of the Wilkinson Microwave Anisotropy Probe (WMAP) cosmic microwave background observations with theoretical models, implemented as a publicly available software package called CMBFIT, useful for anyone wishing to measure cosmological parameters by combining WMAP with other observations. The method takes advantage of the underlying physics by transforming into a set of parameters where the WMAP likelihood surface is accurately fit by the exponential of a quartic or sextic polynomial. Building on previous physics based approximations by Hu et al., Kosowsky et al., and Chu et al., it combines their speed with precision cosmology grade accuracy. A FORTRAN code for computing the WMAP likelihood for a given set of parameters is provided, precalibrated against CMBFAST, accurate to Δ ln L˜0.05 over the entire 2σ region of the parameter space for 6 parameter “vanilla” ΛCDM models. We also provide 7-parameter fits including spatial curvature, gravitational waves and a running spectral index.
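
    The core trick can be sketched in a few lines: tabulate ln L at sampled parameter points and fit it with a low-order polynomial in the (transformed) parameters, after which each likelihood evaluation is just a polynomial evaluation. The two-parameter quadratic "likelihood surface" below is a placeholder standing in for the WMAP surface.

import numpy as np
from itertools import combinations_with_replacement

def poly_features(theta, degree=4):
    # All monomials of the parameter vector up to the given degree, including the constant term.
    cols = [np.ones(len(theta))]
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(theta.shape[1]), d):
            cols.append(np.prod(theta[:, list(idx)], axis=1))
    return np.stack(cols, axis=1)

def true_lnL(t):
    # Placeholder "likelihood surface": a correlated 2-parameter Gaussian in the transformed parameters.
    return -0.5 * (3.0 * t[:, 0]**2 + t[:, 1]**2 + t[:, 0] * t[:, 1])

rng = np.random.default_rng(5)
theta = rng.uniform(-1, 1, size=(400, 2))                      # sampled parameter points
coeffs, *_ = np.linalg.lstsq(poly_features(theta), true_lnL(theta), rcond=None)

theta_test = rng.uniform(-1, 1, size=(5, 2))
print("max fit error:", np.max(np.abs(poly_features(theta_test) @ coeffs - true_lnL(theta_test))))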

  10. Likelihood of attending to the color word modulates Stroop interference.

    PubMed

    Cho, Yang Seok; Choi, Jong Moon; Proctor, Robert W

    2012-02-01

    Three experiments investigated whether the Stroop color-naming effect is modulated by the likelihood of a color word capturing visual attention. In Experiment 1, a bar or a neutral word was presented at fixation as a color carrier, along with a color word randomly appearing in either an achromatic color (white in the main experiment, gray in a follow-up) or purple. Reduction of the Stroop effect (known as Stroop dilution) occurred when the color word was achromatic but not (or to a lesser extent) when it was in purple. In Experiment 2, the color of the color word remained constant throughout trial blocks, and Stroop dilution was equally evident when the word was always in purple and when it was always in white. In Experiment 3, a color bar was presented as the color carrier with both a color word and a neutral word. In this case, the Stroop effect was larger when the color word appeared in purple, and smaller when the neutral word appeared in purple, than when neither word did. These results imply that the extent to which processing of a color word occurs is determined by the likelihood of the word capturing attention.

  11. Maximum-likelihood approach to strain imaging using ultrasound

    PubMed Central

    Insana, M. F.; Cook, L. T.; Bilgen, M.; Chaturvedi, P.; Zhu, Y.

    2009-01-01

    A maximum-likelihood (ML) strategy for strain estimation is presented as a framework for designing and evaluating bioelasticity imaging systems. Concepts from continuum mechanics, signal analysis, and acoustic scattering are combined to develop a mathematical model of the ultrasonic waveforms used to form strain images. The model includes three-dimensional (3-D) object motion described by affine transformations, Rayleigh scattering from random media, and 3-D system response functions. The likelihood function for these waveforms is derived to express the Fisher information matrix and variance bounds for displacement and strain estimation. The ML estimator is a generalized cross correlator for pre- and post-compression echo waveforms that is realized by waveform warping and filtering prior to cross correlation and peak detection. Experiments involving soft tissuelike media show the ML estimator approaches the Cramér–Rao error bound for small scaling deformations: at 5 MHz and 1.2% compression, the predicted lower bound for displacement errors is 4.4 µm and the measured standard deviation is 5.7 µm. PMID:10738797
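
    The basic operation inside such estimators can be sketched as follows: estimate the displacement between pre- and post-compression echo segments from the peak of their cross-correlation (the ML estimator described above adds waveform warping and filtering before this step, and sub-sample interpolation in practice). The synthetic signals and shift are placeholders.

import numpy as np

rng = np.random.default_rng(6)
n, true_shift = 512, 7
rf_pre = rng.normal(size=n)                                         # pre-compression "RF" segment (placeholder)
rf_post = np.roll(rf_pre, true_shift) + 0.05 * rng.normal(size=n)   # shifted, noisy post-compression segment

xcorr = np.correlate(rf_post, rf_pre, mode="full")                  # lags run from -(n-1) to n-1
lags = np.arange(-(n - 1), n)
print("estimated shift:", lags[int(np.argmax(xcorr))], "samples (true:", true_shift, ")")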

  12. Planck 2013 results. XV. CMB power spectra and likelihood

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Gaier, T. C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Laureijs, R. J.; Lawrence, C. R.; Le Jeune, M.; Leach, S.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Orieux, F.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. 
A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-11-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best estimate of the CMB angular power spectrum from Planck over three decades in multipole moment, ℓ, covering 2 ≤ ℓ ≤ 2500. The main source of uncertainty at ℓ ≲ 1500 is cosmic variance. Uncertainties in small-scale foreground modelling and instrumental noise dominate the error budget at higher ℓs. For ℓ < 50, our likelihood exploits all Planck frequency channels from 30 to 353 GHz, separating the cosmological CMB signal from diffuse Galactic foregrounds through a physically motivated Bayesian component separation technique. At ℓ ≥ 50, we employ a correlated Gaussian likelihood approximation based on a fine-grained set of angular cross-spectra derived from multiple detector combinations between the 100, 143, and 217 GHz frequency channels, marginalising over power spectrum foreground templates. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on the final cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals below a few μK2 at ℓ ≲ 1000, in agreement with estimated calibration uncertainties. We compare our results with foreground-cleaned CMB maps derived from all Planck frequencies, as well as with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. We further show that the best-fit ΛCDM cosmology is in excellent agreement with preliminary PlanckEE and TE polarisation spectra. We find that the standard ΛCDM cosmology is well constrained by Planck from the measurements at ℓ ≲ 1500. One specific example is the

  13. Maximum Likelihood Bayesian Averaging of Spatial Variability Models in Unsaturated Fractured Tuff

    SciTech Connect

    Ye, Ming; Neuman, Shlomo P.; Meyer, Philip D.

    2004-05-25

    Hydrologic analyses typically rely on a single conceptual-mathematical model. Yet hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Adopting only one of these may lead to statistical bias and underestimation of uncertainty. Bayesian Model Averaging (BMA) provides an optimal way to combine the predictions of several competing models and to assess their joint predictive uncertainty. However, it tends to be computationally demanding and relies heavily on prior information about model parameters. We apply a maximum likelihood (ML) version of BMA (MLBMA) to seven alternative variogram models of log air permeability data from single-hole pneumatic injection tests in six boreholes at the Apache Leap Research Site (ALRS) in central Arizona. Unbiased ML estimates of variogram and drift parameters are obtained using Adjoint State Maximum Likelihood Cross Validation in conjunction with Universal Kriging and Generalized Least Squares. Standard information criteria provide an ambiguous ranking of the models, which does not justify selecting one of them and discarding all others as is commonly done in practice. Instead, we eliminate some of the models based on their negligibly small posterior probabilities and use the rest to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. We then average these four projections, and associated kriging variances, using the posterior probability of each model as weight. Finally, we cross-validate the results by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of MLBMA with that of each individual model. We find that MLBMA is superior to any individual geostatistical model of log permeability among those we consider at the ALRS.
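
    The averaging step itself is simple to sketch: weight each retained model's prediction and variance by its posterior model probability and add the between-model spread. The information-criterion differences and kriging outputs below are illustrative numbers, not values from the study.

import numpy as np

delta_ic = np.array([0.0, 1.2, 2.7, 4.1])          # information-criterion differences for 4 retained models (illustrative)
w = np.exp(-0.5 * delta_ic)
w /= w.sum()                                       # posterior model probabilities

mu  = np.array([[-13.2, -12.8, -13.0, -13.1],      # per-model kriging predictions at 2 locations
                [-14.0, -13.6, -13.9, -14.2]])     # (log permeability, placeholder values)
var = np.array([[0.20, 0.25, 0.22, 0.30],
                [0.35, 0.28, 0.33, 0.40]])         # per-model kriging variances

mean_bma = mu @ w
var_bma  = var @ w + (mu**2) @ w - mean_bma**2     # within-model plus between-model variance
print(mean_bma, var_bma)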

  14. Reviews Equipment: Data logger Book: Imagined Worlds Equipment: Mini data loggers Equipment: PICAXE-18M2 data logger Books: Engineering: A Very Short Introduction and To Engineer Is Human Book: Soap, Science, & Flat-Screen TVs Equipment: uLog and SensorLab Web Watch

    NASA Astrophysics Data System (ADS)

    2012-07-01

    WE RECOMMEND Data logger Fourier NOVA LINK: data logging and analysis To Engineer is Human Engineering: essays and insights Soap, Science, & Flat-Screen TVs People, politics, business and science overlap uLog sensors and sensor adapter A new addition to the LogIT range offers simplicity and ease of use WORTH A LOOK Imagined Worlds Socio-scientific predictions for the future Mini light data logger and mini temperature data logger Small-scale equipment for schools SensorLab Plus LogIT's supporting software, with extra features HANDLE WITH CARE CAXE110P PICAXE-18M2 data logger Data logger 'on view' but disappoints Engineering: A Very Short Introduction A broad-brush treatment fails to satisfy WEB WATCH Two very different websites for students: advanced physics questions answered and a more general BBC science resource

  15. Leak checker data logging system

    SciTech Connect

    Payne, J.J.; Gannon, J.C.

    1994-12-31

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph reading can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time.

  16. Leak checker data logging system

    DOEpatents

    Gannon, J.C.; Payne, J.J.

    1996-09-03

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph reading can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time. 18 figs.

  17. Leak checker data logging system

    DOEpatents

    Gannon, Jeffrey C.; Payne, John J.

    1996-01-01

    A portable, high speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph reading can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between testing to permit repetitive testing of vacuum integrity in reduced time.

  18. Activation Likelihood Estimation meta-analysis revisited

    PubMed Central

    Eickhoff, Simon B.; Bzdok, Danilo; Laird, Angela R.; Kurth, Florian; Fox, Peter T.

    2011-01-01

    A widely used technique for coordinate-based meta-analysis of neuroimaging data is activation likelihood estimation (ALE), which determines the convergence of foci reported from different experiments. ALE analysis involves modelling these foci as probability distributions whose width is based on empirical estimates of the spatial uncertainty due to the between-subject and between-template variability of neuroimaging data. ALE results are assessed against a null-distribution of random spatial association between experiments, resulting in random-effects inference. In the present revision of this algorithm, we address two remaining drawbacks of the previous algorithm. First, the assessment of spatial association between experiments was based on a highly time-consuming permutation test, which nevertheless entailed the danger of underestimating the right tail of the null-distribution. In this report, we outline how this previous approach may be replaced by a faster and more precise analytical method. Second, the previously applied correction procedure, i.e. controlling the false discovery rate (FDR), is supplemented by new approaches for correcting the family-wise error rate and the cluster-level significance. The different alternatives for drawing inference on meta-analytic results are evaluated on an exemplary dataset on face perception as well as discussed with respect to their methodological limitations and advantages. In summary, we thus replaced the previous permutation algorithm with a faster and more rigorous analytical solution for the null-distribution and comprehensively address the issue of multiple-comparison corrections. The proposed revision of the ALE-algorithm should provide an improved tool for conducting coordinate-based meta-analyses on functional imaging data. PMID:21963913
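
    A miniature sketch of the ALE combination rule: build a per-experiment modelled-activation map by Gaussian-blurring its foci, then take the voxelwise probabilistic union across experiments, ALE = 1 - prod_i(1 - MA_i). The grid, foci, smoothing width, and the max-normalisation shortcut are placeholders, not the calibrated spatial-uncertainty model or the inference procedures described above.

import numpy as np
from scipy.ndimage import gaussian_filter

shape = (20, 20, 20)
experiments = [
    [(5, 5, 5), (10, 10, 10)],       # foci reported by experiment 1 (voxel coordinates, placeholder)
    [(6, 5, 5)],                     # experiment 2
    [(15, 15, 14), (5, 6, 5)],       # experiment 3
]

def ma_map(foci, sigma=1.5):
    img = np.zeros(shape)
    for f in foci:
        img[f] = 1.0
    ma = gaussian_filter(img, sigma)
    return ma / ma.max()             # crude rescaling to a probability-like map (simplification)

ale = 1.0 - np.prod([1.0 - ma_map(f) for f in experiments], axis=0)
print("peak ALE value %.3f at voxel %s" % (ale.max(), np.unravel_index(ale.argmax(), shape)))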

  19. Maximum likelihood molecular clock comb: analytic solutions.

    PubMed

    Chor, Benny; Khetan, Amit; Snir, Sagi

    2006-04-01

    Maximum likelihood (ML) is increasingly used as an optimality criterion for selecting evolutionary trees, but finding the global optimum is a hard computational task. Because no general analytic solution is known, numeric techniques such as hill climbing or expectation maximization (EM) are used in order to find optimal parameters for a given tree. So far, analytic solutions were derived only for the simplest model--three taxa, two state characters, under a molecular clock. Four taxa rooted trees have two topologies--the fork (two subtrees with two leaves each) and the comb (one subtree with three leaves, the other with a single leaf). In a previous work, we devised a closed form analytic solution for the ML molecular clock fork. In this work, we extend the state of the art in the area of analytic solutions for ML trees to the family of all four taxa trees under the molecular clock assumption. The change from the fork topology to the comb incurs a major increase in the complexity of the underlying algebraic system and requires novel techniques and approaches. We combine the ultrametric properties of molecular clock trees with the Hadamard conjugation to derive a number of topology dependent identities. Employing these identities, we substantially simplify the system of polynomial equations. We finally use tools from algebraic geometry (e.g., Gröbner bases, ideal saturation, resultants) and employ symbolic algebra software to obtain analytic solutions for the comb. We show that in contrast to the fork, the comb has no closed form solutions (expressed by radicals in the input data). In general, four taxa trees can have multiple ML points. In contrast, we can now prove that under the molecular clock assumption, the comb has a unique (local and global) ML point. (Such uniqueness was previously shown for the fork.)
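
    As a toy illustration of the algebraic machinery mentioned above (Gröbner bases turning a polynomial system into a triangular, solvable form), the snippet below runs sympy on a two-equation system. The system is invented for illustration and is unrelated to the molecular-clock likelihood equations themselves.

from sympy import symbols, groebner, solve

x, y = symbols('x y')
system = [x**2 + y**2 - 1, x - y**2]           # toy polynomial system (not the ML equations)
G = groebner(system, x, y, order='lex')        # lex order gives a triangular, elimination form
print(G)                                       # contains a polynomial in y alone
print(solve(system, [x, y]))                   # back-substitution recovers all solutions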

  20. Four decades of forest persistence, clearance and logging on Borneo.

    PubMed

    Gaveau, David L A; Sloan, Sean; Molidena, Elis; Yaen, Husna; Sheil, Doug; Abram, Nicola K; Ancrenaz, Marc; Nasi, Robert; Quinones, Marcela; Wielaard, Niels; Meijaard, Erik

    2014-01-01

    The native forests of Borneo have been impacted by selective logging, fire, and conversion to plantations at unprecedented scales since industrial-scale extractive industries began in the early 1970s. There is no island-wide documentation of forest clearance or logging since the 1970s. This creates an information gap for conservation planning, especially with regard to selectively logged forests that maintain high conservation potential. Analysing LANDSAT images, we estimate that 75.7% (558,060 km2) of Borneo's area (737,188 km2) was forested around 1973. Based upon a forest cover map for 2010 derived using ALOS-PALSAR and visually reviewing LANDSAT images, we estimate that the 1973 forest area had declined by 168,493 km2 (30.2%) in 2010. The highest losses were recorded in Sabah and Kalimantan with 39.5% and 30.7% of their total forest area in 1973 becoming non-forest in 2010, and the lowest in Brunei and Sarawak (8.4%, and 23.1%). We estimate that the combined area planted in industrial oil palm and timber plantations in 2010 was 75,480 km2, representing 10% of Borneo. We mapped 271,819 km of primary logging roads that were created between 1973 and 2010. The greatest density of logging roads was found in Sarawak, at 0.89 km km-2, and the lowest density in Brunei, at 0.18 km km-2. Analyzing MODIS-based tree cover maps, we estimate that logging operated within 700 m of primary logging roads. Using this distance, we estimate that 266,257 km2 of 1973 forest cover has been logged. With 389,566 km2 (52.8%) of the island remaining forested, of which 209,649 km2 remains intact, there is still hope for biodiversity conservation in Borneo. Protecting logged forests from fire and conversion to plantations is an urgent priority for reducing rates of deforestation in Borneo. PMID:25029192

  1. Four decades of forest persistence, clearance and logging on Borneo.

    PubMed

    Gaveau, David L A; Sloan, Sean; Molidena, Elis; Yaen, Husna; Sheil, Doug; Abram, Nicola K; Ancrenaz, Marc; Nasi, Robert; Quinones, Marcela; Wielaard, Niels; Meijaard, Erik

    2014-01-01

    The native forests of Borneo have been impacted by selective logging, fire, and conversion to plantations at unprecedented scales since industrial-scale extractive industries began in the early 1970s. There is no island-wide documentation of forest clearance or logging since the 1970s. This creates an information gap for conservation planning, especially with regard to selectively logged forests that maintain high conservation potential. Analysing LANDSAT images, we estimate that 75.7% (558,060 km2) of Borneo's area (737,188 km2) was forested around 1973. Based upon a forest cover map for 2010 derived using ALOS-PALSAR and visually reviewing LANDSAT images, we estimate that the 1973 forest area had declined by 168,493 km2 (30.2%) in 2010. The highest losses were recorded in Sabah and Kalimantan with 39.5% and 30.7% of their total forest area in 1973 becoming non-forest in 2010, and the lowest in Brunei and Sarawak (8.4%, and 23.1%). We estimate that the combined area planted in industrial oil palm and timber plantations in 2010 was 75,480 km2, representing 10% of Borneo. We mapped 271,819 km of primary logging roads that were created between 1973 and 2010. The greatest density of logging roads was found in Sarawak, at 0.89 km km-2, and the lowest density in Brunei, at 0.18 km km-2. Analyzing MODIS-based tree cover maps, we estimate that logging operated within 700 m of primary logging roads. Using this distance, we estimate that 266,257 km2 of 1973 forest cover has been logged. With 389,566 km2 (52.8%) of the island remaining forested, of which 209,649 km2 remains intact, there is still hope for biodiversity conservation in Borneo. Protecting logged forests from fire and conversion to plantations is an urgent priority for reducing rates of deforestation in Borneo.

  2. Four Decades of Forest Persistence, Clearance and Logging on Borneo

    PubMed Central

    Gaveau, David L. A.; Sloan, Sean; Molidena, Elis; Yaen, Husna; Sheil, Doug; Abram, Nicola K.; Ancrenaz, Marc; Nasi, Robert; Quinones, Marcela; Wielaard, Niels; Meijaard, Erik

    2014-01-01

    The native forests of Borneo have been impacted by selective logging, fire, and conversion to plantations at unprecedented scales since industrial-scale extractive industries began in the early 1970s. There is no island-wide documentation of forest clearance or logging since the 1970s. This creates an information gap for conservation planning, especially with regard to selectively logged forests that maintain high conservation potential. Analysing LANDSAT images, we estimate that 75.7% (558,060 km2) of Borneo's area (737,188 km2) was forested around 1973. Based upon a forest cover map for 2010 derived using ALOS-PALSAR and visually reviewing LANDSAT images, we estimate that the 1973 forest area had declined by 168,493 km2 (30.2%) in 2010. The highest losses were recorded in Sabah and Kalimantan with 39.5% and 30.7% of their total forest area in 1973 becoming non-forest in 2010, and the lowest in Brunei and Sarawak (8.4%, and 23.1%). We estimate that the combined area planted in industrial oil palm and timber plantations in 2010 was 75,480 km2, representing 10% of Borneo. We mapped 271,819 km of primary logging roads that were created between 1973 and 2010. The greatest density of logging roads was found in Sarawak, at 0.89 km km−2, and the lowest density in Brunei, at 0.18 km km−2. Analyzing MODIS-based tree cover maps, we estimate that logging operated within 700 m of primary logging roads. Using this distance, we estimate that 266,257 km2 of 1973 forest cover has been logged. With 389,566 km2 (52.8%) of the island remaining forested, of which 209,649 km2 remains intact, there is still hope for biodiversity conservation in Borneo. Protecting logged forests from fire and conversion to plantations is an urgent priority for reducing rates of deforestation in Borneo. PMID:25029192

  3. Correlating Log Messages for System Diagnostics

    SciTech Connect

    Gunasekaran, Raghul; Dillow, David A; Shipman, Galen M; Maxwell, Don E; Hill, Jason J; Park, Byung H; Geist, Al

    2010-01-01

    In large-scale computing systems, the sheer volume of log data generated presents daunting challenges for debugging and monitoring of these systems. The Oak Ridge Leadership Computing Facility's premier simulation platform, the Cray XT5 known as Jaguar, can generate a few hundred thousand log entries in less than a minute for many system level events. Determining the root cause of such system events requires analysis and interpretation of a large number of log messages. Most often, the log messages are best understood when they are interpreted collectively rather than individually. In this paper, we present our approach to interpreting log messages by identifying their commonalities and grouping them into clusters. Given a set of log messages within a time interval, we group the messages based on source, target, and/or error type, and correlate the messages with hardware and application information. We monitor the Lustre log messages in the XT5 console log and show that such grouping of log messages assists in detecting the source of system events. By intelligent grouping and correlation of events in the log, we are able to provide system administrators with meaningful information in a concise format for root cause analysis.
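
    The grouping idea can be sketched in a few lines: within a time window, bucket messages by (source, target, error type) and report the clusters by size. The regular expression and sample lines below are invented placeholders, not the actual Cray XT5 console-log format.

import re
from collections import defaultdict

LINE_RE = re.compile(r"^(?P<src>\S+)\s+(?P<target>\S+)\s+(?P<etype>[A-Z_]+):")

log_lines = [
    "nid00012 ost0003 LUSTRE_TIMEOUT: request timed out",
    "nid00012 ost0003 LUSTRE_TIMEOUT: request timed out",
    "nid00047 ost0003 LUSTRE_TIMEOUT: request timed out",
    "nid00101 mdt0000 EVICTION: client evicted",
]

clusters = defaultdict(list)
for line in log_lines:
    m = LINE_RE.match(line)
    if m:
        clusters[(m["src"], m["target"], m["etype"])].append(line)

# Report clusters, largest first; each key is a (source, target, error type) triple.
for key, members in sorted(clusters.items(), key=lambda kv: -len(kv[1])):
    print(len(members), key)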

  4. Web-based guided insulin self-titration in patients with type 2 diabetes: the Di@log study. Design of a cluster randomised controlled trial [TC1316

    PubMed Central

    Roek, Mariëlle GA; Welschen, Laura MC; Kostense, Piet J; Dekker, Jacqueline M; Snoek, Frank J; Nijpels, Giel

    2009-01-01

    Background Many patients with type 2 diabetes (T2DM) are not able to reach the glycaemic target level of HbA1c < 7.0%, and therefore are at increased risk of developing severe complications. Transition to insulin therapy is one of the obstacles in diabetes management, because of barriers of both patient and health care providers. Patient empowerment, a patient-centred approach, is vital for improving diabetes management. We developed a web-based self-management programme for insulin titration in T2DM patients. The aim of our study is to investigate if this internet programme helps to improve glycaemic control more effectively than usual care. Methods/Design T2DM patients (n = 248), aged 35–75 years, with an HbA1c ≥ 7.0%, eligible for treatment with insulin and able to use the internet will be selected from general practices in two different regions in the Netherlands. Cluster randomisation will be performed at the level of general practices. Patients in the intervention group will use a self-developed internet programme to assist them in self-titrating insulin. The control group will receive usual care. Primary outcome is the difference in change in HbA1c between intervention and control group. Secondary outcome measures are quality of life, treatment satisfaction, diabetes self-efficacy and frequency of hypoglycaemic episodes. Results will be analysed according to the intention-to-treat principle. Discussion An internet intervention supporting self-titration of insulin therapy in T2DM patients is an innovative patient-centred intervention. The programme provides guided self-monitoring and evaluation of health and self-care behaviours through tailored feedback on input of glucose values. This is expected to result in a better performance of self-titration of insulin and consequently in the improvement of glycaemic control. The patient will be enabled to 'discover and use his or her own ability to gain mastery over his/her diabetes' and therefore patient

  5. Dual excitation acoustic paramagnetic logging tool

    DOEpatents

    Vail, III, William B.

    1989-01-01

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described.

  6. Dual excitation acoustic paramagnetic logging tool

    DOEpatents

    Vail, W.B. III.

    1989-02-14

    New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described. 6 figs.

  7. Utilizing magnetic resonance imaging logs, openhole logs, and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.A.; Morganti, J.K.; White, H.J. ); Noblett, B.R. )

    1996-01-01

    Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  8. Utilizing magnetic resonance imaging logs, openhole logs, and sidewall core analyses to evaluate shaly sands for water-free production

    SciTech Connect

    Taylor, D.A.; Morganti, J.K.; White, H.J.; Noblett, B.R.

    1996-12-31

    Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.

  9. VizieR Online Data Catalog: CoRoT observation log (N2-4.4) (CoRoT 2016)

    NASA Astrophysics Data System (ADS)

    COROT Team

    2014-03-01

    CoRoT, a space astronomy mission, has measured photometric micro-variability of stars from minutes to months (up to 150 days) with a high duty cycle (more than 90%). The mission was led by CNES in association with four French laboratories and 7 participating countries and agencies (Austria, Belgium, Brazil, Germany, Spain, and the ESA Science Programme). The satellite was composed of a PROTEUS platform (the 3rd in the series) and a unique instrument: a stellar rapid photometer. It was launched on December 27th 2006 by a Soyuz Rocket, from Baikonur. The mission has lasted almost 6 years (the nominal 3-year duration and a 3-year extension) and has observed more than 160 000 stars. It stopped sending data on November 2nd 2012. Two regions of the sky were accessible for long periods of time: circles of 10 degrees centered on the equator around alpha=06:50 and alpha=18:50. They were called the CoRoT eyes: the "anticenter" and the "center eye" (as they are approximately in these directions). Each pointing covers 1.4x2.8 square degrees within one of those CoRoT eyes. The original scientific objectives were focussed on the study of stellar pulsations (asteroseismology) to probe the internal structure of stars, and the detection of small exoplanets through their "transit" in front of their host star, and the measurement of their size. This led to two modes of observation, working simultaneously: - The "bright star" mode dedicated to very precise seismology of a small sample of bright and nearby stars - The "faint star" mode, observing a very large number of stars at the same time, to detect transits, which are rare events, as they imply the alignment of the star, the planet and the observer. The large amount of data gathered in this mode turned out to be extremely fruitful for many topics of stellar physics. Beyond these two initial objectives, CoRoT data revealed stellar variability associated with various other phenomena: granulation, rotational modulation by

  10. VizieR Online Data Catalog: CoRoT observation log (N2-4.4) (CoRoT 2016)

    NASA Astrophysics Data System (ADS)

    COROT Team

    2014-03-01

    CoRoT, a space astronomy mission, has measured photometric micro-variability of stars from minutes to months (up to 150 days) with a high duty cycle (more than 90%). The mission was led by CNES in association with four French laboratories and 7 participating countries and agencies (Austria, Belgium, Brazil, Germany, Spain, and the ESA Science Programme). The satellite was composed of a PROTEUS platform (the 3rd in the series) and a unique instrument: a stellar rapid photometer. It was launched on December 27th 2006 by a Soyuz Rocket, from Baikonur. The mission has lasted almost 6 years (the nominal 3-year duration and a 3-year extension) and has observed more than 160 000 stars. It stopped sending data on November 2nd 2012. Two regions of the sky were accessible for long periods of time: circles of 10 degrees centered on the equator around alpha=06:50 and alpha=18:50. They were called the CoRoT eyes: the "anticenter" and the "center eye" (as they are approximately in these directions). Each pointing covers 1.4x2.8 square degrees within one of those CoRoT eyes. The original scientific objectives were focussed on the study of stellar pulsations (asteroseismology) to probe the internal structure of stars, and the detection of small exoplanets through their "transit" in front of their host star, and the measurement of their size. This led to two modes of observation, working simultaneously: - The "bright star" mode dedicated to very precise seismology of a small sample of bright and nearby stars - The "faint star" mode, observing a very large number of stars at the same time, to detect transits, which are rare events, as they imply the alignment of the star, the planet and the observer. The large amount of data gathered in this mode turned out to be extremely fruitful for many topics of stellar physics. Beyond these two initial objectives, CoRoT data revealed stellar variability associated with various other phenomena: granulation, rotational modulation

  11. Flow rate logging seepage meter

    NASA Technical Reports Server (NTRS)

    Reay, William G. (Inventor); Walthall, Harry G. (Inventor)

    1996-01-01

    An apparatus for remotely measuring and logging the flow rate of groundwater seepage into surface water bodies. As groundwater seeps into a cavity created by a bottomless housing, it displaces water through an inlet and into a waterproof sealed upper compartment, at which point the water is collected by a collection bag, which is contained in a bag chamber. A magnet on the collection bag approaches a proximity switch as the collection bag fills, and eventually enables the proximity switch to activate a control circuit. The control circuit then rotates a three-way valve from the collection path to a discharge path, enables a data logger to record the time, and enables a pump, which discharges the water from the collection bag, through the three-way valve and pump, and into the sea. As the collection bag empties, the magnet leaves the proximity of the proximity switch, and the control circuit turns off the pump, resets the valve to provide a collection path, and restarts the collection cycle.

  12. Optimal message log reclamation for independent checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. Kent

    1993-01-01

    Independent (uncoordinated) checkpointing for parallel and distributed systems allows maximum process autonomy but suffers from possible domino effects and the associated storage space overhead for maintaining multiple checkpoints and message logs. In most research on checkpointing and recovery, it was assumed that only the checkpoints and message logs older than the global recovery line can be discarded. It is shown how recovery line transformation and decomposition can be applied to the problem of efficiently identifying all discardable message logs, thereby achieving optimal garbage collection. Communication trace-driven simulation for several parallel programs is used to show the benefits of the proposed algorithm for message log reclamation.

  13. Log-Derived evaluation of shaly sandstone reservoirs

    SciTech Connect

    Fertl, W.H.

    1984-04-01

    Significant natural gas resources are known to exist in the United States in tight, low-permeability sandstones that cover a prospective area of 1,000,000 mi² (2,590,000 km²). Characterization and reliable estimation of their production potential based on well logs are important although difficult tasks. Proper evaluation of low-permeability sands based on conventional log-interpretation techniques is frequently inadequate. Furthermore, while empirical rules of thumb assist in the evaluation of localized conditions, they only provide guidelines. Recent developments in quantitative log-analysis techniques incorporate natural-gamma-ray spectral data and application of the Waxman-Smits model for detailed reservoir description. Quantitative correlations of cation exchange capacity (CEC), water salinity, porosity, and conductivity of water- and hydrocarbon-bearing shaly sand reservoirs are based on resistivity, density, neutron and natural-gamma-ray spectral data. These correlations provide important information about clay volume, reservoir porosities (total, effective) and fluid-saturation distribution (total, effective), type of clay minerals (smectite, illite, chlorite/kaolinite), their distribution in the reservoir (dispersed, laminated, structural), and log-derived indicators of potential formation damage. Field experiences are reviewed for logging and evaluating tight formations in south Texas; the Jurassic Cotton Valley trend in east Texas, Louisiana, and Arkansas; and the Tertiary Fort Union and Cretaceous Mesaverde Formations of the Piceance basin in Colorado.

  14. Interpretation of borehole geophysical logs, aquifer-isolation tests, and water quality, supply wells 1 and 2, Willow Grove Naval Air Station/Joint Reserve Base, Horsham Township, Montgomery County, Pennsylvania

    USGS Publications Warehouse

    Sloto, Ronald A.; Goode, Daniel J.; Frasch, Steven M.

    2002-01-01

    Ground water pumped from supply wells 1 and 2 on the Willow Grove Naval Air Station/Joint Reserve Base (NAS/JRB) provides water for use at the base, including potable water for drinking. The supply wells have been contaminated by volatile organic compounds (VOCs), particularly trichloroethylene (TCE) and tetrachloroethylene (PCE), and the water is treated to remove the VOCs. The Willow Grove NAS/JRB and surrounding area are underlain by sedimentary rocks of the Triassic-age Stockton Formation, which form a complex, heterogeneous aquifer. The ground-water-flow system for the supply wells was characterized by use of borehole geophysical logs and heatpulse-flowmeter measurements. The heatpulse-flowmeter measurements showed upward and downward borehole flow under nonpumping conditions in both wells. The hydraulic and chemical properties of discrete water-bearing fractures in the supply wells were characterized by isolating each water-bearing fracture with straddle packers. Eight fractures in supply well 1 and five fractures in supply well 2 were selected for testing on the basis of the borehole geophysical logs and borehole television surveys. Water samples were collected from each isolated fracture and analyzed for VOCs and inorganic constituents. Fractures at 50-59, 79-80, 196, 124-152, 182, 241, 256, and 350-354 ft btoc (feet below top of casing) were isolated in supply well 1. Specific capacities ranged from 0.26 to 5.7 (gal/min)/ft (gallons per minute per foot) of drawdown. The highest specific capacity was for the fracture isolated at 179.8-188 ft btoc. Specific capacity and depth of fracture were not related in either supply well. The highest concentrations of PCE were in water samples collected from fractures isolated at 236.8-245 and 249.8-258 ft btoc, which are hydraulically connected. The concentration of PCE generally increased with depth to a maximum of 39 µg/L (micrograms per liter) at a depth of 249.8-258 ft btoc and then decreased to 21 µg/L at a

  15. Geological and petrophysical characterization of the Ferron Sandstone for 3-D simulation of a fluvial-deltaic reservoir. Deliverable 2.5.4, Ferron Sandstone lithologic strip logs, Emery & Sevier Counties, Utah: Volume I

    SciTech Connect

    Allison, M.L.

    1995-12-08

    Strip logs for 491 wells were produced from a digital subsurface database of lithologic descriptions of the Ferron Sandstone Member of the Mancos Shale. This subsurface database covers wells from the parts of Emery and Sevier Counties in central Utah that occur between Ferron Creek on the north and Last Chance Creek on the south. The lithologic descriptions were imported into a logging software application designed for the display of stratigraphic data. Strip logs were produced at a scale of one inch equals 20 feet. The strip logs were created as part of a study by the Utah Geological Survey to develop a comprehensive, interdisciplinary, and qualitative characterization of a fluvial-deltaic reservoir using the Ferron Sandstone as a surface analogue. The study was funded by the U.S. Department of Energy (DOE) under the Geoscience/Engineering Reservoir Characterization Program.

  16. Alternative methods for estimating common descriptors for QSAR studies of dyes and fluorescent probes using molecular modeling software. 2. Correlations between log P and the hydrophilic/lipophilic index, and new methods for estimating degrees of amphiphilicity.

    PubMed

    Dapson, Richard W; Horobin, Richard W

    2013-11-01

    The log P descriptor, despite its usefulness, can be difficult to use, especially for researchers lacking skills in physical chemistry. Moreover, this classic measure has been determined in numerous ways, which can result in inconsistent estimates of log P values, especially for relatively complex molecules such as fluorescent probes. Novel measures of hydrophilicity/lipophilicity (the Hydrophilic/Lipophilic Index, HLI) and amphiphilicity (hydrophilic/lipophilic indices for the tail and head group, HLIT and HLIHG, respectively) therefore have been devised. We compare these descriptors with measures based on log P, the standard method for quantitative structure activity relationship (QSAR) studies. HLI can be determined using widely available molecular modeling software, coupled with simple arithmetic calculations. It is based on partial atomic charges and is intended to be a stand-alone measure of hydrophilicity/lipophilicity. Given the wide application of log P, however, we investigated the correlation between HLI and log P using a test set of 56 fluorescent probes of widely different physicochemical character. Overall correlation was poor; however, correlation of HLI and log P for probes of narrowly specified charge types, i.e., non-ionic compounds, anions, conjugated cations, or zwitterions, was excellent. Values for probes with additional nonconjugated quaternary cations, however, were less well correlated. The newly devised HLI can be divided into domain-specific descriptors, HLIT and HLIHG, in amphiphilic probes. Determinations of amphiphilicity, made independently by the authors using their respective methods, showed excellent agreement. Quantifying amphiphilicity from partial log P values of the head group (head group hydrophilicity; HGH) and tail (amphiphilicity index; AI) has proved useful for understanding fluorescent probe action. The same limitations of log P apply to HGH and AI, however. The novel descriptors, HLIT and HLIHG, offer analogous advantages

  17. Calibration of two complex ecosystem models with different likelihood functions

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán

    2014-05-01

    The biosphere is a sensitive carbon reservoir. Terrestrial ecosystems were approximately carbon neutral during the past centuries, but they became net carbon sinks due to climate-change-induced environmental change and the associated CO2 fertilization effect of the atmosphere. Model studies and measurements indicate that the biospheric carbon sink can saturate in the future due to ongoing climate change, which can act as a positive feedback. Robustness of carbon cycle models is a key issue when trying to choose the appropriate model for decision support. The input parameters of process-based models are decisive regarding the model output. At the same time, there are several input parameters for which accurate values are hard to obtain directly from experiments or for which no local measurements are available. Due to the uncertainty associated with the unknown model parameters, significant bias can be experienced if the model is used to simulate the carbon and nitrogen cycle components of different ecosystems. In order to improve model performance, the unknown model parameters have to be estimated. We developed a multi-objective, two-step calibration method based on a Bayesian approach in order to estimate the unknown parameters of the PaSim and Biome-BGC models. Biome-BGC and PaSim are widely used biogeochemical models that simulate the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of terrestrial ecosystems (in this research the developed version of Biome-BGC, referred to as BBGC MuSo, is used). Both models were calibrated regardless of the simulated processes and types of model parameters. The calibration procedure is based on the comparison of measured data with simulated results via calculating a likelihood function (degree of goodness-of-fit between simulated and measured data). In our research, different likelihood function formulations were used in order to examine the effect of the different model
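
    As a rough illustration of the calibration idea described above (scoring a parameter set by the goodness-of-fit between simulated and measured series), the following minimal Python sketch computes one common informal likelihood measure, the Nash-Sutcliffe efficiency. The data and function name are illustrative assumptions and are not taken from the PaSim or Biome-BGC codebases.

        import numpy as np

        def nash_sutcliffe(simulated, observed):
            """Informal likelihood measure: Nash-Sutcliffe efficiency (1.0 = perfect fit)."""
            simulated = np.asarray(simulated, dtype=float)
            observed = np.asarray(observed, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        # Toy usage: rank two candidate parameter sets by their fit to the observations.
        obs = np.array([2.1, 2.9, 3.8, 5.2, 6.9])
        sim_a = np.array([2.0, 3.0, 4.0, 5.0, 7.0])
        sim_b = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        print(nash_sutcliffe(sim_a, obs), nash_sutcliffe(sim_b, obs))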

  18. Tolerance bounds for log gamma regression models

    NASA Technical Reports Server (NTRS)

    Jones, R. A.; Scholz, F. W.; Ossiander, M.; Shorack, G. R.

    1985-01-01

    The present procedure for finding lower confidence bounds for the quantiles of Weibull populations, on the basis of the solution of a quadratic equation, is more accurate than current Monte Carlo tables and extends to any location-scale family. It is shown that this method is accurate for all members of the log gamma(K) family, where K = 1/2 to infinity, and works well for censored data, while also extending to regression data. An even more accurate procedure involving an approximation to the Lawless (1982) conditional procedure, with numerical integrations whose tables are independent of the data, is also presented. These methods are applied to the case of failure strengths of ceramic specimens from each of three billets of Si3N4, which have undergone flexural strength testing.

  19. Learning Logs in Introductory Literature Courses

    ERIC Educational Resources Information Center

    Babcock, Matthew James

    2007-01-01

    This article explores the beneficial and sometimes unpredictable implications of a daily reflective writing exercise for introductory literature courses: the learning log. Pseudonymous samples of student writing, coupled with instructor commentary, gesture toward ways in which the learning log's continual implementation and modification foster a…

  20. Measuring Reading Instruction with Teacher Logs

    ERIC Educational Resources Information Center

    Rowan, Brian; Correnti, Richard

    2009-01-01

    The authors argue that the criticisms of their earlier article on teacher logs ("Educational Researcher," March 2009) by Smagorinsky and Willis do not address, much less undermine, the evidence they presented as part of their validation argument about the teacher logs. Moreover, they argue that their method for studying classrooms is not nearly as…

  1. 10 CFR 34.71 - Utilization logs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall...

  2. Discover Presidential Log Cabins. Teacher's Discussion Guide.

    ERIC Educational Resources Information Center

    National Park Service (Dept. of Interior), Washington, DC.

    Discover Presidential Log Cabins is a set of materials designed to help educate 6-8 grade students about the significance of three log cabin sites occupied by George Washington, Ulysses Grant, Abraham Lincoln, and Theodore Roosevelt. This teacher's discussion guide is intended for use as part of a larger, comprehensive social studies program, and…

  3. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  4. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  5. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  6. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  7. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  8. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  9. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  10. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  11. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  12. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  13. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...

  14. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  15. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...

  16. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  17. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...

  18. Effect of formal and informal likelihood functions on uncertainty assessment in a single event rainfall-runoff model

    NASA Astrophysics Data System (ADS)

    Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran

    2016-09-01

    In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate the uncertainty of parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of the 24 parameters used in HEC-HMS, three flood events were used to calibrate and one flood event was used to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions (L1-L4), the Nash-Sutcliffe (NS) efficiency, normalized absolute error (NAE), index of agreement (IOA), and Chiew-McMahon efficiency (CM), are considered informal, whereas the remaining three (L5-L7) fall into the formal category. L5 builds on the relationship between traditional least-squares fitting and Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of the residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, the sensitivities of the parameters strongly depend on the likelihood function and vary between likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions have an almost similar effect on the sensitivity of the parameters. The 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, the results showed that the DREAM(ZS) algorithm performed better under the formal likelihood functions L5 and L7
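
    For readers unfamiliar with the formal category, the sketch below shows a minimal Gaussian log-likelihood in which the residual errors follow a first-order autoregressive model, in the spirit of L7. The error parameters phi and sigma, the toy data, and the function name are illustrative assumptions, not the exact formulation used in the study.

        import numpy as np

        def ar1_gaussian_loglik(observed, simulated, phi, sigma):
            """Formal log-likelihood with AR(1) residual errors.

            e_t = observed_t - simulated_t, e_t = phi * e_{t-1} + nu_t, nu_t ~ N(0, sigma^2).
            The first residual is conditioned on, for simplicity.
            """
            e = np.asarray(observed, dtype=float) - np.asarray(simulated, dtype=float)
            nu = e[1:] - phi * e[:-1]                      # whitened innovations
            n = nu.size
            return -0.5 * n * np.log(2.0 * np.pi * sigma**2) - np.sum(nu**2) / (2.0 * sigma**2)

        obs = np.array([1.2, 1.9, 3.1, 4.2, 4.8, 6.1])
        sim = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        print(ar1_gaussian_loglik(obs, sim, phi=0.3, sigma=0.25))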

  19. ROBUST MAXIMUM LIKELIHOOD ESTIMATION IN Q-SPACE MRI.

    PubMed

    Landman, B A; Farrell, J A D; Smith, S A; Calabresi, P A; van Zijl, P C M; Prince, J L

    2008-05-14

    Q-space imaging is an emerging diffusion weighted MR imaging technique to estimate molecular diffusion probability density functions (PDF's) without the need to assume a Gaussian distribution. We present a robust M-estimator, Q-space Estimation by Maximizing Rician Likelihood (QEMRL), for diffusion PDF's based on maximum likelihood. PDF's are modeled by constrained Gaussian mixtures. In QEMRL, robust likelihood measures mitigate the impacts of imaging artifacts. In simulation and in vivo human spinal cord, the method improves reliability of estimated PDF's and increases tissue contrast. QEMRL enables more detailed exploration of the PDF properties than prior approaches and may allow acquisitions at higher spatial resolution.
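
    The Rician probability density that such an estimator works with is standard; the following minimal Python sketch evaluates a Rician negative log-likelihood for a noise-free signal value and uses it in a toy one-parameter fit. It only illustrates the likelihood itself, not the QEMRL estimator or its constrained Gaussian-mixture PDF model; the data and noise level are hypothetical.

        import numpy as np
        from scipy.special import i0e
        from scipy.optimize import minimize_scalar

        def rician_neg_loglik(nu, measurements, sigma):
            """Negative log-likelihood of a noise-free signal nu under Rician noise.

            Rician pdf: p(m | nu, sigma) = m/sigma^2 * exp(-(m^2 + nu^2)/(2 sigma^2)) * I0(m nu / sigma^2).
            log I0 is evaluated via the exponentially scaled Bessel function for numerical stability.
            """
            m = np.asarray(measurements, dtype=float)
            z = m * nu / sigma**2
            log_i0 = np.log(i0e(z)) + z
            ll = np.log(m) - 2.0 * np.log(sigma) - (m**2 + nu**2) / (2.0 * sigma**2) + log_i0
            return -np.sum(ll)

        # Toy usage: the minimiser over nu is a maximum-likelihood signal estimate.
        data = np.array([3.1, 2.7, 3.5, 2.9, 3.3])
        fit = minimize_scalar(rician_neg_loglik, bounds=(0.01, 10.0), args=(data, 0.5), method="bounded")
        print(fit.x)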

  20. Designing and Piloting a Leadership Daily Practice Log: Using Logs to Study the Practice of Leadership

    ERIC Educational Resources Information Center

    Spillane, James P.; Zuberi, Anita

    2009-01-01

    Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…

  1. Maximum-likelihood joint image reconstruction and motion estimation with misaligned attenuation in TOF-PET/CT

    NASA Astrophysics Data System (ADS)

    Bousse, Alexandre; Bertolli, Ottavia; Atkinson, David; Arridge, Simon; Ourselin, Sébastien; Hutton, Brian F.; Thielemans, Kris

    2016-02-01

    This work is an extension of our recent work on joint activity reconstruction/motion estimation (JRM) from positron emission tomography (PET) data. We performed JRM by maximization of the penalized log-likelihood in which the probabilistic model assumes that the same motion field affects both the activity distribution and the attenuation map. Our previous results showed that JRM can successfully reconstruct the activity distribution when the attenuation map is misaligned with the PET data, but converges slowly due to the significant cross-talk in the likelihood. In this paper, we utilize time-of-flight PET for JRM and demonstrate that the convergence speed is significantly improved compared to JRM with conventional PET data.

  2. Maximum-likelihood joint image reconstruction and motion estimation with misaligned attenuation in TOF-PET/CT.

    PubMed

    Bousse, Alexandre; Bertolli, Ottavia; Atkinson, David; Arridge, Simon; Ourselin, Sébastien; Hutton, Brian F; Thielemans, Kris

    2016-02-01

    This work is an extension of our recent work on joint activity reconstruction/motion estimation (JRM) from positron emission tomography (PET) data. We performed JRM by maximization of the penalized log-likelihood in which the probabilistic model assumes that the same motion field affects both the activity distribution and the attenuation map. Our previous results showed that JRM can successfully reconstruct the activity distribution when the attenuation map is misaligned with the PET data, but converges slowly due to the significant cross-talk in the likelihood. In this paper, we utilize time-of-flight PET for JRM and demonstrate that the convergence speed is significantly improved compared to JRM with conventional PET data.

  3. Log-Gabor Weber descriptor for face recognition

    NASA Astrophysics Data System (ADS)

    Li, Jing; Sang, Nong; Gao, Changxin

    2015-09-01

    The Log-Gabor transform, which is suitable for analyzing gradually changing data such as iris and face images, has been widely used in image processing, pattern recognition, and computer vision. In most cases, only the magnitude or phase information of the Log-Gabor transform is considered. However, the complementary effect of combining magnitude and phase information simultaneously for an image-feature extraction problem has not been systematically explored in existing works. We propose a local image descriptor for face recognition, called the Log-Gabor Weber descriptor (LGWD). The novelty of our LGWD is twofold: (1) to fully utilize the information from the magnitude and phase features of the multiscale, multi-orientation Log-Gabor transform, we apply the Weber local binary pattern operator to each transform response. (2) The encoded Log-Gabor magnitude and phase information are fused at the feature level by utilizing a kernel canonical correlation analysis strategy, considering that feature-level information fusion is effective when the modalities are correlated. Experimental results on the AR, Extended Yale B, and UMIST face databases, compared with those available from recent experiments reported in the literature, show that our descriptor yields better performance than state-of-the-art methods.

  4. Likelihood ratio meta-analysis: New motivation and approach for an old method.

    PubMed

    Dormuth, Colin R; Filion, Kristian B; Platt, Robert W

    2016-03-01

    A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate, and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed effect and random effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience.
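
    A minimal sketch of the pooling step, assuming a normal approximation to each study's likelihood for the log effect size: per-study log-likelihood-ratio curves are evaluated on a grid, summed, and the maximum and a support interval are read off. The study estimates, standard errors, and the 1/32 support threshold are illustrative assumptions, not values from the cited analyses.

        import numpy as np

        # Hypothetical per-study log effect estimates (e.g., log odds ratios) and standard errors.
        estimates = np.array([0.25, 0.10, 0.40])
        std_errs  = np.array([0.12, 0.20, 0.15])

        theta = np.linspace(-0.5, 1.0, 1501)              # grid of candidate effect sizes

        # Normal approximation to each study's log-likelihood, expressed relative to the null (theta = 0).
        log_lr = -((theta[None, :] - estimates[:, None]) ** 2) / (2.0 * std_errs[:, None] ** 2)
        log_lr -= -(estimates[:, None] ** 2) / (2.0 * std_errs[:, None] ** 2)

        combined = log_lr.sum(axis=0)                     # pooled evidence: sum of per-study LogLR curves
        theta_hat = theta[np.argmax(combined)]

        # A 1/32 support interval: all theta whose likelihood lies within log(32) of the maximum.
        inside = theta[combined >= combined.max() - np.log(32.0)]
        print(theta_hat, inside.min(), inside.max())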

  5. Logs of Paleoseismic Excavations Across the Central Range Fault, Trinidad

    USGS Publications Warehouse

    Crosby, Christopher J.; Prentice, Carol S.; Weber, John; Ragona, Daniel

    2009-01-01

    Table 1 are shown in red on the trench logs. All radiocarbon ages are calibrated and given with 2 standard deviation age ranges. Our studies suggest that the Central Range Fault is a Holocene fault capable of producing damaging earthquakes in Trinidad

  6. Weibull distribution based on maximum likelihood with interval inspection data

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.

    1985-01-01

    The two Weibull parameters are determined based upon the method of maximum likelihood. The test data used were failures observed at inspection intervals. The application was the reliability analysis of the SSME oxidizer turbine blades.
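
    A minimal sketch of a Weibull fit to interval inspection data: each failure is only known to have occurred between two inspection times, so its likelihood contribution is the difference of the Weibull CDF at the interval endpoints. The inspection times and starting values are hypothetical; this is not the SSME blade analysis.

        import numpy as np
        from scipy.optimize import minimize

        def weibull_cdf(t, shape, scale):
            return 1.0 - np.exp(-(t / scale) ** shape)

        def neg_loglik(params, lo, hi):
            """Interval-censored Weibull likelihood: each failure lies somewhere in (lo, hi]."""
            shape, scale = np.exp(params)                 # optimise on the log scale to keep both > 0
            p = weibull_cdf(hi, shape, scale) - weibull_cdf(lo, shape, scale)
            return -np.sum(np.log(np.clip(p, 1e-300, None)))

        # Hypothetical inspection data: failures detected between consecutive inspection times (hours).
        lo = np.array([0.0, 100.0, 100.0, 200.0, 200.0, 300.0])
        hi = np.array([100.0, 200.0, 200.0, 300.0, 300.0, 400.0])

        fit = minimize(neg_loglik, x0=np.log([1.5, 250.0]), args=(lo, hi), method="Nelder-Mead")
        shape_hat, scale_hat = np.exp(fit.x)
        print(shape_hat, scale_hat)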

  7. Maximum likelihood estimation of finite mixture model for economic data

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes; finite mixture models are also known as latent class models or unsupervised learning models. Recently, fitting finite mixture models by maximum likelihood estimation has drawn considerable attention from statisticians. The main reason is that maximum likelihood estimation is a powerful statistical method which provides consistent estimates as the sample size increases to infinity. Thus, maximum likelihood estimation is used in the present paper to fit a finite mixture model in order to explore the relationship between nonlinear economic data. In this paper, a two-component normal mixture model is fitted by maximum likelihood estimation in order to investigate the relationship between stock market price and rubber price for the sampled countries. The results indicate a negative relationship between rubber price and stock market price for Malaysia, Thailand, the Philippines and Indonesia.
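
    A minimal sketch of maximum likelihood estimation for a two-component univariate normal mixture via the EM algorithm, with simulated data standing in for the economic series; the initialisation, data, and function name are illustrative, not the procedure used in the paper.

        import numpy as np

        def em_two_component_normal(x, n_iter=200):
            """EM algorithm for a two-component univariate normal mixture (maximum likelihood)."""
            x = np.asarray(x, dtype=float)
            w = 0.5                                           # crude initialisation
            mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
            var = np.array([x.var(), x.var()])
            for _ in range(n_iter):
                # E-step: posterior probability that each point belongs to component 1.
                d1 = np.exp(-(x - mu[0]) ** 2 / (2 * var[0])) / np.sqrt(2 * np.pi * var[0])
                d2 = np.exp(-(x - mu[1]) ** 2 / (2 * var[1])) / np.sqrt(2 * np.pi * var[1])
                r = w * d1 / (w * d1 + (1 - w) * d2)
                # M-step: update mixing weight, means and variances.
                w = r.mean()
                mu = np.array([np.sum(r * x) / np.sum(r), np.sum((1 - r) * x) / np.sum(1 - r)])
                var = np.array([np.sum(r * (x - mu[0]) ** 2) / np.sum(r),
                                np.sum((1 - r) * (x - mu[1]) ** 2) / np.sum(1 - r)])
            return w, mu, var

        rng = np.random.default_rng(0)
        data = np.concatenate([rng.normal(-1.0, 0.5, 300), rng.normal(2.0, 0.8, 200)])
        print(em_two_component_normal(data))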

  8. Low-complexity approximations to maximum likelihood MPSK modulation classification

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2004-01-01

    We present a new approximation to the maximum likelihood classifier to discriminate between M-ary and M'-ary phase-shift keying transmitted on an additive white Gaussian noise (AWGN) channel and received noncoherently, partially coherently, or coherently.

  9. Improved maximum likelihood reconstruction of complex multi-generational pedigrees.

    PubMed

    Sheehan, Nuala A; Bartlett, Mark; Cussens, James

    2014-11-01

    The reconstruction of pedigrees from genetic marker data is relevant to a wide range of applications. Likelihood-based approaches aim to find the pedigree structure that gives the highest probability to the observed data. Existing methods either entail an exhaustive search and are hence restricted to small numbers of individuals, or they take a more heuristic approach and deliver a solution that will probably have high likelihood but is not guaranteed to be optimal. By encoding the pedigree learning problem as an integer linear program we can exploit efficient optimisation algorithms to construct pedigrees guaranteed to have maximal likelihood for the standard situation where we have complete marker data at unlinked loci and segregation of genes from parents to offspring is Mendelian. Previous work demonstrated efficient reconstruction of pedigrees of up to about 100 individuals. The modified method that we present here is not so restricted: we demonstrate its applicability with simulated data on a real human pedigree structure of over 1600 individuals. It also compares well with a very competitive approximate approach in terms of solving time and accuracy. In addition to identifying a maximum likelihood pedigree, we can obtain any number of pedigrees in decreasing order of likelihood. This is useful for assessing the uncertainty of a maximum likelihood solution and permits model averaging over high likelihood pedigrees when this would be appropriate. More importantly, when the solution is not unique, as will often be the case for large pedigrees, it enables investigation into the properties of maximum likelihood pedigree estimates which has not been possible up to now. Crucially, we also have a means of assessing the behaviour of other approximate approaches which all aim to find a maximum likelihood solution. Our approach hence allows us to properly address the question of whether a reasonably high likelihood solution that is easy to obtain is practically as

  10. A dual formulation of a penalized maximum likelihood x-ray CT reconstruction problem

    NASA Astrophysics Data System (ADS)

    Xu, Jingyan; Taguchi, Katsuyuki; Gullberg, Grant T.; Tsui, Benjamin M. W.

    2009-02-01

    This work studies the dual formulation of a penalized maximum likelihood reconstruction problem in x-ray CT. The primal objective function is a Poisson log-likelihood combined with a weighted cross-entropy penalty term. The dual formulation of the primal optimization problem is then derived and the optimization procedure outlined. The dual formulation better exploits the structure of the problem, which translates to faster convergence of iterative reconstruction algorithms. A gradient descent algorithm is implemented for solving the dual problem and its performance is compared with the filtered back-projection algorithm and with the primal formulation optimized by using surrogate functions. The 3D XCAT phantom and an analytical x-ray CT simulator are used to generate noise-free and noisy CT projection data sets with monochromatic and polychromatic x-ray spectra. The reconstructed images from the dual formulation delineate the internal structures at early iterations better than the primal formulation using surrogate functions. However, the body contour is slower to converge in the dual than in the primal formulation. The dual formulation demonstrates a better noise-resolution tradeoff near the internal organs than the primal formulation. Since the surrogate functions in general can provide a diagonal approximation of the Hessian matrix of the objective function, further convergence speed-up may be achieved by deriving the surrogate function of the dual objective function.

  11. A notion of graph likelihood and an infinite monkey theorem

    NASA Astrophysics Data System (ADS)

    Banerji, Christopher R. S.; Mansour, Toufik; Severini, Simone

    2014-01-01

    We play with a graph-theoretic analogue of the folklore infinite monkey theorem. We define a notion of graph likelihood as the probability that a given graph is constructed by a monkey in a number of time steps equal to the number of vertices. We present an algorithm to compute this graph invariant and closed formulas for some infinite classes. We have to leave the computational complexity of the likelihood as an open problem.

  12. Sedimentological analysis using geophysical well logs

    SciTech Connect

    Izotova, T.S. )

    1993-09-01

    The application of geophysical well logs in sedimentology and stratigraphic prospecting holds great promise in solving a number of geological problems. A suite of logs provides data on a wide range of rock properties: vertical and lateral variation of resistivity, natural polarization, natural and induced radioactivity, shear strength, and acoustic properties. Each of these properties is controlled by the depositional environment of the sediments and their later diagenesis. The attention of geologists and geophysicists is drawn to new techniques in the interpretation of geophysical well logs for exploration, appraisal, and development of oil and gas fields. The relationship between geophysical logs and depositional environments is explored. Bulk composition, rock structure, and texture and facies variation can be quantified by electric log parameters. Also, the possibility of using logs to demonstrate long- and short-period sedimentary cycles is demonstrated. Methods of sedimentological analysis using geophysical well logs are demonstrated. The importance of a genetic approach in the interpretation of geological sequences and paleogeological reconstructions is emphasized using examples taken from oil and gas prospecting operations in the Ukraine.

  13. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS

    SciTech Connect

    Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.; Næss, S. K.; Seljebotn, D. S.; Górski, K. M.; Huey, G.; Jewell, J. B.; Rocha, G.; Wehus, I. K.

    2013-11-10

    We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl_C, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations in the transition region is negligible in terms of cosmological parameters for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
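
    The identity underlying the banded approximation, that the joint distribution of a chain-structured set of variables factorizes into uni- and bivariate marginals, can be checked numerically on a toy discrete chain; the sketch below does so for three variables and is purely illustrative, not part of the CMB pipeline.

        import numpy as np

        rng = np.random.default_rng(1)
        K = 3                                             # number of discrete states per variable

        # A three-variable chain x1 -> x2 -> x3 with random initial and transition probabilities.
        p1 = rng.dirichlet(np.ones(K))
        T12 = rng.dirichlet(np.ones(K), size=K)           # T12[i, j] = p(x2 = j | x1 = i)
        T23 = rng.dirichlet(np.ones(K), size=K)

        joint = p1[:, None, None] * T12[:, :, None] * T23[None, :, :]

        # Uni- and bivariate marginals.
        p12 = joint.sum(axis=2)
        p23 = joint.sum(axis=0)
        p2 = joint.sum(axis=(0, 2))

        # Reconstruction using only pairwise marginals: p(x1,x2,x3) = p(x1,x2) p(x2,x3) / p(x2).
        recon = p12[:, :, None] * p23[None, :, :] / p2[None, :, None]
        print(np.allclose(joint, recon))                  # True: the banded structure is exact here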

  14. Parametric likelihood inference for interval censored competing risks data

    PubMed Central

    Hudgens, Michael G.; Li, Chenxi

    2014-01-01

    Parametric estimation of the cumulative incidence function (CIF) is considered for competing risks data subject to interval censoring. Existing parametric models of the CIF for right censored competing risks data are adapted to the general case of interval censoring. Maximum likelihood estimators for the CIF are considered under the assumed models, extending earlier work on nonparametric estimation. A simple naive likelihood estimator is also considered that utilizes only part of the observed data. The naive estimator enables separate estimation of models for each cause, unlike full maximum likelihood in which all models are fit simultaneously. The naive likelihood is shown to be valid under mixed case interval censoring, but not under an independent inspection process model, in contrast with full maximum likelihood which is valid under both interval censoring models. In simulations, the naive estimator is shown to perform well and yield comparable efficiency to the full likelihood estimator in some settings. The methods are applied to data from a large, recent randomized clinical trial for the prevention of mother-to-child transmission of HIV. PMID:24400873

  15. Recognizing Patterns In Log-Polar Coordinates

    NASA Technical Reports Server (NTRS)

    Weiman, Carl F. R.

    1992-01-01

    The Log-Hough transform is the basis of an improved method for recognition of patterns, particularly straight lines, in noisy images. It takes advantage of the rotational and scale invariance of the mapping from Cartesian to log-polar coordinates, and offers economy of representation and computation. Unification of the iconic and Hough domains simplifies computations in recognition and eliminates the erroneous quantization of slopes attributable to the finite spacing of the Cartesian coordinate grid of the classical Hough transform. The method is equally efficient at recognizing curves. The Log-Hough transform is more amenable to massively parallel computing architectures than the traditional Cartesian Hough transform, and its "in-place" nature makes it possible to apply local pixel-neighborhood processing.
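
    The underlying coordinate mapping is simple to state: a point (x, y) maps to (log r, theta). A minimal sketch with illustrative values, showing why scaling and rotation become plain shifts in log-polar coordinates, which is the invariance the Log-Hough approach exploits; this is not the NASA implementation.

        import numpy as np

        def to_log_polar(x, y):
            """Map Cartesian coordinates to log-polar coordinates (log radius, angle)."""
            r = np.hypot(x, y)
            return np.log(r), np.arctan2(y, x)

        # Scaling the pattern by s and rotating it by a only shifts the log-polar coordinates:
        # log-rho shifts by log(s) and theta shifts by a.
        x, y = 3.0, 4.0
        s, a = 2.0, np.pi / 6
        xr = s * (x * np.cos(a) - y * np.sin(a))
        yr = s * (x * np.sin(a) + y * np.cos(a))
        print(to_log_polar(x, y))
        print(to_log_polar(xr, yr))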

  16. Nonblocking and orphan free message logging protocols

    NASA Technical Reports Server (NTRS)

    Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith

    1992-01-01

    Currently existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery, yet it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead as compared to one implemented for a system in which messages may be lost but processes do not crash.

  17. Logged On for Learning. An "Education Week" Special Report.

    ERIC Educational Resources Information Center

    West, Peter

    1995-01-01

    This document and accompanying disk contain all of the articles from an "Education Week" special report examining the complex phenomenon commonly known as the "information highway" and how it affects the nation's schools. The articles, all by Peter West, include: (1) "Logged On for Learning," an overview of the report; (2) "Wired for the Future,"…

  18. 14 CFR 60.20 - Logging FSTD discrepancies.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Logging FSTD discrepancies. 60.20 Section 60.20 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.20...

  19. Maximum likelihood resampling of noisy, spatially correlated data

    NASA Astrophysics Data System (ADS)

    Goff, J.; Jenkins, C.

    2005-12-01

    In any geologic application, noisy data are sources of consternation for researchers, inhibiting interpretability and marring images with unsightly and unrealistic artifacts. Filtering is the typical solution to dealing with noisy data. However, filtering commonly suffers from ad hoc (i.e., uncalibrated, ungoverned) application, which runs the risk of erasing high variability components of the field in addition to the noise components. We present here an alternative to filtering: a newly developed methodology for correcting noise in data by finding the "best" value given the data value, its uncertainty, and the data values and uncertainties at proximal locations. The motivating rationale is that data points that are close to each other in space cannot differ by "too much", where how much is "too much" is governed by the field correlation properties. Data with large uncertainties will frequently violate this condition, and in such cases need to be corrected, or "resampled." The best solution for resampling is determined by the maximum of the likelihood function defined by the intersection of two probability density functions (pdf): (1) the data pdf, with mean and variance determined by the data value and square uncertainty, respectively, and (2) the geostatistical pdf, whose mean and variance are determined by the kriging algorithm applied to proximal data values. A Monte Carlo sampling of the data probability space eliminates non-uniqueness, and weights the solution toward data values with lower uncertainties. A test with a synthetic data set sampled from a known field demonstrates quantitatively and qualitatively the improvement provided by the maximum likelihood resampling algorithm. The method is also applied to three marine geology/geophysics data examples: (1) three generations of bathymetric data on the New Jersey shelf with disparate data uncertainties; (2) mean grain size data from the Adriatic Sea, which is a combination of both analytic (low uncertainty
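
    The core of the resampling step, intersecting the data pdf with the geostatistical pdf, has a closed form when both are Gaussian: the maximum-likelihood value is the inverse-variance-weighted mean. A minimal sketch with hypothetical numbers follows; the full method additionally uses kriging to supply the geostatistical mean and variance and a Monte Carlo sampling of the data probability space, neither of which is reproduced here.

        import numpy as np

        def combine_gaussians(mu_data, var_data, mu_krig, var_krig):
            """Peak of the product of the data pdf and the geostatistical (kriging) pdf.

            For two Gaussians the maximum-likelihood value is the inverse-variance-weighted mean.
            """
            w_data = 1.0 / var_data
            w_krig = 1.0 / var_krig
            mu = (w_data * mu_data + w_krig * mu_krig) / (w_data + w_krig)
            var = 1.0 / (w_data + w_krig)
            return mu, var

        # A noisy sounding of 42.0 m (large uncertainty) is pulled toward the kriged estimate of 40.1 m.
        print(combine_gaussians(mu_data=42.0, var_data=4.0, mu_krig=40.1, var_krig=0.25))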

  20. On Muthen's Maximum Likelihood for Two-Level Covariance Structure Models

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Hayashi, Kentaro

    2005-01-01

    Data in social and behavioral sciences are often hierarchically organized. Special statistical procedures that take into account the dependence of such observations have been developed. Among procedures for 2-level covariance structure analysis, Muthen's maximum likelihood (MUML) has the advantage of easier computation and faster convergence. When…

  1. Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures

    ERIC Educational Resources Information Center

    Atar, Burcu; Kamata, Akihito

    2011-01-01

    The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…

  2. Out-of-atlas likelihood estimation using multi-atlas segmentation

    PubMed Central

    Asman, Andrew J.; Chambless, Lola B.; Thompson, Reid C.; Landman, Bennett A.

    2013-01-01

    Purpose: Multi-atlas segmentation has been shown to be highly robust and accurate across an extraordinary range of potential applications. However, it is limited to the segmentation of structures that are anatomically consistent across a large population of potential target subjects (i.e., multi-atlas segmentation is limited to “in-atlas” applications). Herein, the authors propose a technique to determine the likelihood that a multi-atlas segmentation estimate is representative of the problem at hand, and, therefore, identify anomalous regions that are not well represented within the atlases. Methods: The authors derive a technique to estimate the out-of-atlas (OOA) likelihood for every voxel in the target image. These estimated likelihoods can be used to determine and localize the probability of an abnormality being present on the target image. Results: Using a collection of manually labeled whole-brain datasets, the authors demonstrate the efficacy of the proposed framework on two distinct applications. First, the authors demonstrate the ability to accurately and robustly detect malignant gliomas in the human brain—an aggressive class of central nervous system neoplasms. Second, the authors demonstrate how this OOA likelihood estimation process can be used within a quality control context for diffusion tensor imaging datasets to detect large-scale imaging artifacts (e.g., aliasing and image shading). Conclusions: The proposed OOA likelihood estimation framework shows great promise for robust and rapid identification of brain abnormalities and imaging artifacts using only weak dependencies on anomaly morphometry and appearance. The authors envision that this approach would allow for application-specific algorithms to focus directly on regions of high OOA likelihood, which would (1) reduce the need for human intervention, and (2) reduce the propensity for false positives. Using the dual perspective, this technique would allow for algorithms to focus on

  3. Analysis of neighborhood dynamics of forest ecosystems using likelihood methods and modeling.

    PubMed

    Canham, Charles D; Uriarte, María

    2006-02-01

    Advances in computing power in the past 20 years have led to a proliferation of spatially explicit, individual-based models of population and ecosystem dynamics. In forest ecosystems, the individual-based models encapsulate an emerging theory of "neighborhood" dynamics, in which fine-scale spatial interactions regulate the demography of component tree species. The spatial distribution of component species, in turn, regulates spatial variation in a whole host of community and ecosystem properties, with subsequent feedbacks on component species. The development of these models has been facilitated by development of new methods of analysis of field data, in which critical demographic rates and ecosystem processes are analyzed in terms of the spatial distributions of neighboring trees and physical environmental factors. The analyses are based on likelihood methods and information theory, and they allow a tight linkage between the models and explicit parameterization of the models from field data. Maximum likelihood methods have a long history of use for point and interval estimation in statistics. In contrast, likelihood principles have only more gradually emerged in ecology as the foundation for an alternative to traditional hypothesis testing. The alternative framework stresses the process of identifying and selecting among competing models, or in the simplest case, among competing point estimates of a parameter of a model. There are four general steps involved in a likelihood analysis: (1) model specification, (2) parameter estimation using maximum likelihood methods, (3) model comparison, and (4) model evaluation. Our goal in this paper is to review recent developments in the use of likelihood methods and modeling for the analysis of neighborhood processes in forest ecosystems. We will focus on a single class of processes, seed dispersal and seedling dispersion, because recent papers provide compelling evidence of the potential power of the approach, and illustrate
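
    A minimal sketch of the four-step workflow on synthetic data: a hypothetical seedling-density model is specified, fit by maximum likelihood, and compared against a constant-mean alternative using AIC. The dispersal kernel, data, and parameter names are illustrative assumptions, not taken from the studies reviewed.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        distance = rng.uniform(1.0, 30.0, 200)
        # Hypothetical seedling densities whose mean decays with distance from a parent tree.
        density = 5.0 * np.exp(-0.15 * distance) + rng.normal(0.0, 0.3, 200)

        def neg_loglik(params, model):
            # Step 1, model specification; step 2, likelihood of the data under the model.
            if model == "exponential":
                a, b, sigma = params
                mu = a * np.exp(-b * distance)
            else:                                          # "null": constant mean density
                a, sigma = params
                mu = np.full_like(distance, a)
            return -np.sum(norm.logpdf(density, loc=mu, scale=np.abs(sigma)))

        fit_exp = minimize(neg_loglik, x0=[1.0, 0.1, 1.0], args=("exponential",), method="Nelder-Mead")
        fit_null = minimize(neg_loglik, x0=[1.0, 1.0], args=("null",), method="Nelder-Mead")

        # Step 3, model comparison: smaller AIC indicates better support given the data.
        aic_exp = 2 * 3 + 2 * fit_exp.fun
        aic_null = 2 * 2 + 2 * fit_null.fun
        print(aic_exp, aic_null)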

  4. MAIL LOG, program theory, volume 1. [Scout project automatic data system

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.

  5. A NEW LOG EVALUATION METHOD TO APPRAISE MESAVERDE RE-COMPLETION OPPORTUNITES

    SciTech Connect

    Albert Greer

    2002-09-11

    This report covers the information presented at the technical progress review. The following topics were discussed: (1) Structure of the study area dataset. (2) A visual illustration of the typical logs. (3) Crossplots developed and evaluated for patterns that might bear a relationship to initial producing rate. (4) Development of fuzzy curves to rank the goodness of the statistical properties of the logs used to construct the crossplots. (5) Neural network correlations developed with statistical properties of the logs.

  6. Maximum likelihood estimation for cytogenetic dose-response curves

    SciTech Connect

    Frome, E.L; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d²], where t is the time and d is the dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous-exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
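
    A minimal sketch of a Poisson maximum likelihood fit for an acute (single, brief) exposure, where the expected dicentric yield per cell reduces to the linear-quadratic form αd + βd². The dose points and counts are hypothetical, constant terms of the log-likelihood are dropped, and this is not the estimation procedure described in the report.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical acute-exposure data: dose (Gy), cells scored, dicentrics observed.
        dose = np.array([0.5, 1.0, 2.0, 3.0, 4.0])
        cells = np.array([2000, 1500, 1000, 800, 500])
        dicentrics = np.array([35, 70, 150, 230, 260])

        def neg_loglik(params):
            """Poisson likelihood for the linear-quadratic yield alpha*d + beta*d^2 per cell.

            Additive constants (log factorials) are dropped since they do not affect the fit.
            """
            alpha, beta = np.exp(params)                   # keep both coefficients positive
            lam = cells * (alpha * dose + beta * dose**2)  # expected dicentrics at each dose point
            return np.sum(lam - dicentrics * np.log(lam))

        fit = minimize(neg_loglik, x0=np.log([0.02, 0.05]), method="Nelder-Mead")
        alpha_hat, beta_hat = np.exp(fit.x)
        print(alpha_hat, beta_hat)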

  7. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-01-30

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and a latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  8. Logging-while-coring method and apparatus

    DOEpatents

    Goldberg, David S.; Myers, Gregory J.

    2007-11-13

    A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and a latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.

  9. Optimal message log reclamation for uncoordinated checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. K.

    1994-01-01

    Uncoordinated checkpointing for message-passing systems allows maximum process autonomy and general nondeterministic execution, but suffers from potential domino effect and the large space overhead for maintaining checkpoints and message logs. Traditionally, it has been assumed that only obsolete checkpoints and message logs before the global recovery line can be garbage-collected. Recently, an approach to identifying all garbage checkpoints based on recovery line transformation and decomposition has been developed. We show in this paper that the same approach can be applied to the problem of identifying all garbage message logs for systems requiring message logging to record in-transit messages. Communication trace-driven simulation for several parallel programs is used to evaluate the proposed algorithm.

  10. Sisyphus - An Event Log Analysis Toolset

    2004-09-01

    Event logs are a ubiquitous source of system feedback from computer systems, but they vary widely in format and can be extremely numerous, particularly on systems with many logging components. Inspection of these logs is fundamental to system debugging; an increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failures). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.

  11. RIAPROG: logit-log radioimmunoassay data processing

    SciTech Connect

    Faden, V.B.; Huston, J.C. Jr.; Munson, P.J.; Rodbard, D.

    1980-03-01

    This document provides the user's guide and program listings for the 1980 version of RIAPROG. RIAPROG is a program for weighted least squares linear regression using the logit-log model for analysis of RIA results.
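
    A minimal sketch of the logit-log calibration idea follows, assuming hypothetical standards data and an illustrative weighting scheme; it is not RIAPROG's actual algorithm or weighting function.

```python
import numpy as np

# Hypothetical RIA standards: analyte concentration and fraction bound (B/B0).
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
b_b0 = np.array([0.92, 0.85, 0.74, 0.55, 0.40, 0.27, 0.12])

y = np.log(b_b0 / (1.0 - b_b0))     # logit of the bound fraction
x = np.log(conc)                    # log dose

# Illustrative weights, largest where the response is most informative (assumed form).
w = (b_b0 * (1.0 - b_b0)) ** 2

# Weighted least squares for y = a + b*x via the normal equations.
X = np.column_stack([np.ones_like(x), x])
W = np.diag(w)
a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(f"intercept a = {a:.3f}, slope b = {b:.3f}")

# Dose of an unknown sample read back from its logit response (assumed usage).
logit_unknown = 0.0                 # corresponds to B/B0 = 0.5
print(f"estimated dose = {np.exp((logit_unknown - a) / b):.2f}")
```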

  12. Expansion of industrial logging in Central Africa.

    PubMed

    Laporte, Nadine T; Stabach, Jared A; Grosch, Robert; Lin, Tiffany S; Goetz, Scott J

    2007-06-01

    Industrial logging has become the most extensive land use in Central Africa, with more than 600,000 square kilometers (30%) of forest currently under concession. With use of a time series of satellite imagery for the period from 1976 to 2003, we measured 51,916 kilometers of new logging roads. The density of roads across the forested region was 0.03 kilometer per square kilometer, but areas of Gabon and Equatorial Guinea had values over 0.09 kilometer per square kilometer. A new frontier of logging expansion was identified within the Democratic Republic of Congo, which contains 63% of the remaining forest of the region. Tree felling and skid trails increased disturbance in selectively logged areas. PMID:17556578

  13. Applying time series analysis to performance logs

    NASA Astrophysics Data System (ADS)

    Kubacki, Marcin; Sosnowski, Janusz

    2015-09-01

    Contemporary computer systems provide mechanisms for monitoring various performance parameters (e.g. processor or memory usage, disc or network transfers), which are collected and stored in performance logs. An important issue is to derive characteristic features describing normal and abnormal behavior of the systems. For this purpose we use various schemes of analyzing time series. They have been adapted to the specificity of performance logs and verified using data collected from real systems. The presented approach is useful in evaluating system dependability.
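
    As a rough illustration of this kind of analysis, the sketch below derives rolling-window features from a synthetic performance log and flags samples that deviate strongly from recent behavior; the data, window size, and threshold are assumptions, not the schemes used in the paper.

```python
import numpy as np
import pandas as pd

# Synthetic performance log: one CPU-usage sample per minute (format is assumed).
idx = pd.date_range("2015-09-01", periods=1440, freq="min")
cpu = pd.Series(30 + 5 * np.random.randn(1440), index=idx)
cpu.iloc[900:910] += 40                       # injected abnormal burst

# Rolling statistics as simple characteristic features of "normal" behavior.
window = 60
mean = cpu.rolling(window).mean()
std = cpu.rolling(window).std()

# Flag samples more than 3 rolling standard deviations away from the rolling mean.
anomalies = cpu[(cpu - mean).abs() > 3 * std]
print(anomalies.head())
```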

  14. DOE/Simplec magnetic susceptibility logging system

    SciTech Connect

    Emilia, D.A.; Allen, J.W.; Chessmore, R.B.; Wilson, R.B.

    1981-03-01

    A magnetic susceptibility logging system has been developed which is relatively stable under normal field logging conditions and which produces logs that accurately represent in situ variations in magnetic susceptibility. However, both field and laboratory tests indicate the need for further improvement of temperature stabilization and bridge compensation; a new generation system designed by Simplec may fill that need. A cubic granite block with a magnetic susceptibility of 385 μCGS is designated as the primary calibration standard and is available for public use at the DOE facility in Grand Junction, Colorado. Models are also available for characterization of magnetic susceptibility systems. These include models to provide borehole size correction factors, thin layer interpretation parameters, reproducibility limits, longitudinal resolution, and radius of investigation. The DOE/Simplec system has a 99-percent radius of investigation, approximately equal to the coil length (15 inches), and a 99-percent thickness of investigation, approximately equal to two coil lengths. The true magnetic susceptibility and thickness of isolated layers less than two coil lengths in thickness can be obtained through use of parameters measured from their log responses. Field tests show that the system has a reproducibility of at least 5 μCGS and that logging at 25 ft/min is a good compromise between speed of operation and keeping the probe on the sidewall. Comparison of log and core magnetic susceptibility measurements confirms the fact that the logging system samples a rather large volume and that interpretive aids are necessary to unfold the true variation of magnetic susceptibility with depth. Finally, logs from known uranium-favorable environments show that magnetic susceptibility measurements can give an indication of the degree of geochemical alteration, which is one of the uranium-favorable haloes sought by exploration geologists.

  15. 32 CFR 700.845 - Maintenance of logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Maintenance of logs. 700.845 Section 700.845... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...

  16. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  17. Web Logs in the English Classroom: More Than Just Chat.

    ERIC Educational Resources Information Center

    Richardson, Will

    2003-01-01

    Details the use and appeal of Web logs to enhance classroom discussion and allow for outside involvement in the classroom. Defines a Web log, addresses discussing literature in a Web log, and describes the author's first attempts at using Web-log technology. Presents considerations for using Web logs as part of classroom instruction. (SG)

  18. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  19. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  20. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 1 2014-07-01 2013-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  1. 29 CFR 42.7 - Complaint/directed action logs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 1 2011-07-01 2011-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...

  2. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  3. CLARET user's manual: Mainframe Logs. Revision 1

    SciTech Connect

    Frobose, R.H.

    1984-11-12

    CLARET (Computer Logging and RETrieval) is a stand-alone PDP 11/23 system that can support 16 terminals. It provides a forms-oriented front end by which operators enter online activity logs for the Lawrence Livermore National Laboratory's OCTOPUS computer network. The logs are stored on the PDP 11/23 disks for later retrieval, and hardcopy reports are generated both automatically and upon request. Online viewing of the current logs is provided to management. As each day's logs are completed, the information is automatically sent to a CRAY and included in an online database system. The terminal used for the CLARET system is a dual-port Hewlett Packard 2626 terminal that can be used as either the CLARET logging station or as an independent OCTOPUS terminal. Because this is a stand-alone system, it does not depend on the availability of the OCTOPUS network to run and, in the event of a power failure, can be brought up independently.

  4. The Likelihood of Recent Record Warmth

    NASA Astrophysics Data System (ADS)

    Mann, Michael E.; Rahmstorf, Stefan; Steinman, Byron A.; Tingley, Martin; Miller, Sonya K.

    2016-01-01

    2014 was nominally the warmest year on record for both the globe and northern hemisphere based on historical records spanning the past one and a half centuries [1,2]. It was the latest in a recent run of record temperatures spanning the past decade and a half. Press accounts reported odds as low as one-in-650 million that the observed run of global temperature records would be expected to occur in the absence of human-caused global warming. Press reports notwithstanding, the question of how likely observed temperature records may have been both with and without human influence is interesting in its own right. Here we attempt to address that question using a semi-empirical approach that combines the latest (CMIP5 [3]) climate model simulations with observations of global and hemispheric mean temperature. We find that individual record years and the observed runs of record-setting temperatures were extremely unlikely to have occurred in the absence of human-caused climate change, though not nearly as unlikely as press reports have suggested. These same record temperatures were, by contrast, quite likely to have occurred in the presence of anthropogenic climate forcing.

  5. The Likelihood of Recent Record Warmth

    PubMed Central

    Mann, Michael E.; Rahmstorf, Stefan; Steinman, Byron A.; Tingley, Martin; Miller, Sonya K.

    2016-01-01

    2014 was nominally the warmest year on record for both the globe and northern hemisphere based on historical records spanning the past one and a half centuries [1,2]. It was the latest in a recent run of record temperatures spanning the past decade and a half. Press accounts reported odds as low as one-in-650 million that the observed run of global temperature records would be expected to occur in the absence of human-caused global warming. Press reports notwithstanding, the question of how likely observed temperature records may have been both with and without human influence is interesting in its own right. Here we attempt to address that question using a semi-empirical approach that combines the latest (CMIP5 [3]) climate model simulations with observations of global and hemispheric mean temperature. We find that individual record years and the observed runs of record-setting temperatures were extremely unlikely to have occurred in the absence of human-caused climate change, though not nearly as unlikely as press reports have suggested. These same record temperatures were, by contrast, quite likely to have occurred in the presence of anthropogenic climate forcing. PMID:26806092

  6. Race of source effects in the elaboration likelihood model.

    PubMed

    White, P H; Harkins, S G

    1994-11-01

    In a series of experiments, we investigated the effect of race of source on persuasive communications in the Elaboration Likelihood Model (R.E. Petty & J.T. Cacioppo, 1981, 1986). In Experiment 1, we found no evidence that White participants responded to a Black source as a simple negative cue. Experiment 2 suggested the possibility that exposure to a Black source led to low-involvement message processing. In Experiments 3 and 4, a distraction paradigm was used to test this possibility, and it was found that participants under low involvement were highly motivated to process a message presented by a Black source. In Experiment 5, we found that attitudes toward the source's ethnic group, rather than violations of expectancies, accounted for this processing effect. Taken together, the results of these experiments are consistent with S.L. Gaertner and J.F. Dovidio's (1986) theory of aversive racism, which suggests that Whites, because of a combination of egalitarian values and underlying negative racial attitudes, are very concerned about not appearing unfavorable toward Blacks, leading them to be highly motivated to process messages presented by a source from this group.

  7. Likelihood of nitrogen condensation in Titan's present-day atmosphere

    NASA Astrophysics Data System (ADS)

    Tokano, Tetsuya

    2016-04-01

    The temperature in Titan's upper troposphere measured by the Huygens Probe is relatively close to the nitrogen condensation point. This poses the question as to whether seasonal nitrogen condensation might occur on Titan analogously to seasonal carbon dioxide condensation on Mars. The likelihood of nitrogen condensation in Titan's atmosphere is investigated using tropospheric temperature data obtained by Cassini radio occultations and simulation with a general circulation model (GCM). The observed tropospheric temperature generally decreases towards both poles but does not reach the nitrogen condensation point anywhere. However, Cassini may not have sounded the coldest season and area in Titan's troposphere. The GCM simulation shows that in the upper troposphere the variable solar distance associated with Saturn's orbital eccentricity has a larger impact on the seasonal polar temperature variation than the variable solar declination associated with Saturn's obliquity. In the upper troposphere relevant for nitrogen condensation the annual minimum polar temperature is predicted to occur around the northern autumnal equinox, approximately one season after aphelion. This temperature is then 1-2 K lower than in the season of the Cassini/Huygens mission. It is possible if not certain that some nitrogen condensation with cloud formation occurs in the northern and southern polar region in the upper troposphere around the northern autumnal equinox. Under the present orbital parameters of Saturn and Titan nitrogen condensation may occur more frequently near the south pole than near the north pole.

  8. Carbon emissions performance of commercial logging in East Kalimantan, Indonesia.

    PubMed

    Griscom, Bronson; Ellis, Peter; Putz, Francis E

    2014-03-01

    Adoption of reduced-impact logging (RIL) methods could reduce CO2 emissions by 30-50% across at least 20% of remaining tropical forests. We developed two cost effective and robust indices for comparing the climate benefits (reduced CO2 emissions) due to RIL. The indices correct for variability in the volume of commercial timber among concessions. We determined that a correction for variability in terrain slope was not needed. We found that concessions certified by the Forest Stewardship Council (FSC, N = 3), when compared with noncertified concessions (N = 6), did not have lower overall CO2 emissions from logging activity (felling, skidding, and hauling). On the other hand, FSC certified concessions did have lower emissions from one type of logging impact (skidding), and we found evidence of a range of improved practices using other field metrics. One explanation of these results may be that FSC criteria and indicators, and associated RIL practices, were not designed to achieve overall emissions reductions. Also, commonly used field metrics are not reliable proxies for overall logging emissions performance. Furthermore, the simple distinction between certified and noncertified concessions does not fully represent the complex history of investments in improved logging practices. To clarify the relationship between RIL and emissions reductions, we propose the more explicit term 'RIL-C' to refer to the subset of RIL practices that can be defined by quantified thresholds and that result in measurable emissions reductions. If tropical forest certification is to be linked with CO2 emissions reductions, certification standards need to explicitly require RIL-C practices.

  9. MARGINAL EMPIRICAL LIKELIHOOD AND SURE INDEPENDENCE FEATURE SCREENING

    PubMed Central

    Chang, Jinyuan; Tang, Cheng Yong; Wu, Yichao

    2013-01-01

    We study a marginal empirical likelihood approach in scenarios when the number of variables grows exponentially with the sample size. The marginal empirical likelihood ratios as functions of the parameters of interest are systematically examined, and we find that the marginal empirical likelihood ratio evaluated at zero can be used to differentiate whether an explanatory variable is contributing to a response variable or not. Based on this finding, we propose a unified feature screening procedure for linear models and the generalized linear models. Different from most existing feature screening approaches that rely on the magnitudes of some marginal estimators to identify true signals, the proposed screening approach is capable of further incorporating the level of uncertainties of such estimators. Such a merit inherits the self-studentization property of the empirical likelihood approach, and extends the insights of existing feature screening methods. Moreover, we show that our screening approach is less restrictive to distributional assumptions, and can be conveniently adapted to be applied in a broad range of scenarios such as models specified using general moment conditions. Our theoretical results and extensive numerical examples by simulations and data analysis demonstrate the merits of the marginal empirical likelihood approach. PMID:24415808
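
    As a rough sketch of the screening idea (not the authors' exact procedure), the code below computes, for each covariate, the empirical likelihood ratio for the hypothesis that the marginal estimating function x_ij * y_i has mean zero, and ranks covariates by it; the simulated data and the choice of estimating function are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq

def marginal_el_ratio(z):
    """-2 log empirical likelihood ratio for H0: E[z] = 0 (scalar constraint)."""
    z = np.asarray(z, dtype=float)
    if z.min() >= 0 or z.max() <= 0:
        return np.inf                      # zero lies outside the convex hull of z
    lo = -1.0 / z.max() + 1e-10            # feasible range for the Lagrange multiplier
    hi = -1.0 / z.min() - 1e-10
    lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))

# Screening: rank covariates by the marginal EL ratio of z_ij = x_ij * y_i at zero.
rng = np.random.default_rng(0)
n, p = 200, 1000
X = rng.standard_normal((n, p))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.standard_normal(n)   # only columns 0 and 3 matter
scores = np.array([marginal_el_ratio(X[:, j] * y) for j in range(p)])
print("top-ranked covariates:", np.argsort(scores)[::-1][:5])
```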

  10. A Generalized, Likelihood-Free Method for Posterior Estimation

    PubMed Central

    Turner, Brandon M.; Sederberg, Per B.

    2014-01-01

    Recent advancements in Bayesian modeling have allowed for likelihood-free posterior estimation. Such estimation techniques are crucial to the understanding of simulation-based models, whose likelihood functions may be difficult or even impossible to derive. However, current approaches are limited by their dependence on sufficient statistics and/or tolerance thresholds. In this article, we provide a new approach that requires no summary statistics, error terms, or thresholds, and is generalizable to all models in psychology that can be simulated. We use our algorithm to fit a variety of cognitive models with known likelihood functions to ensure the accuracy of our approach. We then apply our method to two real-world examples to illustrate the types of complex problems our method solves. In the first example, we fit an error-correcting criterion model of signal detection, whose criterion dynamically adjusts after every trial. We then fit two models of choice response time to experimental data: the Linear Ballistic Accumulator model, which has a known likelihood, and the Leaky Competing Accumulator model whose likelihood is intractable. The estimated posterior distributions of the two models allow for direct parameter interpretation and model comparison by means of conventional Bayesian statistics – a feat that was not previously possible. PMID:24258272
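
    In the same spirit, though not the authors' algorithm, a bare-bones likelihood-free sketch is to approximate the likelihood of a simulation-based model with a kernel density estimate built from simulator draws; the toy simulator and settings below are assumptions purely for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

def simulate(theta, n, rng):
    """Stand-in simulator whose likelihood we pretend is intractable."""
    return theta + rng.standard_normal(n)

def approx_log_like(theta, observed, rng, n_sims=5000):
    sims = simulate(theta, n_sims, rng)
    kde = gaussian_kde(sims)                     # density estimate over simulated outputs
    return np.sum(np.log(kde(observed) + 1e-300))

rng = np.random.default_rng(5)
observed = simulate(1.3, 200, rng)
grid = np.linspace(0, 3, 31)
ll = [approx_log_like(t, observed, rng) for t in grid]
print("approximate MLE:", grid[int(np.argmax(ll))])
```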

  11. Constraint likelihood analysis for a network of gravitational wave detectors

    SciTech Connect

    Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.

    2005-12-15

    We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach. In the case of an aligned pair of detectors, the detection statistic depends on the cross correlation between the detectors as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary detector orientations. The method allows reconstruction of the source coordinates and the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method.

  12. One-step Sparse Estimates in Nonconcave Penalized Likelihood Models.

    PubMed

    Zou, Hui; Li, Runze

    2008-08-01

    Fan & Li (2001) propose a family of variable selection methods via penalized likelihood using concave penalty functions. The nonconcave penalized likelihood estimators enjoy the oracle properties, but maximizing the penalized likelihood function is computationally challenging, because the objective function is nondifferentiable and nonconcave. In this article we propose a new unified algorithm based on the local linear approximation (LLA) for maximizing the penalized likelihood for a broad class of concave penalty functions. Convergence and other theoretical properties of the LLA algorithm are established. A distinguished feature of the LLA algorithm is that at each LLA step, the LLA estimator can naturally adopt a sparse representation. Thus we suggest using the one-step LLA estimator from the LLA algorithm as the final estimates. Statistically, we show that if the regularization parameter is appropriately chosen, the one-step LLA estimates enjoy the oracle properties with good initial estimators. Computationally, the one-step LLA estimation methods dramatically reduce the computational cost in maximizing the nonconcave penalized likelihood. We conduct Monte Carlo simulations to assess the finite sample performance of the one-step sparse estimation methods. The results are very encouraging.
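
    A minimal sketch of the one-step LLA idea for the linear model is given below: take an initial least-squares estimate, convert it into SCAD-derivative weights, and solve the resulting weighted-L1 problem by proximal gradient descent. The simulated data, tuning constants, and the use of ISTA as the inner solver are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    """Derivative of the SCAD penalty, used as the LLA weights."""
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def one_step_lla(X, y, lam, n_iter=500):
    """One-step LLA: a weighted lasso built around an initial least-squares estimate."""
    n, p = X.shape
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]     # initial estimator
    w = scad_deriv(beta0, lam)                       # per-coefficient L1 weights
    L = np.linalg.norm(X, 2) ** 2 / n                # Lipschitz constant of the gradient
    beta = beta0.copy()
    for _ in range(n_iter):                          # proximal gradient (ISTA)
        grad = X.T @ (X @ beta - y) / n
        z = beta - grad / L
        beta = np.sign(z) * np.maximum(np.abs(z) - w / L, 0.0)
    return beta

rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.standard_normal((n, p))
true = np.zeros(p); true[:3] = [3.0, -2.0, 1.5]
y = X @ true + rng.standard_normal(n)
print(np.round(one_step_lla(X, y, lam=0.2), 2))
```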

  13. Efficient Strategies for Calculating Blockwise Likelihoods Under the Coalescent.

    PubMed

    Lohse, Konrad; Chmelik, Martin; Martin, Simon H; Barton, Nicholas H

    2016-02-01

    The inference of demographic history from genome data is hindered by a lack of efficient computational approaches. In particular, it has proved difficult to exploit the information contained in the distribution of genealogies across the genome. We have previously shown that the generating function (GF) of genealogies can be used to analytically compute likelihoods of demographic models from configurations of mutations in short sequence blocks (Lohse et al. 2011). Although the GF has a simple, recursive form, the size of such likelihood calculations explodes quickly with the number of individuals and applications of this framework have so far been mainly limited to small samples (pairs and triplets) for which the GF can be written by hand. Here we investigate several strategies for exploiting the inherent symmetries of the coalescent. In particular, we show that the GF of genealogies can be decomposed into a set of equivalence classes that allows likelihood calculations from nontrivial samples. Using this strategy, we automated blockwise likelihood calculations for a general set of demographic scenarios in Mathematica. These histories may involve population size changes, continuous migration, discrete divergence, and admixture between multiple populations. To give a concrete example, we calculate the likelihood for a model of isolation with migration (IM), assuming two diploid samples without phase and outgroup information. We demonstrate the new inference scheme with an analysis of two individual butterfly genomes from the sister species Heliconius melpomene rosina and H. cydno. PMID:26715666

  14. Exclusion probabilities and likelihood ratios with applications to kinship problems.

    PubMed

    Slooten, Klaas-Jan; Egeland, Thore

    2014-05-01

    In forensic genetics, DNA profiles are compared in order to make inferences, paternity cases being a standard example. The statistical evidence can be summarized and reported in several ways. For example, in a paternity case, the likelihood ratio (LR) and the probability of not excluding a random man as father (RMNE) are two common summary statistics. There has been a long debate on the merits of the two statistics, also in the context of DNA mixture interpretation, and no general consensus has been reached. In this paper, we show that the RMNE is a certain weighted average of inverse likelihood ratios. This is true in any forensic context. We show that the likelihood ratio in favor of the correct hypothesis is, in expectation, bigger than the reciprocal of the RMNE probability. However, with the exception of pathological cases, it is also possible to obtain smaller likelihood ratios. We illustrate this result for paternity cases. Moreover, some theoretical properties of the likelihood ratio for a large class of general pairwise kinship cases, including expected value and variance, are derived. The practical implications of the findings are discussed and exemplified.

  15. Efficient Strategies for Calculating Blockwise Likelihoods Under the Coalescent

    PubMed Central

    Lohse, Konrad; Chmelik, Martin; Martin, Simon H.; Barton, Nicholas H.

    2016-01-01

    The inference of demographic history from genome data is hindered by a lack of efficient computational approaches. In particular, it has proved difficult to exploit the information contained in the distribution of genealogies across the genome. We have previously shown that the generating function (GF) of genealogies can be used to analytically compute likelihoods of demographic models from configurations of mutations in short sequence blocks (Lohse et al. 2011). Although the GF has a simple, recursive form, the size of such likelihood calculations explodes quickly with the number of individuals and applications of this framework have so far been mainly limited to small samples (pairs and triplets) for which the GF can be written by hand. Here we investigate several strategies for exploiting the inherent symmetries of the coalescent. In particular, we show that the GF of genealogies can be decomposed into a set of equivalence classes that allows likelihood calculations from nontrivial samples. Using this strategy, we automated blockwise likelihood calculations for a general set of demographic scenarios in Mathematica. These histories may involve population size changes, continuous migration, discrete divergence, and admixture between multiple populations. To give a concrete example, we calculate the likelihood for a model of isolation with migration (IM), assuming two diploid samples without phase and outgroup information. We demonstrate the new inference scheme with an analysis of two individual butterfly genomes from the sister species Heliconius melpomene rosina and H. cydno. PMID:26715666

  16. Integrating borehole logs and aquifer tests in aquifer characterization

    USGS Publications Warehouse

    Paillet, Frederick L.; Reese, R.S.

    2000-01-01

    Integration of lithologic logs, geophysical logs, and hydraulic tests is critical in characterizing heterogeneous aquifers. Typically only a limited number of aquifer tests can be performed, and these need to be designed to provide hydraulic properties for the principal aquifers in the system. This study describes the integration of logs and aquifer tests in the development of a hydrostratigraphic model for the surficial aquifer system in and around Big Cypress National Preserve in eastern Collier County, Florida. Borehole flowmeter tests provide qualitative permeability profiles in most of 26 boreholes drilled in the study area. Flow logs indicate the depth of transmissive units, which are correlated across the study area. Comparison to published studies in adjacent areas indicates that the main limestone aquifer of the Tamiami Formation in the study area corresponds with the gray limestone aquifer in western Dade County and the water table and lower Tamiami Aquifer in western Collier County. Four strategically located, multiwell aquifer tests are used to quantify the qualitative permeability profiles provided by the flowmeter log analysis. The hydrostratigraphic model based on these results defines the main aquifer in the central part of the study area as unconfined to semiconfined with a transmissivity as high as 30,000 m²/day. The aquifer decreases in transmissivity to less than 10,000 m²/day in some parts of western Collier County, and becomes confined to the east and northeast of the study area, where transmissivity decreases to below 5,000 m²/day.

  17. Discriminating between Weibull distributions and log-normal distributions emerging in branching processes

    NASA Astrophysics Data System (ADS)

    Goh, Segun; Kwon, H. W.; Choi, M. Y.

    2014-06-01

    We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.
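
    A minimal maximum-likelihood discrimination sketch along these lines: fit both candidate distributions to the same sample and compare their maximized log-likelihoods. The data here are synthetic, and fixing the location parameter at zero is an assumption.

```python
import numpy as np
from scipy import stats

# Synthetic sample; in practice these would be sizes from a branching/division process.
rng = np.random.default_rng(2)
data = rng.weibull(1.5, size=2000) * 3.0

# Maximum-likelihood fits with the location fixed at zero for both families.
wb_shape, _, wb_scale = stats.weibull_min.fit(data, floc=0)
ln_shape, _, ln_scale = stats.lognorm.fit(data, floc=0)

ll_wb = np.sum(stats.weibull_min.logpdf(data, wb_shape, 0, wb_scale))
ll_ln = np.sum(stats.lognorm.logpdf(data, ln_shape, 0, ln_scale))

# Positive values favour the Weibull, negative the log-normal (same number of parameters).
print("log-likelihood difference (Weibull - lognormal):", ll_wb - ll_ln)
```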

  18. Well log characterization of natural gas hydrates

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.

    2011-01-01

    In the last 25 years we have seen significant advancements in the use of downhole well logging tools to acquire detailed information on the occurrence of gas hydrate in nature: From an early start of using wireline electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells drilled in Arctic permafrost environments to today where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. The most established and well known use of downhole log data in gas hydrate research is the use of electrical resistivity and acoustic velocity data (both compressional- and shear-wave data) to make estimates of gas hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. New downhole logging tools designed to make directionally oriented acoustic and propagation resistivity log measurements have provided the data needed to analyze the acoustic and electrical anisotropic properties of both highly inter-bedded and fracture dominated gas hydrate reservoirs. Advancements in nuclear-magnetic-resonance (NMR) logging and wireline formation testing have also allowed for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free-water along with clay and capillary bound water) in gas-hydrate-bearing reservoirs. Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature along with data on gas hydrate reservoir properties (i.e., permeabilities) needed to accurately predict gas production rates for various gas hydrate

  19. Factors Influencing the Intended Likelihood of Exposing Sexual Infidelity.

    PubMed

    Kruger, Daniel J; Fisher, Maryanne L; Fitzgerald, Carey J

    2015-08-01

    There is a considerable body of literature on infidelity within romantic relationships. However, there is a gap in the scientific literature on factors influencing the likelihood of uninvolved individuals exposing sexual infidelity. Therefore, we devised an exploratory study examining a wide range of potentially relevant factors. Based in part on evolutionary theory, we anticipated nine potential domains or types of influences on the likelihoods of exposing or protecting cheaters, including kinship, strong social alliances, financial support, previous relationship behaviors (including infidelity and abuse), potential relationship transitions, stronger sexual and emotional aspects of the extra-pair relationship, and disease risk. The pattern of results supported these predictions (N = 159 men, 328 women). In addition, there appeared to be a small positive bias for participants to report infidelity when provided with any additional information about the situation. Overall, this study contributes a broad initial description of factors influencing the predicted likelihood of exposing sexual infidelity and encourages further studies in this area.

  20. Maximum-likelihood estimation of gene location by linkage disequilibrium

    SciTech Connect

    Hill, W.G. ); Weir, B.S. )

    1994-04-01

    Linkage disequilibrium, D, between a polymorphic disease and mapped markers can, in principle, be used to help find the map position of the disease gene. Likelihoods are therefore derived for the value of D conditional on the observed number of haplotypes in the sample and on the population parameter Nc, where N is the effective population size and c the recombination fraction between the disease and marker loci. The likelihood is computed explicitly for the case of two loci with heterozygote superiority and, more generally, by computer simulations assuming a steady state of constant population size and selective pressures or neutrality. It is found that the likelihood is, in general, not very dependent on the degree of selection at the loci and is very flat. This suggests that precise information on map position will not be obtained from estimates of linkage disequilibrium. 15 refs., 5 figs., 21 tabs.

  1. Forecasting populations of undiscovered oil fields with the log-Pareto distribution

    SciTech Connect

    Bettini, C.

    1987-01-01

    The search for a statistical representation of parent populations of oil field sizes in sedimentary basins has yielded a new probability distribution, termed the log-Pareto distribution. The log-Pareto law, related to the log-Pareto distribution, describes the medians of oil field sizes (either volumes or areas) associated with size-ranks in a parent population of fields. The name proposed for the new distribution stems from the assumption that an algebraic transformation involving logarithms of the volumes (or areas) of a parent population of oil fields yields a frequency distribution of transformed sizes that accords with the Pareto distribution. The derivation of the log-Pareto law and the log-Pareto distribution involved two steps, namely, (1) simulating a parent population of oil fields such that the population is conditioned on both the proportion of a basin's area occupied by fields and their geographic variability, and (2) finding a mathematical function that fits the log-log plot of field ranks versus simulated field areas. Both the log-Pareto law and log-Pareto distribution are useful for estimating the size distribution of the oil fields that remain to be discovered in a basin. A log-Pareto law fitted to the range of the largest fields discovered in maturely explored basins tends to underestimate the subpopulation of undiscovered fields, while Pareto's law tends to overestimate the undiscovered subpopulation. Therefore, the log-Pareto law is more appropriate for risk-averse decision makers, and the Pareto law for risk-seeking decision makers. The log-Pareto law has been validated with both simulated data and actual data from Nigeria, the Denver-Julesburg Basin, and the Campos Basin in Brazil.

  2. LogCauchy, log-sech and lognormal distributions of species abundances in forest communities

    USGS Publications Warehouse

    Yin, Z.-Y.; Peng, S.-L.; Ren, H.; Guo, Q.; Chen, Z.-H.

    2005-01-01

    Species-abundance (SA) pattern is one of the most fundamental aspects of biological community structure, providing important information regarding species richness, species-area relation and succession. To better describe the SA distribution (SAD) in a community, based on the widely used lognormal (LN) distribution model with exp(-x²) roll-off on Preston's octave scale, this study proposed two additional models, logCauchy (LC) and log-sech (LS), respectively with roll-offs of simple x⁻² and e⁻ˣ. The estimation of the theoretical total number of species in the whole community, S*, including very rare species not yet collected in the sample, was derived from the left-truncation of each distribution. We fitted these three models by Levenberg-Marquardt nonlinear regression and measured the model fit to the data using coefficient of determination of regression, parameters' t-test and distribution's Kolmogorov-Smirnov (KS) test. Examining the SA data from six forest communities (five in lower subtropics and one in tropics), we found that: (1) on a log scale, all three models that are bell-shaped and left-truncated statistically adequately fitted the observed SADs, and the LC and LS did better than the LN; (2) from each model and for each community the S* values estimated by the integral and summation methods were almost equal, allowing us to estimate S* using a simple integral formula and to estimate its asymptotic confidence intervals by regression of a transformed model containing it; (3) following the order of LC, LS, and LN, the fitted distributions became lower in the peak, less concave in the side, and shorter in the tail, and overall the LC tended to overestimate, the LN tended to underestimate, while the LS was intermediate but slightly tended to underestimate, the observed SADs (particularly the number of common species in the right tail); (4) the six communities had some similar structural properties such as following similar distribution models, having a common
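
    To make the fitting step concrete, the sketch below fits three bell-shaped curves on a Preston octave scale by Levenberg-Marquardt nonlinear regression (scipy's default for curve_fit without bounds). The octave data are hypothetical, and the functional forms are assumed readings of the exp(-x²), x⁻², and e⁻ˣ roll-offs described above, not the paper's exact parameterizations.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical Preston-octave data: octave index and number of species per octave.
octave = np.arange(0, 12)
species = np.array([3, 7, 12, 18, 22, 20, 15, 10, 6, 3, 2, 1])

# Assumed bell-shaped forms on the octave scale (roll-offs as described above).
def lognormal(x, s0, x0, a):  return s0 * np.exp(-((x - x0) / a) ** 2)   # exp(-x^2) roll-off
def logcauchy(x, s0, x0, a):  return s0 / (1.0 + ((x - x0) / a) ** 2)    # x^-2 roll-off
def logsech(x, s0, x0, a):    return s0 / np.cosh((x - x0) / a)          # e^-x tail

for name, f in [("LN", lognormal), ("LC", logcauchy), ("LS", logsech)]:
    p, _ = curve_fit(f, octave, species, p0=[20.0, 4.0, 2.0])   # Levenberg-Marquardt
    rss = np.sum((species - f(octave, *p)) ** 2)
    print(f"{name}: params={np.round(p, 2)}, RSS={rss:.1f}")
```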

  3. Growing optimal scale-free networks via likelihood.

    PubMed

    Small, Michael; Li, Yingying; Stemler, Thomas; Judd, Kevin

    2015-04-01

    Preferential attachment, by which new nodes attach to existing nodes with probability proportional to the existing nodes' degree, has become the standard growth model for scale-free networks, where the asymptotic probability of a node having degree k is proportional to k^{-γ}. However, the motivation for this model is entirely ad hoc. We use exact likelihood arguments and show that the optimal way to build a scale-free network is to attach most new links to nodes of low degree. Curiously, this leads to a scale-free network with a single dominant hub: a starlike structure we call a superstar network. Asymptotically, the optimal strategy is to attach each new node to one of the nodes of degree k with probability proportional to 1/[N + ζ(γ)(k+1)^γ] (in an N-node network): a stronger bias toward high degree nodes than exhibited by standard preferential attachment. Our algorithm generates optimally scale-free networks (the superstar networks) as well as randomly sampling the space of all scale-free networks with a given degree exponent γ. We generate viable realizations with finite N for 1≪γ<2 as well as γ>2. We observe an apparently discontinuous transition at γ≈2 between so-called superstar networks and more treelike realizations. Gradually increasing γ further leads to reemergence of a superstar hub. To quantify these structural features, we derive a new analytic expression for the expected degree exponent of a pure preferential attachment process and introduce alternative measures of network entropy. Our approach is generic and can also be applied to an arbitrary degree distribution.

  4. Growing optimal scale-free networks via likelihood

    NASA Astrophysics Data System (ADS)

    Small, Michael; Li, Yingying; Stemler, Thomas; Judd, Kevin

    2015-04-01

    Preferential attachment, by which new nodes attach to existing nodes with probability proportional to the existing nodes' degree, has become the standard growth model for scale-free networks, where the asymptotic probability of a node having degree k is proportional to k^{-γ}. However, the motivation for this model is entirely ad hoc. We use exact likelihood arguments and show that the optimal way to build a scale-free network is to attach most new links to nodes of low degree. Curiously, this leads to a scale-free network with a single dominant hub: a starlike structure we call a superstar network. Asymptotically, the optimal strategy is to attach each new node to one of the nodes of degree k with probability proportional to 1/[N + ζ(γ)(k+1)^γ] (in an N-node network): a stronger bias toward high degree nodes than exhibited by standard preferential attachment. Our algorithm generates optimally scale-free networks (the superstar networks) as well as randomly sampling the space of all scale-free networks with a given degree exponent γ. We generate viable realizations with finite N for 1 ≪ γ < 2 as well as γ > 2. We observe an apparently discontinuous transition at γ ≈ 2 between so-called superstar networks and more treelike realizations. Gradually increasing γ further leads to reemergence of a superstar hub. To quantify these structural features, we derive a new analytic expression for the expected degree exponent of a pure preferential attachment process and introduce alternative measures of network entropy. Our approach is generic and can also be applied to an arbitrary degree distribution.
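
    For illustration, the sketch below grows a network using one plausible reading of the attachment rule above, with each new node attaching to an existing node of degree k with weight 1/[N + ζ(γ)(k+1)^γ]; the interpretation, the starting graph, and the parameters are assumptions rather than the authors' published algorithm.

```python
import numpy as np
from scipy.special import zeta

def grow_superstar(n_nodes, gamma, rng):
    """Grow a network by attaching each new node with the low-degree-biased rule above."""
    degree = [1, 1]                        # start from a single edge between nodes 0 and 1
    edges = [(0, 1)]
    for new in range(2, n_nodes):
        k = np.array(degree, dtype=float)
        w = 1.0 / (len(degree) + zeta(gamma) * (k + 1.0) ** gamma)  # attachment weights
        target = int(rng.choice(len(degree), p=w / w.sum()))
        edges.append((new, target))
        degree[target] += 1
        degree.append(1)
    return edges, degree

rng = np.random.default_rng(3)
_, deg = grow_superstar(5000, gamma=2.5, rng=rng)
print("maximum degree (hub size):", max(deg))
```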

  5. Fluid-Rock Characterization and Interactions in NMR Well Logging

    SciTech Connect

    George J. Hirasaki; Kishore K. Mohanty

    2005-09-05

    The objective of this report is to characterize the fluid properties and fluid-rock interactions that are needed for formation evaluation by NMR well logging. The advances made in the understanding of NMR fluid properties are summarized in a chapter written for an AAPG book on NMR well logging. This includes live oils, viscous oils, natural gas mixtures, and the relation between relaxation time and diffusivity. Oil based drilling fluids can have an adverse effect on NMR well logging if it alters the wettability of the formation. The effect of various surfactants on wettability and surface relaxivity are evaluated for silica sand. The relation between the relaxation time and diffusivity distinguishes the response of brine, oil, and gas in a NMR well log. A new NMR pulse sequence in the presence of a field gradient and a new inversion technique enables the T2 and diffusivity distributions to be displayed as a two-dimensional map. The objectives of pore morphology and rock characterization are to identify vug connectivity by using X-ray CT scan, and to improve NMR permeability correlation. Improved estimation of permeability from NMR response is possible by using estimated tortuosity as a parameter to interpolate between two existing permeability models.

  6. A small-diameter NMR logging tool for groundwater investigations.

    PubMed

    Walsh, David; Turner, Peter; Grunewald, Elliot; Zhang, Hong; Butler, James J; Reboulet, Ed; Knobbe, Steve; Christy, Tom; Lane, John W; Johnson, Carole D; Munday, Tim; Fitzpatrick, Andrew

    2013-01-01

    A small-diameter nuclear magnetic resonance (NMR) logging tool has been developed and field tested at various sites in the United States and Australia. A novel design approach has produced relatively inexpensive, small-diameter probes that can be run in open or PVC-cased boreholes as small as 2 inches in diameter. The complete system, including surface electronics and various downhole probes, has been successfully tested in small-diameter monitoring wells in a range of hydrogeological settings. A variant of the probe that can be deployed by a direct-push machine has also been developed and tested in the field. The new NMR logging tool provides reliable, direct, and high-resolution information that is of importance for groundwater studies. Specifically, the technology provides direct measurement of total water content (total porosity in the saturated zone or moisture content in the unsaturated zone), and estimates of relative pore-size distribution (bound vs. mobile water content) and hydraulic conductivity. The NMR measurements show good agreement with ancillary data from lithologic logs, geophysical logs, and hydrogeologic measurements, and provide valuable information for groundwater investigations. PMID:23425428

  7. The Atacama Cosmology Telescope: Likelihood for Small-Scale CMB Data

    NASA Technical Reports Server (NTRS)

    Dunkley, J.; Calabrese, E.; Sievers, J.; Addison, G. E.; Battaglia, N.; Battistelli, E. S.; Bond, J. R.; Das, S.; Devlin, M. J.; Dunner, R.; Fowler, J. W.; Gralla, M.; Hajian, A.; Halpern, M.; Hasselfield, M.; Hincks, A. D.; Hlozek, R.; Hughes, J. P.; Irwin, K. D.; Kosowsky, A.; Louis, T.; Marriage, T. A.; Marsden, D.; Menanteau, F.; Niemack, M.

    2013-01-01

    The Atacama Cosmology Telescope has measured the angular power spectra of microwave fluctuations to arcminute scales at frequencies of 148 and 218 GHz, from three seasons of data. At small scales the fluctuations in the primordial Cosmic Microwave Background (CMB) become increasingly obscured by extragalactic foregrounds and secondary CMB signals. We present results from a nine-parameter model describing these secondary effects, including the thermal and kinematic Sunyaev-Zel'dovich (tSZ and kSZ) power; the clustered and Poisson-like power from Cosmic Infrared Background (CIB) sources, and their frequency scaling; the tSZ-CIB correlation coefficient; the extragalactic radio source power; and thermal dust emission from Galactic cirrus in two different regions of the sky. In order to extract cosmological parameters, we describe a likelihood function for the ACT data, fitting this model to the multi-frequency spectra in the multipole range 500 < l < 10000. We extend the likelihood to include spectra from the South Pole Telescope at frequencies of 95, 150, and 220 GHz. Accounting for different radio source levels and Galactic cirrus emission, the same model provides an excellent fit to both datasets simultaneously, with χ²/dof = 675/697 for ACT, and 96/107 for SPT. We then use the multi-frequency likelihood to estimate the CMB power spectrum from ACT in bandpowers, marginalizing over the secondary parameters. This provides a simplified 'CMB-only' likelihood in the range 500 < l < 3500 for use in cosmological parameter estimation.

  8. The Atacama Cosmology Telescope: likelihood for small-scale CMB data

    SciTech Connect

    Dunkley, J.; Calabrese, E.; Sievers, J.; Addison, G.E.; Halpern, M.; Battaglia, N.; Battistelli, E.S.; Bond, J.R.; Hajian, A.; Hincks, A.D.; Das, S.; Devlin, M.J.; Dünner, R.; Fowler, J.W.; Irwin, K.D.; Gralla, M.; Hasselfield, M.; Hlozek, R.; Hughes, J.P.; Kosowsky, A.; and others

    2013-07-01

    The Atacama Cosmology Telescope has measured the angular power spectra of microwave fluctuations to arcminute scales at frequencies of 148 and 218 GHz, from three seasons of data. At small scales the fluctuations in the primordial Cosmic Microwave Background (CMB) become increasingly obscured by extragalactic foregrounds and secondary CMB signals. We present results from a nine-parameter model describing these secondary effects, including the thermal and kinematic Sunyaev-Zel'dovich (tSZ and kSZ) power; the clustered and Poisson-like power from Cosmic Infrared Background (CIB) sources, and their frequency scaling; the tSZ-CIB correlation coefficient; the extragalactic radio source power; and thermal dust emission from Galactic cirrus in two different regions of the sky. In order to extract cosmological parameters, we describe a likelihood function for the ACT data, fitting this model to the multi-frequency spectra in the multipole range 500 < l < 10000. We extend the likelihood to include spectra from the South Pole Telescope at frequencies of 95, 150, and 220 GHz. Accounting for different radio source levels and Galactic cirrus emission, the same model provides an excellent fit to both datasets simultaneously, with χ²/dof = 675/697 for ACT, and 96/107 for SPT. We then use the multi-frequency likelihood to estimate the CMB power spectrum from ACT in bandpowers, marginalizing over the secondary parameters. This provides a simplified 'CMB-only' likelihood in the range 500 < l < 3500 for use in cosmological parameter estimation.
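
    The core of such a likelihood is a Gaussian comparison of binned bandpowers with a CMB-plus-secondaries model. The sketch below is a generic toy version of that construction (toy spectra, diagonal covariance, invented amplitudes), not the ACT/SPT likelihood itself.

```python
import numpy as np

# Generic Gaussian bandpower likelihood sketch: -2 ln L = r^T C^{-1} r + ln det C,
# with r = data - (CMB + secondary model). All inputs below are hypothetical.
def neg2_log_like(bandpowers, model_cmb, model_secondary, cov):
    r = bandpowers - (model_cmb + model_secondary)
    chol = np.linalg.cholesky(cov)
    z = np.linalg.solve(chol, r)                  # solves chol @ z = r
    logdet = 2.0 * np.sum(np.log(np.diag(chol)))
    return z @ z + logdet

nb = 50
ell = np.linspace(500, 10000, nb)
cmb = 2000.0 * np.exp(-ell / 2000.0)              # toy damped CMB spectrum
fg = 5.0 * (ell / 3000.0) ** 2                    # toy Poisson-like foreground term
cov = np.diag((0.05 * (cmb + fg)) ** 2)           # 5% diagonal bandpower errors
data = cmb + fg + np.random.default_rng(4).multivariate_normal(np.zeros(nb), cov)
print("-2 ln L at the true model:", neg2_log_like(data, cmb, fg, cov))
```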

  9. Learned predictions of error likelihood in the anterior cingulate cortex.

    PubMed

    Brown, Joshua W; Braver, Todd S

    2005-02-18

    The anterior cingulate cortex (ACC) and the related medial wall play a critical role in recruiting cognitive control. Although ACC exhibits selective error and conflict responses, it has been unclear how these develop and become context-specific. With use of a modified stop-signal task, we show from integrated computational neural modeling and neuroimaging studies that ACC learns to predict error likelihood in a given context, even for trials in which there is no error or response conflict. These results support a more general error-likelihood theory of ACC function based on reinforcement learning, of which conflict and error detection are special cases.
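
    As a toy illustration of the reinforcement-learning idea (not the authors' computational neural model), a simple delta rule can learn context-specific error likelihoods from trial outcomes; the contexts, error rates, and learning rate below are assumptions.

```python
import numpy as np

# Minimal delta-rule sketch of context-specific error-likelihood learning.
rng = np.random.default_rng(6)
true_error_rate = {"low": 0.1, "high": 0.5}      # probability of an error in each context
predicted = {"low": 0.0, "high": 0.0}            # learned error-likelihood predictions
alpha = 0.1                                      # learning rate

for _ in range(2000):
    ctx = rng.choice(["low", "high"])
    error = rng.random() < true_error_rate[ctx]  # whether an error occurred on this trial
    predicted[ctx] += alpha * (float(error) - predicted[ctx])

print(predicted)   # approaches the true context-specific error rates
```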

  10. Approximated maximum likelihood estimation in multifractal random walks

    NASA Astrophysics Data System (ADS)

    Løvsletten, O.; Rypdal, M.

    2012-04-01

    We present an approximated maximum likelihood method for the multifractal random walk processes of [E. Bacry et al., Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation in the dependency structure for the latent volatility. The procedure is implemented as a package in the R computer language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices.

  11. Nonparametric maximum likelihood estimation for the multisample Wicksell corpuscle problem

    PubMed Central

    Chan, Kwun Chuen Gary; Qin, Jing

    2016-01-01

    We study nonparametric maximum likelihood estimation for the distribution of spherical radii using samples containing a mixture of one-dimensional, two-dimensional biased and three-dimensional unbiased observations. Since direct maximization of the likelihood function is intractable, we propose an expectation-maximization algorithm for implementing the estimator, which handles an indirect measurement problem and a sampling bias problem separately in the E- and M-steps, and circumvents the need to solve an Abel-type integral equation, which creates numerical instability in the one-sample problem. Extensions to ellipsoids are studied and connections to multiplicative censoring are discussed. PMID:27279657

  12. Logjam: A scalable unified log file archiver

    SciTech Connect

    Cardo, Nicholas P.

    2001-08-01

    Log files are a necessary record of events on any system. However, as systems scale, so does the volume of data captured. To complicate matters, this data can be distributed across all nodes within the system. This creates challenges in ways to obtain these files as well as archiving them in a consistent manner. It has become commonplace to develop a custom written utility for each system that is tailored specifically to that system. For computer centers that contain multiple systems, each system would have their own respective utility for gathering and archiving log files. Each time a new log file is produced, a modification to the utility is necessary. With each modification, risks of errors could be introduced as well as spending time to introduce that change. This is precisely the purpose of logjam. Once installed, the code only requires modification when new features are required. A configuration file is used to identify each log file as well as where to harvest it and how to archive it. Adding a new log file is as simple as defining it in a configuration file and testing can be performed in the production environment.

  13. Logjam: A scalable unified log file archiver

    2001-08-01

    Log files are a necessary record of events on any system. However, as systems scale, so does the volume of data captured. To complicate matters, this data can be distributed across all nodes within the system. This creates challenges in ways to obtain these files as well as archiving them in a consistent manner. It has become commonplace to develop a custom written utility for each system that is tailored specifically to that system. For computer centers that contain multiple systems, each system would have their own respective utility for gathering and archiving log files. Each time a new log file is produced, a modification to the utility is necessary. With each modification, risks of errors could be introduced as well as spending time to introduce that change. This is precisely the purpose of logjam. Once installed, the code only requires modification when new features are required. A configuration file is used to identify each log file as well as where to harvest it and how to archive it. Adding a new log file is as simple as defining it in a configuration file and testing can be performed in the production environment.
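
    A minimal sketch of the configuration-driven pattern described above might look like the following; the config keys, paths, and gzip archiving policy are hypothetical and are not logjam's actual file format.

```python
import gzip
import shutil
import time
from pathlib import Path

# Hypothetical config in the spirit described above: each entry names a log file,
# where to harvest it from, and where to archive it (not logjam's actual format).
CONFIG = [
    {"name": "syslog", "source": "/var/log/syslog", "archive": "/archive/syslog"},
    {"name": "scheduler", "source": "/var/log/scheduler.log", "archive": "/archive/sched"},
]

def archive_logs(config):
    stamp = time.strftime("%Y%m%d")
    for entry in config:
        src = Path(entry["source"])
        if not src.exists():                      # skip logs absent on this node
            continue
        dest_dir = Path(entry["archive"])
        dest_dir.mkdir(parents=True, exist_ok=True)
        dest = dest_dir / f"{entry['name']}.{stamp}.gz"
        with src.open("rb") as fin, gzip.open(dest, "wb") as fout:
            shutil.copyfileobj(fin, fout)         # compress while copying into the archive

archive_logs(CONFIG)
```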

  14. 76 FR 42130 - Agency Information Collection Activities: BioWatch Filter Holder Log

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-18

    ... collection request (ICR) in the Federal Register on May 2, 2011 at 76 FR 24504, for a 60-day public comment... SECURITY Agency Information Collection Activities: BioWatch Filter Holder Log AGENCY: Department of... sample analysis. A standard filter log form is completed for each sample and is archived by the...

  15. 16 CFR Figure 9 to Subpart A of... - Flux Profile Data Log Format

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Flux Profile Data Log Format 9 Figure 9 to Subpart A of Part 1209 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT... to Subpart A of Part 1209—Flux Profile Data Log Format EC03OC91.039...

  16. Nuclear Well Log Properties of Natural Gas Hydrate Reservoirs

    NASA Astrophysics Data System (ADS)

    Burchwell, A.; Cook, A.

    2015-12-01

    Characterizing gas hydrate in a reservoir typically involves a full suite of geophysical well logs. The most common method involves using resistivity measurements to quantify the decrease in electrically conductive water when replaced with gas hydrate. Compressional velocity measurements are also used because the gas hydrate significantly strengthens the moduli of the sediment. At many gas hydrate sites, nuclear well logs, which include the photoelectric effect, formation sigma, carbon/oxygen ratio and neutron porosity, are also collected but often not used. In fact, the nuclear response of a gas hydrate reservoir is not known. In this research we will focus on the nuclear log response in gas hydrate reservoirs at the Mallik Field at the Mackenzie Delta, Northwest Territories, Canada, and the Gas Hydrate Joint Industry Project Leg 2 sites in the northern Gulf of Mexico. Nuclear logs may add increased robustness to the investigation into the properties of gas hydrates and some types of logs may offer an opportunity to distinguish between gas hydrate and permafrost. For example, a true formation sigma log measures the thermal neutron capture cross section of a formation and pore constituents; it is especially sensitive to hydrogen and chlorine in the pore space. Chlorine has a high absorption potential, and is used to determine the amount of saline water within pore spaces. Gas hydrate offers a difference in elemental composition compared to water-saturated intervals. Thus, in permafrost areas, the carbon/oxygen ratio may vary between gas hydrate and permafrost, due to the increase of carbon in gas hydrate accumulations. At the Mallik site, we observe a hydrate-bearing sand (1085-1107 m) above a water-bearing sand (1107-1140 m), which was confirmed through core samples and mud gas analysis. We observe a decrease in the photoelectric absorption of ~0.5 barns/e-, as well as an increase in the formation sigma readings of ~5 capture units in the water-bearing sand as

  17. Carbon emissions from tropical forest degradation caused by logging

    NASA Astrophysics Data System (ADS)

    Pearson, Timothy R. H.; Brown, Sandra; Casarim, Felipe M.

    2014-03-01

    The focus of land-use related efforts in developing countries to reduce carbon emissions has been on slowing deforestation, yet international agreements are to reduce emissions from both deforestation and forest degradation (REDD). The second ‘D’ is poorly understood and poorly accounted for, for a number of technical and policy reasons. Here we introduce a complete accounting method for estimating emission factors from selective timber harvesting, a substantial form of forest degradation in many tropical developing countries. The method accounts separately for emissions from the extracted log, from incidental damage to the surrounding forest, and from logging infrastructure, and emissions are expressed as units of carbon per cubic meter of timber extracted to allow for simple application to timber harvesting statistics. We applied the method in six tropical countries (Belize, Bolivia, Brazil, Guyana, Indonesia, and Republic of Congo), resulting in total emission factors of 0.99-2.33 Mg C m-3. In all cases, emissions were dominated by damage to surrounding vegetation and the infrastructure rather than the logs themselves, and total emissions represented about 3-15% of the biomass carbon stocks of the associated unlogged forests. We then combined the emission factors with country-level logging statistics for nine key timber-producing countries represented by our study areas to gain an understanding of the order of magnitude of emissions from degradation compared to those recently reported for deforestation in the same countries. For the nine countries included, emissions from logging were on average equivalent to about 12% of those from deforestation. For those countries with relatively low emissions from deforestation, emissions from logging were equivalent to half or more of those from deforestation, whereas for those countries with the highest emissions from deforestation, emissions from logging were equivalent to <10% of those from deforestation. Understanding how
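
    A small worked example of applying such per-volume emission factors to harvest statistics; the harvest volume below is hypothetical, and only the 0.99-2.33 Mg C m-3 range comes from the abstract.

      # Worked example of applying the per-volume emission factors reported above.
      # The annual harvest volume is hypothetical; only the emission-factor range
      # (0.99-2.33 Mg C per m^3 of timber extracted) comes from the abstract.
      harvest_m3_per_yr = 2.0e6          # hypothetical national timber harvest, m^3/yr

      for ef in (0.99, 2.33):            # Mg C per m^3 extracted
          emissions_MgC = ef * harvest_m3_per_yr
          emissions_MtCO2 = emissions_MgC * (44.0 / 12.0) / 1e6  # convert C to CO2, Mg to Mt
          print(f"EF = {ef:4.2f} Mg C/m^3 -> {emissions_MgC/1e6:4.2f} Mt C/yr "
                f"({emissions_MtCO2:4.1f} Mt CO2/yr)")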

  18. Unconventional neutron sources for oil well logging

    NASA Astrophysics Data System (ADS)

    Frankle, C. M.; Dale, G. E.

    2013-09-01

    Americium-Beryllium (AmBe) radiological neutron sources have been widely used in the petroleum industry for well logging purposes. There is strong desire on the part of various governmental and regulatory bodies to find alternate sources due to the high activity and small size of AmBe sources. Other neutron sources are available, both radiological (252Cf) and electronic accelerator driven (D-D and D-T). All of these, however, have substantially different neutron energy spectra from AmBe and thus cause significantly different responses in well logging tools. We report on simulations performed using unconventional sources and techniques to attempt to better replicate the porosity and carbon/oxygen ratio responses a well logging tool would see from AmBe neutrons. The AmBe response of these two types of tools is compared to the response from 252Cf, D-D, D-T, filtered D-T, and T-T sources.

  19. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.

  20. Log-rolling block copolymer cylinders

    NASA Astrophysics Data System (ADS)

    Kim, So Youn; Kim, Ye Chan; Kim, Dong Hyup; Kwon, Na Kyung; Register, Richard A.

    Shear has been the most effective method to create long-range order of micro- or nanostructures in soft materials. When shear is applied, soft particles or polymers tend to align along the shear direction to minimize the viscous dissipation, so transverse (so-called "log-rolling") alignment is unfavored. In this study, for the first time we report the transverse alignment of cylinder-forming block copolymers. Poly(styrene-b-methyl methacrylate), PS-PMMA, can form a metastable hemicylinder structure when confined in a thin film, and this hemicylinder structure can align either along the shear direction or transverse to the shear direction ("log-rolling"), depending on the shearing temperature. This unusual "log-rolling" behavior is explained by the different chain mobility of the two blocks in PS-PMMA; the rigidity of the core cylinder is the critical parameter determining the direction of shear alignment.

  1. Relationships between log N-log S and celestial distribution of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Nishimura, J.; Yamagami, T.

    1985-01-01

    The apparent conflict between the log N-log S curve and the isotropic celestial distribution of gamma-ray bursts is discussed. A possible selection effect due to the time profile of each burst is examined. It is shown that the contradiction is due to this selection effect.

  2. LogSafe and Smart: Minnesota OSHA's LogSafe Program Takes Root.

    ERIC Educational Resources Information Center

    Honerman, James

    1999-01-01

    Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators. An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…

  3. Lithologic logs and geophysical logs from test drilling in Palm Beach County, Florida, since 1974

    USGS Publications Warehouse

    Swayze, Leo J.; McGovern, Michael C.; Fischer, John N.

    1980-01-01

    Test-hole data that may be used to determine the hydrogeology of the zone of high permeability in Palm Beach County, Fla., are presented. Lithologic logs from 46 test wells and geophysical logs from 40 test wells are contained in this report. (USGS)

  4. Maximum-likelihood estimation of admixture proportions from genetic data.

    PubMed Central

    Wang, Jinliang

    2003-01-01

    For an admixed population, an important question is how much genetic contribution comes from each parental population. Several methods have been developed to estimate such admixture proportions, using data on genetic markers sampled from parental and admixed populations. In this study, I propose a likelihood method to estimate jointly the admixture proportions, the genetic drift that occurred to the admixed population and each parental population during the period between the hybridization and sampling events, and the genetic drift in each ancestral population within the interval between their split and hybridization. The results from extensive simulations using various combinations of relevant parameter values show that in general much more accurate and precise estimates of admixture proportions are obtained from the likelihood method than from previous methods. The likelihood method also yields reasonable estimates of genetic drift that occurred to each population, which translate into relative effective sizes (N(e)) or absolute average N(e)'s if the times when the relevant events (such as population split, admixture, and sampling) occurred are known. The proposed likelihood method also has features such as relatively low computational requirement compared with previous ones, flexibility for admixture models, and marker types. In particular, it allows for missing data from a contributing parental population. The method is applied to a human data set and a wolflike canids data set, and the results obtained are discussed in comparison with those from other estimators and from previous studies. PMID:12807794

  5. Likelihood-Free Inference in High-Dimensional Models.

    PubMed

    Kousathanas, Athanasios; Leuenberger, Christoph; Helfer, Jonas; Quinodoz, Mathieu; Foll, Matthieu; Wegmann, Daniel

    2016-06-01

    Methods that bypass analytical evaluations of the likelihood function have become an indispensable tool for statistical inference in many fields of science. These so-called likelihood-free methods rely on accepting and rejecting simulations based on summary statistics, which limits them to low-dimensional models for which the value of the likelihood is large enough to result in manageable acceptance rates. To get around these issues, we introduce a novel, likelihood-free Markov chain Monte Carlo (MCMC) method combining two key innovations: updating only one parameter per iteration and accepting or rejecting this update based on subsets of statistics approximately sufficient for this parameter. This increases acceptance rates dramatically, rendering this approach suitable even for models of very high dimensionality. We further derive that for linear models, a one-dimensional combination of statistics per parameter is sufficient and can be found empirically with simulations. Finally, we demonstrate that our method readily scales to models of very high dimensionality, using toy models as well as by jointly inferring the effective population size, the distribution of fitness effects (DFE) of segregating mutations, and selection coefficients for each locus from data of a recent experiment on the evolution of drug resistance in influenza. PMID:27052569
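
    The component-wise, summary-subset acceptance step described above can be illustrated with a toy likelihood-free MCMC sketch; the model (independent normal blocks), tolerance, proposal, and flat prior below are illustrative choices, not the authors' algorithm or data.

      # Minimal sketch of component-wise likelihood-free MCMC in the spirit of the
      # approach described above: one parameter is updated per iteration and the
      # proposal is accepted using a summary statistic tied to that parameter.
      import numpy as np

      rng = np.random.default_rng(0)

      def simulate(theta, n=200):
          """Toy model: each observed block depends on one parameter (its mean)."""
          return rng.normal(loc=theta[:, None], scale=1.0, size=(theta.size, n))

      def summaries(data):
          """One approximately sufficient statistic per parameter: the block mean."""
          return data.mean(axis=1)

      true_theta = np.array([0.5, -1.0, 2.0])
      s_obs = summaries(simulate(true_theta))

      theta = np.zeros_like(true_theta)       # initial state
      eps, step, n_iter = 0.1, 0.5, 5000      # tolerance, proposal scale, iterations
      chain = np.empty((n_iter, theta.size))

      for it in range(n_iter):
          j = it % theta.size                 # update a single coordinate per iteration
          prop = theta.copy()
          prop[j] += step * rng.normal()
          s_prop = summaries(simulate(prop))
          # Accept if the statistic relevant to parameter j matches the data closely
          # (flat prior and symmetric proposal, so the ratio reduces to the ABC kernel).
          if abs(s_prop[j] - s_obs[j]) < eps:
              theta = prop
          chain[it] = theta

      print("posterior means ~", chain[n_iter // 2:].mean(axis=0))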

  6. Effective Family Position and Likelihood of Becoming an Alcoholic.

    ERIC Educational Resources Information Center

    Majumdar, Mahbubon N.; Bhatia, Pritam S.

    1980-01-01

    Discusses effective family position as a new variable developed to measure the effect of birth order and childhood home environment on the likelihood of becoming an alcoholic. Constructs of fixation and regression may also be helpful in differentiating two types of alcoholism. (JAC)

  8. Troubled Conception: Negotiating the Likelihood of Having Children

    ERIC Educational Resources Information Center

    May, Marian

    2007-01-01

    In the context of low fertility and Australia's ageing population, a national longitudinal telephone survey, "Negotiating the Life Course" (NLC), asks women about their childbearing intentions. This paper uses conversation analysis (CA) to examine interaction between an interviewer and respondents on one NLC question about the likelihood of having…

  9. Likelihood Methods for Adaptive Filtering and Smoothing. Technical Report #455.

    ERIC Educational Resources Information Center

    Butler, Ronald W.

    The dynamic linear model or Kalman filtering model provides a useful methodology for predicting the past, present, and future states of a dynamic system, such as an object in motion or an economic or social indicator that is changing systematically with time. Recursive likelihood methods for adaptive Kalman filtering and smoothing are developed.…
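
    As a concrete instance of the dynamic linear model mentioned above, a minimal univariate Kalman filter recursion is sketched below; it assumes known noise variances and is not the report's adaptive, likelihood-based procedure.

      # Minimal univariate Kalman filter for the dynamic linear model
      #   x_t = x_{t-1} + w_t,  w_t ~ N(0, q)     (state / level)
      #   y_t = x_t + v_t,      v_t ~ N(0, r)     (observation)
      # Standard filter recursion only; q and r are assumed known here.
      import numpy as np

      def kalman_filter(y, q=0.1, r=1.0, x0=0.0, p0=10.0):
          x, p = x0, p0
          estimates = []
          for obs in y:
              # Predict
              x_pred, p_pred = x, p + q
              # Update
              k = p_pred / (p_pred + r)          # Kalman gain
              x = x_pred + k * (obs - x_pred)
              p = (1.0 - k) * p_pred
              estimates.append(x)
          return np.array(estimates)

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          truth = np.cumsum(rng.normal(0, 0.3, size=100))      # slowly drifting state
          y = truth + rng.normal(0, 1.0, size=100)             # noisy observations
          print(kalman_filter(y)[-5:])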

  10. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  11. Mixture Rasch Models with Joint Maximum Likelihood Estimation

    ERIC Educational Resources Information Center

    Willse, John T.

    2011-01-01

    This research provides a demonstration of the utility of mixture Rasch models. Specifically, a model capable of estimating a mixture partial credit model using joint maximum likelihood is presented. Like the partial credit model, the mixture partial credit model has the beneficial feature of being appropriate for analysis of assessment data…

  12. Permeability extraction: A sonic log inversion

    SciTech Connect

    Akbar, N.; Kim, J.J.

    1994-12-31

    In this paper the authors provide the missing important link between permeability and acoustic velocities by generating a permeability-dependent synthetic sonic log in a carbonate reservoir. The computations are based on Akbar's theory that relates wave velocity to frequency, rock properties (e.g., lithology, permeability, and porosity), and fluid saturation and properties (viscosity, density, and compressibility). An inverted analytical expression of the theory is used to extract permeability from sonic velocity. The synthetic sonic and the computed permeability are compared with the observed sonic log and with plug permeability, respectively. The results demonstrate, as predicted by theory, that permeability can be related directly to acoustic velocities.

  13. Development of pulsed neutron uranium logging instrument

    SciTech Connect

    Wang, Xin-guang; Liu, Dan; Zhang, Feng

    2015-03-15

    This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of 235U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and that the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.

  14. Compacting a Kentucky coal for quality logs

    SciTech Connect

    Lin, Y.; Li, Z.; Mao, S.

    1999-07-01

    A Kentucky coal was found to be difficult to compact into large, strong logs. The study showed that the compaction parameters affecting the strength of compacted coal logs could be categorized into three groups. The first group is inherent coal properties such as elasticity and coefficient of friction, the second group is machine properties such as mold geometry, and the third group is coal mixture preparation parameters such as particle size distribution. Theoretical analysis showed that an appropriate backpressure can reduce surface cracks occurring during ejection. This was confirmed by the experiments conducted.

  15. Targeted Maximum Likelihood Based Causal Inference: Part I

    PubMed Central

    van der Laan, Mark J.

    2010-01-01

    Given causal graph assumptions, intervention-specific counterfactual distributions of the data can be defined by the so called G-computation formula, which is obtained by carrying out these interventions on the likelihood of the data factorized according to the causal graph. The obtained G-computation formula represents the counterfactual distribution the data would have had if this intervention would have been enforced on the system generating the data. A causal effect of interest can now be defined as some difference between these counterfactual distributions indexed by different interventions. For example, the interventions can represent static treatment regimens or individualized treatment rules that assign treatment in response to time-dependent covariates, and the causal effects could be defined in terms of features of the mean of the treatment-regimen specific counterfactual outcome of interest as a function of the corresponding treatment regimens. Such features could be defined nonparametrically in terms of so called (nonparametric) marginal structural models for static or individualized treatment rules, whose parameters can be thought of as (smooth) summary measures of differences between the treatment regimen specific counterfactual distributions. In this article, we develop a particular targeted maximum likelihood estimator of causal effects of multiple time point interventions. This involves the use of loss-based super-learning to obtain an initial estimate of the unknown factors of the G-computation formula, and subsequently, applying a target-parameter specific optimal fluctuation function (least favorable parametric submodel) to each estimated factor, estimating the fluctuation parameter(s) with maximum likelihood estimation, and iterating this updating step of the initial factor till convergence. This iterative targeted maximum likelihood updating step makes the resulting estimator of the causal effect double robust in the sense that it is

  16. Profile likelihood maps of a 15-dimensional MSSM

    NASA Astrophysics Data System (ADS)

    Strege, C.; Bertone, G.; Besjes, G. J.; Caron, S.; Ruiz de Austri, R.; Strubig, A.; Trotta, R.

    2014-09-01

    We present statistically convergent profile likelihood maps obtained via global fits of a phenomenological Minimal Supersymmetric Standard Model with 15 free parameters (the MSSM-15), based on over 250M points. We derive constraints on the model parameters from direct detection limits on dark matter, the Planck relic density measurement and data from accelerator searches. We provide a detailed analysis of the rich phenomenology of this model, and determine the SUSY mass spectrum and dark matter properties that are preferred by current experimental constraints. We evaluate the impact of the measurement of the anomalous magnetic moment of the muon (g - 2) on our results, and provide an analysis of scenarios in which the lightest neutralino is a subdominant component of the dark matter. The MSSM-15 parameters are relatively weakly constrained by current data sets, with the exception of the parameters related to dark matter phenomenology (M1, M2, μ), which are restricted to the sub-TeV regime, mainly due to the relic density constraint. The mass of the lightest neutralino is found to be < 1.5 TeV at 99% C.L., but can extend up to 3 TeV when excluding the g - 2 constraint from the analysis. Low-mass bino-like neutralinos are strongly favoured, with spin-independent scattering cross-sections extending to very small values, ~10^-20 pb. ATLAS SUSY null searches strongly impact on this mass range, and thus rule out a region of parameter space that is outside the reach of any current or future direct detection experiment. The best-fit point obtained after inclusion of all data corresponds to a squark mass of 2.3 TeV, a gluino mass of 2.1 TeV and a 130 GeV neutralino with a spin-independent cross-section of 2.4 × 10^-10 pb, which is within the reach of future multi-ton scale direct detection experiments and of the upcoming LHC run at increased centre-of-mass energy.

  17. Drilling, logging and preliminary well testing of geothermal well Susan 1, Susanville, Lassen County, California

    SciTech Connect

    McNitt, J.R.; Petersen, C.A.; Sanyal, S.K.

    1981-03-01

    Susan 1, a hot water production well, was drilled late in 1980 for the City of Susanville, California, as part of its geothermal space-heating project. A history of drilling, logging, completion and pump testing of this well is presented. Susan 1 was drilled to 930 feet using local river water with a 17-1/2-inch bit from 50 to 540 feet and a 12-1/2-inch bit from 540 to 927 feet. A 12-3/4-inch solid casing was set from surface to 350 feet, a slotted casing from 350 to 538 feet, and an 8-5/8-inch slotted casing from 520 to 925 feet. Interpretations of the following logs and test data from this well are presented: drilling logs (penetration rate, water loss and gain, return temperatures); formation logs (description of well cuttings, caliper, spontaneous potential, electrical resistivity, gamma ray, neutron); production logs (temperature, spinner); and pump test data.

  18. Exclusion probabilities and likelihood ratios with applications to mixtures.

    PubMed

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model. PMID:26160753

  20. Log ASCII Standard (LAS) Files for Geophysical Wireline Well Logs and Their Application to Geologic Cross Sections Through the Central Appalachian Basin

    USGS Publications Warehouse

    Crangle, Robert D.

    2007-01-01

    Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001, Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
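
    The report converts paper logs to LAS with Neuralog and renders them with GeoGraphix Prizm; purely as an illustration of what an LAS file contains, the sketch below reads one with the open-source Python package lasio, which is not a tool used in the report. The file name and the GR curve mnemonic are assumptions for the example.

      # Illustration only: reading a LAS file with the open-source Python package
      # lasio (not a tool used in the report). The file name and the presence of
      # a "GR" (gamma ray) curve are assumed for the example.
      import lasio

      las = lasio.read("example_well.las")          # parse a LAS 1.2/2.0 file

      # List the curves (e.g., DEPT for depth, GR for gamma ray) with their units.
      for curve in las.curves:
          print(curve.mnemonic, curve.unit, curve.descr)

      # Work with the traces as a pandas DataFrame indexed by depth,
      # e.g., to plot or digitally correlate the gamma ray trace.
      df = las.df()
      print(df["GR"].describe())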

  1. Requirement of copper for 1st-log growth of the yeast Saccharomyces cerevisiae

    SciTech Connect

    Como, S.A.; Valerio, V.; Nickless, S.; Connelly, J.L.

    1986-05-01

    Routine evaluation of the role of copper (Cu) in the growth of various mutants of the yeast Saccharomyces cerevisiae disclosed an unexpected effect of Cu on the fermentative first-log growth. The authors' subsequent studies are attempting to ascertain the nature and significance of this observation. Cells are grown on glucose in a supplemented minimal medium at 29 °C for 48-72 hrs using a New Brunswick incubator shaking at 200 rpm. Cu concentration was varied by addition of Cu salts or bathocuproine disulfonate (BC), a highly specific Cu chelator. Samples were removed periodically from flasks and dry weights were determined. Growth curve plots of normal yeasts grown in the presence of 1 mM to 38 mM Cu showed little variation in the expected 1st log; diauxic lag; 2nd log; stationary phase picture. However, in the presence of BC the growth rate in the 1st log was significantly slowed and, as expected, 2nd log growth was essentially stopped. The low 1st log growth rate could be titrated to normal (+Cu) levels by increments of added Cu but not by added iron. The effect was not seen when Rho-minus strains were used nor when growth was followed under anaerobic conditions. Results to date implicate a mitochondrial protein, oxygen and copper in the 1st log growth of S. cerevisiae. The character of the protein agent and the possible contribution of cytochrome oxidase activity to the 1st log growth are being evaluated.

  2. Performance and Improved Design of the Log Spiral of Revolution Monochromator

    SciTech Connect

    Pease, D. M.; Shanthakumar, P.; Huang, T.; Budnick, J. I.; Odong, O.; Frenkel, A. I.; Abitbol, N.; Balasubramanian, M.; Brewer, D.

    2007-02-02

    We show that, whereas the present log spiral of revolution monochromator works well for Cr edges of 2.8% Cr2O3 in a V2O3 matrix, the device transmits noticeable V extended structure in the case of 0.365% Cr2O3. We demonstrate that the transmitted V extended structure is due to the V Kβ line, which is unresolved by the monochromator. It is suggested that this limitation may be overcome by designing a log spiral detector for the Cr Kβ line rather than the Cr Kα line. Aspects of the design of this modified log spiral are discussed.

  3. Determination of log P values of new cyclen based antimalarial drug leads using RP-HPLC.

    PubMed

    Rudraraju, A V; Amoyaw, P N A; Hubin, T J; Khan, M O F

    2014-09-01

    Lipophilicity, expressed by log P, is an important physicochemical property of drugs that affects many biological processes, including drug absorption and distribution. The main purpose of this study was to determine the log P values of newly discovered drug leads using reversed-phase high-performance liquid chromatography (RP-HPLC). The reference standards, with varying polarity ranges, were dissolved in methanol and analyzed by RP-HPLC using a C18 column. The mobile phase consisted of a mixture of acetonitrile, methanol and water in a gradient elution mode. A calibration curve was plotted between the experimental log P values and the obtained log k values of the reference standard compounds, and a best-fit line was obtained. The log k values of the new drug leads were determined in the same solvent system and were used to calculate the respective log P values from the best-fit equation. The log P vs. log k data gave a best-fit linear curve with an R^2 of 0.9786, with P values of the intercept and slope of 1.19 × 10^-6 and 1.56 × 10^-10, respectively, at the 0.05 level of significance. Log P values of 15 new drug leads and related compounds, all of which are derivatives of macrocyclic polyamines and their metal complexes, were determined. The values obtained are closely related to the calculated log P (Clog P) values obtained using ChemDraw Ultra 12.0. This experiment provided efficient, fast and reasonable estimates of log P values of the new drug leads by using RP-HPLC.
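
    A minimal sketch of the calibration workflow described above, with placeholder numbers rather than the study's data: fit a line to the standards' (log k, log P) pairs, then predict log P for new leads from their measured log k.

      # Sketch of the RP-HPLC log P calibration: regress literature log P of the
      # reference standards on their measured log k, then apply the fitted line
      # to new leads. All numbers are placeholders, not the study's data.
      import numpy as np

      log_k_std = np.array([-0.35, 0.10, 0.48, 0.92, 1.30])   # placeholder retention data
      log_p_std = np.array([ 0.5,  1.4,  2.2,  3.1,  3.9])    # placeholder literature log P

      slope, intercept = np.polyfit(log_k_std, log_p_std, deg=1)
      r2 = np.corrcoef(log_k_std, log_p_std)[0, 1] ** 2
      print(f"log P = {slope:.3f} * log k + {intercept:.3f}  (R^2 = {r2:.4f})")

      # Predict log P for new drug leads from their measured log k.
      log_k_new = np.array([0.25, 0.80])                        # placeholder values
      print("predicted log P:", slope * log_k_new + intercept)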

  4. Selective logging: does the imprint remain on tree structure and composition after 45 years?

    PubMed Central

    Osazuwa-Peters, Oyomoare L.; Chapman, Colin A.; Zanne, Amy E.

    2015-01-01

    Selective logging of tropical forests is increasing in extent and intensity. The duration over which impacts of selective logging persist, however, remains an unresolved question, particularly for African forests. Here, we investigate the extent to which a past selective logging event continues to leave its imprint on different components of an East African forest 45 years later. We inventoried 2358 stems ≥10 cm in diameter in 26 plots (200 m × 10 m) within a 5.2 ha area in Kibale National Park, Uganda, in logged and unlogged forest. In these surveys, we characterized the forest light environment, taxonomic composition, functional trait composition using three traits (wood density, maximum height and maximum diameter) and forest structure based on three measures (stem density, total basal area and total above-ground biomass). In comparison to unlogged forests, selectively logged forest plots in Kibale National Park on average had higher light levels, different structure characterized by lower stem density, lower total basal area and lower above-ground biomass, and a distinct taxonomic composition driven primarily by changes in the relative abundance of species. Conversely, selectively logged forest plots were like unlogged plots in functional composition, having similar community-weighted mean values for wood density, maximum height and maximum diameter. This similarity in functional composition irrespective of logging history may be due to functional recovery of logged forest or background changes in functional attributes of unlogged forest. Despite the passage of 45 years, the legacy of selective logging on the tree community in Kibale National Park is still evident, as indicated by distinct taxonomic and structural composition and reduced carbon storage in logged forest compared with unlogged forest. The effects of selective logging are exerted via influences on tree demography rather than functional trait composition. PMID:27293697

  5. Prognosis of patients with esophageal squamous cell carcinoma after esophagectomy using the log odds of positive lymph nodes.

    PubMed

    Wu, San-Gang; Sun, Jia-Yuan; Yang, Li-Chao; Zhou, Juan; Li, Feng-Yan; Li, Qun; Lin, Huan-Xin; Lin, Qin; He, Zhen-Yu

    2015-11-01

    To compare the log odds of positive lymph nodes (LODDS) with the number of positive lymph nodes (pN), lymph node ratio (LNR), removed lymph node (RLN) count, and negative lymph node (NLN) count in determining the prognosis of patients with esophageal squamous cell carcinoma (ESCC) after esophagectomy. The records of patients with ESCC who received esophagectomy were retrospectively reviewed. The log-rank test was used to compare curves for overall survival (OS), and Cox regression analysis was performed to identify prognostic factors. The prognostic performance of the different lymph node staging systems was compared using the linear trend chi-square test, likelihood ratio chi-square test, and Akaike information criterion. A total of 589 patients were enrolled. Univariate Cox analysis showed that pN stage, LNR, RLN count, NLN count, and the LODDS were significantly associated with OS (p < 0.05 for all). Multivariate Cox analysis adjusted for significant factors indicated that LODDS was an independent risk factor for overall survival (OS), and that a higher LODDS was associated with worse OS (hazard ratio = 3.297, 95% confidence interval: 2.684-4.050, p < 0.001). The modified Tumor-LODDS-Metastasis staging system had better discriminatory ability, monotonicity, and homogeneity, and better prognostic stratification than the Tumor-Node-Metastasis staging system in determining the prognosis of patients with ESCC. The LODDS staging system was superior to other lymph node classifications in determining the prognosis of patients with ESCC after esophagectomy. LODDS may be incorporated into the esophageal staging system if these results are confirmed by other studies.
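
    The abstract does not state the formula used; assuming the commonly used definition LODDS = log((pN + 0.5)/(NLN + 0.5)), with 0.5 added to guard against zero counts, a minimal sketch is:

      # Sketch of the LODDS calculation under the commonly used definition
      #   LODDS = log((pN + 0.5) / (NLN + 0.5)),
      # where pN is the number of positive and NLN the number of negative nodes.
      # The abstract does not state the exact formula used in this study.
      import math

      def lodds(positive_nodes: int, removed_nodes: int) -> float:
          negative_nodes = removed_nodes - positive_nodes
          return math.log((positive_nodes + 0.5) / (negative_nodes + 0.5))

      # Example: 3 positive nodes out of 20 removed.
      print(round(lodds(3, 20), 3))      # prints -1.609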

  6. Discovering the Local Landscape: Pioneer Log Buildings.

    ERIC Educational Resources Information Center

    Douglas, Bob; And Others

    Building structures made from logs appeared in the eastern United States during the late 17th century, and immigrants from Sweden, Finland, and Germany are credited with their construction. There were two types of structures: the horizontal design introduced by the Scandinavians and the German or Pennsylvania Dutch model that was used by the…

  7. [Human development and log-periodic law].

    PubMed

    Cash, Roland; Chaline, Jean; Nottale, Laurent; Grou, Pierre

    2002-05-01

    We suggest applying the log-periodic law, formerly used to describe various crisis phenomena in biology (evolutionary leaps), inorganic systems (earthquakes), and societies and economies (economic crises, market crashes), to the various steps of human ontogeny. We find a statistically significant agreement between this model and the data.

  8. Modelling tropical forests response to logging

    NASA Astrophysics Data System (ADS)

    Cazzolla Gatti, Roberto; Di Paola, Arianna; Valentini, Riccardo; Paparella, Francesco

    2013-04-01

    Tropical rainforests are among the ecosystems most threatened by large-scale fragmentation due to human activity such as heavy logging and agricultural clearance, even though they provide crucial ecosystem goods and services, such as sequestering carbon from the atmosphere, protecting watersheds and conserving biodiversity. In several countries forest resource extraction has shifted from clearcutting to selective logging in order to maintain a significant forest cover and stock of living biomass. However, knowledge of the short- and long-term effects of removing selected species from tropical rainforest is scarce and needs further investigation. One of the main effects of selective logging on forest dynamics appears to be local disturbance, in which open space is invaded by weeds, vines and climbers at the expense of the late-successional cenosis. We present a simple deterministic model that describes the dynamics of a tropical rainforest subject to selective logging, in order to understand how and why weeds displace native species. We argue that the selective removal of the tallest tropical trees opens gaps of light that allow weeds, vines and climbers to prevail over native species, inhibiting recovery of the original vegetation. Our results show that different regime shifts may occur depending on the type of forest management adopted. This hypothesis is supported by a dataset of tree heights and weed/vine cover that we collected from 9 plots located in Central and West Africa, in both untouched and managed areas.
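
    The abstract does not give the model equations; as a generic illustration of the kind of deterministic regime-shift behaviour described, the sketch below uses a Lotka-Volterra-style competition between late-successional trees and weeds with a logging mortality term. It is not the authors' model, and all parameters are invented.

      # Hedged illustration only: generic competition between late-successional
      # trees (T) and weeds/vines (W), with a logging mortality rate h on trees.
      # Increasing h can shift the system from tree dominance to weed dominance.
      import numpy as np
      from scipy.integrate import solve_ivp

      def forest(t, y, h):
          T, W = y
          dT = 0.5 * T * (1 - T - 0.8 * W) - h * T      # trees: growth, competition, logging
          dW = 0.8 * W * (1 - W - 0.9 * T)              # weeds: favoured in canopy gaps
          return [dT, dW]

      for h in (0.0, 0.15, 0.35):                       # increasing logging intensity
          sol = solve_ivp(forest, (0, 200), [0.6, 0.05], args=(h,), rtol=1e-8)
          T_end, W_end = sol.y[:, -1]
          print(f"h = {h:.2f}: trees -> {T_end:.2f}, weeds -> {W_end:.2f}")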

  9. There's Life in Those Dead Logs!

    ERIC Educational Resources Information Center

    Biggs, Devin; Miller, Todd; Hall, Dee

    2006-01-01

    Although it is unspectacular in appearance, dead wood is one of the most ecologically important resources in forests. Fallen logs, dead standing trees, stumps, and even cavities in live trees fulfill a wide range of roles. Prominent among these is that they provide habitat for many organisms, especially insects. Fourth-grade students at Fox…

  10. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... statement of any corrective action taken. (5) Entries must be made giving details of all work performed... written log signed by the operator who supervised or performed the work and, unless the operator is... made comparing the radio station clock with standard time, including errors observed and...

  11. 47 CFR 80.409 - Station logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... statement of any corrective action taken. (5) Entries must be made giving details of all work performed... written log signed by the operator who supervised or performed the work and, unless the operator is... made comparing the radio station clock with standard time, including errors observed and...

  12. Dissociation behavior of Methane Hydrate presumed by NMR log analysis

    NASA Astrophysics Data System (ADS)

    Kotera, T.

    2015-12-01

    This study was conducted with the aim of understanding the dissociation behavior of MH. The production test was operated at the Daini-Atsumi Knoll in the eastern Nankai Trough, Japan. We compared NMR log data acquired from three wells, drilled before the test (AT1-MC) and after the test (AT1-LWD1 and AT1-LWD2). NMR logging measures the T2 relaxation time and calculates porosity and permeability. We focused especially on the T2 distribution. MH is assumed to occupy the larger pore spaces in the sandy sediment, because the T2 distribution in the MH-bearing layer at AT1-MC has no peak at longer relaxation times. However, a T2 peak appeared above 33 ms at AT1-LWD1 and AT1-LWD2. This phenomenon is observed in the thin MH-bearing layers rather than the thick one. In the thick MH-bearing layer, no T2 peak appeared above 33 ms, but the T2 distribution shifted toward longer relaxation times within the short-time interval. Hence, it is assumed that the MH dissociated from its contact faces with the grains. In terms of thermal conductivity, regions near grain-grain contacts should dissociate more readily than the MH-grain contacts; however, both dissociation zones are essentially MH-grain contact faces. Little or no MH was observed in the muddy layers during the coring campaign near these wells. As described above, NMR logging detected various changes in the T2 distribution that seem to indicate dissociation of MH, and these data are consistent with other log and monitoring data, such as resistivity and temperature measurements. However, because the logging data were acquired at different locations, it is possible that the initial conditions were not identical. This research was conducted as a part of the MH21 research program, and the authors would like to express their sincere appreciation to MH21 and the Ministry of Economy, Trade and Industry for providing the permission to disclose this research.

  13. 47 CFR 73.782 - Retention of logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Retention of logs. 73.782 Section 73.782... International Broadcast Stations § 73.782 Retention of logs. Logs of international broadcast stations shall be retained by the licensee or permittee for a period of two years: Provided, however, That logs...

  14. 32 CFR 700.846 - Status of logs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Status of logs. 700.846 Section 700.846 National Defense Department of Defense (Continued) DEPARTMENT OF THE NAVY UNITED STATES NAVY REGULATIONS AND... Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...

  15. 47 CFR 73.877 - Station logs for LPFM stations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Station logs for LPFM stations. 73.877 Section... BROADCAST SERVICES Low Power FM Broadcast Stations (LPFM) § 73.877 Station logs for LPFM stations. The licensee of each LPFM station must maintain a station log. Each log entry must include the time and date...

  16. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of navigation, parties engaged in handling logs upon the river shall have the right to sluice, drive, and float logs in such manner as may best suit their convenience: Provided, A sufficient channel is... force of men must accompany each log drive to prevent the formation of log jams and to maintain an...

  17. 1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING CEDAR LOGS FROM TRUCKS AT LOG DUMP, ADJACENT TO MILL; TRUCKS FORMERLY USED TRIP STAKES, THOUGH FOR SAFER HANDLING OF LOGS WELDED STAKES ARE NOW REQUIRED; AS A RESULT LOADING IS NOW DONE WITH A CRANE - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR

  18. Using the jackknife for estimation in log link Bernoulli regression models.

    PubMed

    Lipsitz, Stuart R; Fitzmaurice, Garrett M; Arriaga, Alex; Sinha, Debajyoti; Gawande, Atul A

    2015-02-10

    Bernoulli (or binomial) regression using a generalized linear model with a log link function, where the exponentiated regression parameters have interpretation as relative risks, is often more appropriate than logistic regression for prospective studies with common outcomes. In particular, many researchers regard relative risks to be more intuitively interpretable than odds ratios. However, for the log link, when the outcome is very prevalent, the likelihood may not have a unique maximum. To circumvent this problem, a 'COPY method' has been proposed, which is equivalent to creating for each subject an additional observation with the same covariates except the response variable has the outcome values interchanged (1's changed to 0's and 0's changed to 1's). The original response is given weight close to 1, while the new observation is given a positive weight close to 0; this approach always leads to convergence of the maximum likelihood algorithm, except for problems with convergence due to multicollinearity among covariates. Even though this method produces a unique maximum, when the outcome is very prevalent, and/or the sample size is relatively small, the COPY method can yield biased estimates. Here, we propose using the jackknife as a bias-reduction approach for the COPY method. The proposed method is motivated by a study of patients undergoing colorectal cancer surgery.
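
    A hedged sketch of the COPY-style augmentation for a log-link Bernoulli model, fitted with a small hand-written weighted Fisher-scoring routine on simulated data; this illustrates the augmentation idea only and is not the authors' jackknife estimator or any particular package's implementation.

      # COPY-style augmentation for a log-link Bernoulli (log-binomial) model,
      # fitted by weighted Fisher scoring written from scratch. Illustrative only.
      import numpy as np

      def fit_log_binomial(X, y, w, n_iter=100):
          """Weighted Fisher scoring for P(Y=1|x) = exp(x @ beta)."""
          beta = np.zeros(X.shape[1])
          beta[0] = np.log(np.average(y, weights=w)) - 1.0   # start well inside p < 1
          for _ in range(n_iter):
              eta = X @ beta
              p = np.exp(eta)
              grad = X.T @ (w * (y - (1 - y) * p / (1 - p)))
              info = (X * (w * p / (1 - p))[:, None]).T @ X
              step = np.linalg.solve(info, grad)
              # Step-halve until all fitted probabilities stay below 1.
              t = 1.0
              while np.any(X @ (beta + t * step) >= 0):
                  t /= 2.0
              beta = beta + t * step
          return beta

      rng = np.random.default_rng(2)
      n = 500
      X = np.column_stack([np.ones(n), rng.binomial(1, 0.5, n)])
      p_true = np.exp(X @ np.array([-0.4, 0.3]))            # common outcome, RR = exp(0.3)
      y = rng.binomial(1, p_true)

      # COPY augmentation: duplicate each row with the outcome flipped, giving
      # the original row weight near 1 and the copy weight near 0.
      eps = 1e-3
      X_aug = np.vstack([X, X])
      y_aug = np.concatenate([y, 1 - y])
      w_aug = np.concatenate([np.full(n, 1 - eps), np.full(n, eps)])

      beta_hat = fit_log_binomial(X_aug, y_aug, w_aug)
      print("estimated log relative risk:", round(beta_hat[1], 3))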

  19. Preliminary report on geophysical well-logging activity on the Salton Sea Scientific Drilling Project, Imperial Valley, California

    USGS Publications Warehouse

    Paillet, Frederick L.; Morin, R.H.; Hodges, H.E.

    1986-01-01

    The Salton Sea Scientific Drilling Project has culminated in a 10,564-ft deep test well, State 2-14 well, in the Imperial Valley of southern California. A comprehensive scientific program of drilling, coring, and downhole measurements, which was conducted for about 5 months, has obtained much scientific information concerning the physical and chemical processes associated with an active hydrothermal system. This report primarily focuses on the geophysical logging activities at the State 2-14 well and provides early dissemination of geophysical data to other investigators working on complementary studies. Geophysical-log data were obtained by a commercial logging company and by the U.S. Geological Survey (USGS). Most of the commercial logs were obtained during three visits to the site; only one commercial log was obtained below a depth of 6,000 ft. The commercial logs obtained were dual induction, natural gamma, compensated neutron formation density, caliper and sonic. The USGS logging effort consisted of four primary periods, with many logs extending below a depth of 6,000 ft. The USGS logs obtained were temperature, caliper, natural gamma, gamma spectral, epithermal neutron, acoustic velocity, full-waveform, and acoustic televiewer. Various problems occurred throughout the drilling phase of the Salton Sea Scientific Drilling Project that made successful logging difficult: (1) borehole constrictions, possibly resulting from mud coagulation, (2) maximum temperatures of about 300 C, and (3) borehole conditions unfavorable for logging because of numerous zones of fluid loss, cement plugs, and damage caused by repeated trips in and out of the hole. These factors hampered and compromised logging quality at several open-hole intervals. The quality of the logs was dependent on the degree of probe sophistication and sensitivity to borehole-wall conditions. Digitized logs presented were processed on site and are presented in increments of 1,000 ft. A summary of the numerous

  20. Requirements-Driven Log Analysis Extended Abstract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2012-01-01

    Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than a full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
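
    A minimal sketch of the kind of offline log analysis described: events as records (field name to value) scanned against a requirement. The event fields and the property checked ("every opened resource is eventually closed") are invented for illustration.

      # Minimal sketch of checking a log offline against a requirement: each event
      # is a record (field name -> value) and a checker scans the sequence. The
      # fields and the property are invented examples, not a real mission format.
      def check_open_close(log):
          """Return the set of resources opened but never closed."""
          open_resources = set()
          for event in log:
              if event["op"] == "open":
                  open_resources.add(event["resource"])
              elif event["op"] == "close":
                  open_resources.discard(event["resource"])
          return open_resources

      log = [
          {"time": 1, "op": "open",  "resource": "camera"},
          {"time": 2, "op": "open",  "resource": "radio"},
          {"time": 3, "op": "close", "resource": "camera"},
      ]

      print("never closed:", check_open_close(log))      # {'radio'}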

  1. Improved production log interpretation in horizontal wells using pulsed neutron logs

    SciTech Connect

    Brady, J.L.; Kohring, J.J.; North, R.J.

    1996-12-31

    Production log flow profiles provide a valuable tool to evaluate well and reservoir performance. Horizontal wellbores and their associated completion designs present several challenges to profile interpretation for conventional production logging sensors and techniques. A unique approach combining pulsed neutron capture (PNC) log data with conventional production logging measurements is providing improved flow profile answers in slotted liner, horizontal well completions on the North Slope of Alaska. Identifying and eliminating undesirable gas production is one of the chief goals of production logging on the North Slope. This process becomes difficult in horizontal wellbores as fluid segregation affects the area investigated by the various logging sensors and also the velocities of the individual phases. Typical slotted liner completions further complicate analysis as fluids are able to flow in the liner/openhole annulus. Analysis of PNC log data provides two good qualitative indicators of formation permeability. The first technique is derived from the difference of the formation sigma response before and after injecting a high-capture cross-section borax solution. The second technique uses the difference of the formation sigma response and the formation porosity measured while injecting the formation with crude or seawater. Further analysis of PNC log runs show that the two techniques closely correlate with production flow profiles under solution gas-oil ratio (GOR) conditions. These two techniques in combination with conventional production logging measurements of temperature, capacitance, pressure, and spinner improve flow profile results. PNC results can be combined with temperature and pressure data in the absence of valid spinner data to provide an approximate flow profile. These techniques have been used to successfully determine profiles in both cemented and slotted liner completions with GORs in excess of 15,000 scf/bbl.

  2. A distributed design for monitoring, logging, and replaying device readings at LAMPF

    SciTech Connect

    Burns, M.

    1991-01-01

    As control of the Los Alamos Meson Physics linear accelerator and Proton Storage Ring moves to a more distributed system, it has been necessary to redesign the software which monitors, logs, and replays device readings throughout the facility. The new design allows devices to be monitored and their readings logged locally on a network of computers. Control of the monitoring and logging process is available throughout the network from user interfaces which communicate via remote procedure calls with server processes running on each node which monitors and records device readings. Similarly, the logged data can be replayed from anywhere on the network. Two major requirements influencing the final design were the need to reduce the load on the CPU of the control machines, and the need for much faster replay of the logged device readings. 1 ref., 2 figs.

  3. Coal-log pipeline system development. Fifth quarterly report, August 25, 1991--November 25, 1991

    SciTech Connect

    Liu, H.

    1991-12-01

    Project tasks include: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, grantee shall perform an information search and make full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal-logs, and tests the logs produced. (3) Conduct economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. (VC)

  4. Predicting hospital visits from geo-tagged Internet search logs.

    PubMed

    Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H

    2016-01-01

    The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user's future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources-a crucial prerequisite for securing healthcare access for everyone in the days to come. PMID:27570641
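
    The abstract does not specify the learner or the exact features beyond "text and location patterns", so the following is only a minimal sketch, assuming hypothetical feature names and using logistic regression purely for illustration:

    ```python
    # Hypothetical sketch: predict the probability of a future facility visit from
    # features mined from geo-tagged search logs. The features and model choice are
    # illustrative assumptions, not those reported in the paper.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row is one user: [healthcare-intent query count,
    #                        searches issued near a medical facility,
    #                        days since the last such search]
    X = np.array([[5, 2, 3],
                  [0, 0, 60],
                  [9, 4, 1],
                  [1, 0, 30]])
    y = np.array([1, 0, 1, 0])  # 1 = visited a medical facility in the follow-up window

    model = LogisticRegression().fit(X, y)
    print(model.predict_proba([[3, 1, 7]])[:, 1])  # estimated visit probability
    ```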

  5. Predicting hospital visits from geo-tagged Internet search logs.

    PubMed

    Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H

    2016-01-01

    The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user's future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources-a crucial prerequisite for securing healthcare access for everyone in the days to come.

  6. Predicting hospital visits from geo-tagged Internet search logs

    PubMed Central

    Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H.

    2016-01-01

    The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user’s future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources—a crucial prerequisite for securing healthcare access for everyone in the days to come. PMID:27570641

  7. Ergonomic initiatives for machine operators by the Swedish logging industry.

    PubMed

    Synwoldt, Uwe; Gellerstedt, Sten

    2003-03-01

    In 1994, the Swedish Work Environment Authority (SWEA) considered regulating the number of working hours in a logging machine in order to force an increased use of job rotation. Occupational neck and shoulder disorders had been threatening machine operators' health ever since the late 1970s. Representatives of the logging industry argued that detailed regulations would not solve the problem. SWEA agreed to shelve the matter for 2 years and the industry promised to take the necessary measures. In 1996, the Labour Inspectorate investigated the industry's ergonomic initiatives. They found that awareness in combating health problems had increased. However, there was a gap between awareness and the ability to carry out improvements. In 1999, SWEA decided not to regulate working hours but strongly recommended that work teams use job rotation. A minor follow-up in the year 2000 found work teams with both high production and low health risk, but also more specialised teams.

  8. Neutron and gamma (density) logging in welded tuff

    SciTech Connect

    Lin, W

    1998-09-12

    This Technical Implementation Procedure (TIP) describes the field operation and the management of data records pertaining to neutron logging and density logging in welded tuff. This procedure applies to all borehole surveys performed in support of Engineered Barrier System Field Tests (EBSFT), including the Large Block Tests (LBT) and Initial Engineered Barrier System Field Tests (IEBSFT) - WBS 1.2.3.12.4. The purpose of this TIP is to provide guidelines so that other equally trained and qualified personnel can understand how the work is performed or how to repeat the work if needed. The work will be documented by the use of Scientific Notebooks (SNs) as discussed in 033-YMP-QP 3.4. The TIP will provide a set of guidelines which the scientists will take into account in conducting the measurements. The use of this TIP does not imply that this is repetitive work that does not require professional judgment.

  9. Potential evapotranspiration and the likelihood of future drought

    NASA Astrophysics Data System (ADS)

    Rind, D.; Goldberg, R.; Hansen, J.; Rosenzweig, C.; Ruedy, R.

    1990-06-01

    The likelihood of future drought is studied on the basis of two drought indices calculated from the Goddard Institute for Space Studies general circulation model (GISS GCM) transient and doubled CO2 climate changes. We use the Palmer drought severity index (PDSI) and a new supply-demand index (SDDI), the latter being the difference between the precipitation and potential evapotranspiration Ep, i.e., the difference between atmospheric supply of and demand for moisture. Both indices show increasing drought for the United States during the next century, with effects becoming apparent in the 1990s. If greenhouse gas emissions continue to increase rapidly, the model results suggest that severe drought (5% frequency today) will occur about 50% of the time by the 2050s. The results are driven by the large increase in Ep associated with the simulated climate warming. Ep increases most where the temperature is highest, at low to mid-latitudes, while precipitation increases most where the air is coolest and easiest to saturate by the additional moisture, at higher latitudes. Thus atmospheric demand becomes greater than supply for most of the globe by the latter half of the next century. We show that large Ep changes can lead to soil moisture deficits, as in the PDSI hydrologic budget, and vegetation desiccation, as is implied by vegetation and climate models. We suggest that drought intensification has been understated in most GCM simulations, including the GISS GCM, because of their lack of realistic land surface models. Paleoclimate analogues in the Mesozoic and Cenozoic are reviewed which imply that arid conditions can result from either increased temperatures or decreased precipitation, consistent with our use of the SDDI. The results depend primarily on the temperature increase, in particular the model sensitivity of 4°C warming for doubled CO2. Global precipitation cannot keep pace with increased demand over land because the land surface warms more than the ocean surface
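
    As a schematic rendering of the supply-demand index described above (the subtraction is stated in the abstract; the standardization shown here is a common convention and an assumption on our part):

    ```latex
    % Supply-demand difference underlying the SDDI: precipitation (supply) minus
    % potential evapotranspiration (demand); the index is its standardized anomaly.
    \[
      D(t) = P(t) - E_p(t), \qquad
      \mathrm{SDDI}(t) = \frac{D(t) - \overline{D}}{\sigma_D}
    \]
    ```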

  10. A likelihood approach to calculating risk support intervals

    SciTech Connect

    Leal, S.M.; Ott, J.

    1994-05-01

    Genetic risks are usually computed under the assumption that genetic parameters, such as the recombination fraction, are known without error. Uncertainty in the estimates of these parameters must translate into uncertainty regarding the risk. To allow for uncertainties in parameter values, one may employ Bayesian techniques or, in a maximum-likelihood framework, construct a support interval (SI) for the risk. Here the authors have implemented the latter approach. The SI for the risk is based on the SIs of parameters involved in the pedigree likelihood. As an empirical example, the SI for the risk was calculated for probands who are members of chronic spinal muscular atrophy kindreds. In order to evaluate the accuracy of a risk in genetic counseling situations, the authors advocate that, in addition to a point estimate, an SI for the risk should be calculated. 16 refs., 1 fig., 1 tab.
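
    A minimal sketch of the kind of support interval meant here: the set of parameter values whose log10 likelihood lies within m units (commonly m = 1) of the maximum. The pedigree likelihood is replaced below by a toy binomial likelihood for a recombination fraction, so the numbers are purely illustrative:

    ```python
    # Toy m-unit support interval for a recombination fraction theta, using a
    # binomial likelihood (k recombinants out of n informative meioses) as a
    # stand-in for a full pedigree likelihood.
    import numpy as np

    k, n, m = 3, 20, 1.0  # recombinants, meioses, support width in log10 units
    theta = np.linspace(1e-4, 0.5, 5000)
    log10_lik = k * np.log10(theta) + (n - k) * np.log10(1.0 - theta)
    inside = log10_lik >= log10_lik.max() - m
    print("support interval for theta: [%.3f, %.3f]"
          % (theta[inside][0], theta[inside][-1]))
    ```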

  11. The likelihood term in restoration of transform-compressed imagery

    NASA Astrophysics Data System (ADS)

    Robertson, Mark A.

    2004-05-01

    Compression of imagery by quantization of the data's transform coefficients introduces an error in the imagery upon decompression. When processing compressed imagery, often a likelihood term is used to provide a statistical description of how the observed data are related to the original noise-free data. This work derives the statistical relationship between compressed imagery and the original imagery, which is found to be embodied in a (in general) non-diagonal covariance matrix. Although the derivations are valid for transform coding in general, the work is motivated by considering examples for the specific cases of compression using the discrete cosine transform and the discrete wavelet transform. An example application of motion-compensated temporal filtering is provided to show how the presented likelihood term might be used in a restoration scenario.
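
    A small sketch of why the image-domain covariance is generally non-diagonal, assuming an orthonormal block transform and independent uniform quantization errors on the coefficients (standard modelling assumptions, not details taken from the paper):

    ```python
    # Model the coefficient-domain quantization error as independent uniform noise
    # with variance q_i^2/12; after the inverse transform the image-domain error
    # covariance T^T D T is generally non-diagonal.
    import numpy as np

    N = 8
    n = np.arange(N)
    # Orthonormal DCT-II matrix (rows are basis vectors).
    T = np.sqrt(2.0 / N) * np.cos(np.pi * (n[:, None] + 0.5) * n[None, :] / N).T
    T[0, :] /= np.sqrt(2.0)

    q = np.linspace(4, 32, N)        # illustrative per-coefficient quantization steps
    D = np.diag(q ** 2 / 12.0)       # coefficient-domain error covariance
    C = T.T @ D @ T                  # image-domain covariance (T orthonormal)
    print(np.round(C[:3, :3], 2))    # off-diagonal entries are non-zero
    ```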

  12. Maximum-likelihood registration of range images with missing data.

    PubMed

    Sharp, Gregory C; Lee, Sang W; Wehe, David K

    2008-01-01

    Missing data are common in range images, due to geometric occlusions, limitations in the sensor field of view, poor reflectivity, depth discontinuities, and cast shadows. Using registration to align these data often fails, because points without valid correspondences can be incorrectly matched. This paper presents a maximum likelihood method for registration of scenes with unmatched or missing data. Using ray casting, correspondences are formed between valid and missing points in each view. These correspondences are used to classify points by their visibility properties, including occlusions, field of view, and shadow regions. The likelihood of each point match is then determined using statistical properties of the sensor, such as noise and outlier distributions. Experiments demonstrate high rates of convergence on complex scenes with varying degrees of overlap. PMID:18000329
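
    The paper's calibrated sensor model is not reproduced in the abstract; the sketch below only illustrates the generic ingredient it describes, a per-correspondence likelihood mixing a Gaussian inlier term with a uniform outlier term (all numbers are placeholders):

    ```python
    # Generic per-correspondence log-likelihood for point-to-match residuals:
    # Gaussian inlier model mixed with a uniform outlier component.
    import numpy as np

    def match_log_likelihood(residuals, sigma=0.005, outlier_rate=0.1, work_volume=1.0):
        """residuals: point-to-correspondence distances (metres)."""
        inlier = (1.0 - outlier_rate) * np.exp(-0.5 * (residuals / sigma) ** 2) \
                 / (np.sqrt(2.0 * np.pi) * sigma)
        outlier = outlier_rate / work_volume
        return np.sum(np.log(inlier + outlier))

    print(match_log_likelihood(np.array([0.001, 0.004, 0.2])))
    ```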

  13. A composite likelihood approach for spatially correlated survival data.

    PubMed

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450
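
    A minimal sketch of a pairwise composite log-likelihood under the FGM copula, with the dependence parameter shrinking with geographic distance. Censoring, which the paper handles, is omitted, and the exponential margins and distance-decay link are illustrative assumptions rather than the authors' model:

    ```python
    # Pairwise composite log-likelihood with FGM copula density
    # c(u, v; theta) = 1 + theta * (1 - 2u) * (1 - 2v), |theta| <= 1.
    import numpy as np

    def fgm_pairwise_cl(times, coords, rate, alpha):
        """times: event times; coords: (n, 2) locations; rate: exponential hazard;
        alpha: decay of dependence with distance."""
        F = 1.0 - np.exp(-rate * times)        # exponential CDF (margins)
        logf = np.log(rate) - rate * times     # exponential log-density
        cl = 0.0
        for i in range(len(times)):
            for j in range(i + 1, len(times)):
                d = np.linalg.norm(coords[i] - coords[j])
                theta = np.exp(-alpha * d)     # dependence decays with distance
                c = 1.0 + theta * (1.0 - 2.0 * F[i]) * (1.0 - 2.0 * F[j])
                cl += logf[i] + logf[j] + np.log(c)
        return cl

    rng = np.random.default_rng(0)
    t = rng.exponential(scale=2.0, size=20)
    xy = rng.uniform(size=(20, 2))
    print(fgm_pairwise_cl(t, xy, rate=0.5, alpha=1.0))
    ```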

  14. Skewness for Maximum Likelihood Estimators of the Negative Binomial Distribution

    SciTech Connect

    Bowman, Kimiko o

    2007-01-01

    The probability generating function of one version of the negative binomial distribution being (p + 1 - pt)^{-k}, we study elements of the Hessian and in particular Fisher's discovery of a series form for the variance of k, the maximum likelihood estimator, and also for the determinant of the Hessian. There is a link with the Psi function and its derivatives. Basic algebra is excessively complicated and a Maple code implementation is an important task in the solution process. Low order maximum likelihood moments are given and also Fisher's examples relating to data associated with ticks on sheep. Efficiency of moment estimators is mentioned, including the concept of joint efficiency. In an Addendum we give an interesting formula for the difference of two Psi functions.
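
    The score equation behind the maximum likelihood estimate of k involves the Psi (digamma) function mentioned above; with the mean profiled out at the sample mean it reduces to a one-dimensional root-finding problem. The data below are illustrative counts, not Fisher's sheep-tick data:

    ```python
    # ML estimate of the negative binomial shape k: solve
    #   sum_i [psi(x_i + k) - psi(k)] + n * log(k / (k + xbar)) = 0
    import numpy as np
    from scipy.special import digamma
    from scipy.optimize import brentq

    x = np.array([0, 1, 1, 2, 3, 3, 4, 7, 9, 12])   # illustrative overdispersed counts
    n, xbar = len(x), x.mean()

    def score(k):
        return np.sum(digamma(x + k) - digamma(k)) + n * np.log(k / (k + xbar))

    k_hat = brentq(score, 0.01, 100.0)
    print("ML estimate of k: %.3f" % k_hat)
    ```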

  15. A composite likelihood approach for spatially correlated survival data.

    PubMed

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory.

  16. A composite likelihood approach for spatially correlated survival data

    PubMed Central

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450

  17. Semidefinite Programming for Approximate Maximum Likelihood Sinusoidal Parameter Estimation

    NASA Astrophysics Data System (ADS)

    Lui, Kenneth W. K.; So, H. C.

    2009-12-01

    We study the convex optimization approach for parameter estimation of several sinusoidal models, namely, single complex/real tone, multiple complex sinusoids, and single two-dimensional complex tone, in the presence of additive Gaussian noise. The major difficulty for optimally determining the parameters is that the corresponding maximum likelihood (ML) estimators involve finding the global minimum or maximum of multimodal cost functions because the frequencies are nonlinear in the observed signals. By relaxing the nonconvex ML formulations using semidefinite programs, high-fidelity approximate solutions are obtained in a globally optimum fashion. Computer simulations are included to contrast the estimation performance of the proposed semi-definite relaxation methods with the iterative quadratic maximum likelihood technique as well as Cramér-Rao lower bound.

  18. Chiral logs and the quenched approximation on the lattice

    NASA Astrophysics Data System (ADS)

    Perrucci, Stefania

    1997-12-01

    The phenomenological success of the quenched approximation has been impressive in obtaining accurate results for the mass spectrum and other aspects of hadron structure. However, an independent theoretical estimate of the systematic error introduced by quenching is still missing. To this end, it can be useful to consider chiral theories which, at the one-loop level, introduce peculiar non-analytic terms (the so-called chiral logs). Sharpe showed that when this framework is adapted to the quenched approximation, the dependence of the pion mass squared on the quark mass can be described by an anomalous power δ associated with flavor-singlet loops (the 'hairpin' diagram on the pion mass shell) and enhanced chiral logs. Treating the hairpin as a momentum-independent mass insertion, he estimated δ≃0.2. However, numerical calculations in quenched lattice quantum chromodynamics show little or no evidence for chiral logs at such a level. The following work determines the anomalous power numerically by studying the pion mass as a function of the bare quark mass, as well as its volume dependence. Sources of systematic error are carefully examined. Last, the coefficient of the chiral log is calculated from the two-quark-loop pion propagator. The results consistently indicate a value for the anomalous power that is approximately one order of magnitude smaller than the earlier theoretical estimate, in particular δ = 0.013(2). From this one can see that for all pion masses considered, the systematic error introduced by quenching is small and always within the statistical error. Finally, by a direct calculation of the topological susceptibility of the lattice configurations, I conclude that the reason why the anomalous power is so small is a strong momentum dependence giving rise to a suppression of the hairpin at momenta comparable with the pion mass.
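
    Schematically, the anomalous power enters through the quark-mass dependence of the squared pion mass (this is the standard quenched-chiral-log form; the thesis's own conventions may differ in detail):

    ```latex
    % Quenched chiral log: delta is read off from the slope of ln(m_pi^2/m_q) vs ln(m_q).
    \[
      m_\pi^2 \;\propto\; m_q^{\,1/(1+\delta)}
      \quad\Longleftrightarrow\quad
      \ln\frac{m_\pi^2}{m_q} \;=\; \text{const} \;-\; \frac{\delta}{1+\delta}\,\ln m_q .
    \]
    ```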

  19. Maximum likelihood synchronizer for binary overlapping PCM/NRZ signals.

    NASA Technical Reports Server (NTRS)

    Wang, C. D.; Noack, T. L.; Morris, J. F.

    1973-01-01

    A maximum likelihood parameter estimation technique for the self bit synchronization problem is investigated. The input sequence to the bit synchronizer is a sequence of binary overlapping PCM/NRZ signal in the presence of white Gaussian noise with zero mean and known variance. The resulting synchronizer consists of matched filters, a transition device and a weighting function. Finally, the performance is examined by Monte Carlo simulations.

  20. Selecting Aquifer Wells for Planned Gyroscopic Logging

    SciTech Connect

    Rohe, Michael James; Studley, Gregory Wayne

    2002-04-01

    Understanding the configuration of the eastern Snake River Plain aquifer's water table is made difficult, in part, due to borehole deviation in aquifer wells. A borehole has deviation if it is not vertical or straight. Deviation impairs the analysis of water table elevation measurements because it results in measurements that are greater than the true distance from the top of the well to the water table. Conceptual models of the water table configuration are important to environmental management decision-making at the INEEL; these models are based on measurements of depth to the water table taken from aquifer wells at or near the INEEL. When accurate data on the amount of deviation in any given borehole are acquired, then measurements of depth-to-water can be adjusted to reflect the true depth so more accurate conceptual models can be developed. Collection of additional borehole deviation data with gyroscopic logging is planned for selected wells to further our confidence in the quality of water level measurements. Selection of wells for the planned logging is based on qualitative and quantitative screening criteria. An existing data set from magnetic deviation logs was useful in establishing these criteria; magnetic deviation logs, however, are considered less accurate than gyroscopic deviation logs under certain conditions. Population distributions for 128 aquifer wells with magnetic deviation data were used to establish three quantitative screening thresholds. Qualitative criteria consisted of administrative controls, accessibility issues, and drilling methods. Qualitative criteria eliminated all but 116 of the 337 aquifer wells, in the vicinity of the INEEL, that were initially examined in this screening effort. Of these, 72 have associated magnetic deviation data; 44 do not. Twenty-five (25) of the 72 wells with magnetic deviation data have deviation greater than one of the three quantitative screening thresholds. These 25 are recommended for the planned gyroscopic borehole deviation

  1. Maximum likelihood as a common computational framework in tomotherapy.

    PubMed

    Olivera, G H; Shepard, D M; Reckwerdt, P J; Ruchala, K; Zachman, J; Fitchard, E E; Mackie, T R

    1998-11-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. PMID:9832016
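
    One standard form of the estimator being referred to is the ML-EM iteration from emission tomography, written here generically for a Poisson model y ~ Poisson(Ax); the tomotherapy-specific system matrices are not given in the abstract:

    ```latex
    % ML-EM update: each element is rescaled by a backprojected ratio of measured
    % to predicted data, which preserves non-negativity.
    \[
      x_j^{(n+1)} \;=\; \frac{x_j^{(n)}}{\sum_i a_{ij}}
      \sum_i a_{ij}\,\frac{y_i}{\sum_k a_{ik}\, x_k^{(n)}}
    \]
    ```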

  2. Maximum likelihood tuning of a vehicle motion filter

    NASA Technical Reports Server (NTRS)

    Trankle, Thomas L.; Rabin, Uri H.

    1990-01-01

    This paper describes the use of maximum likelihood parameter estimation of unknown parameters appearing in a nonlinear vehicle motion filter. The filter uses the kinematic equations of motion of a rigid body in motion over a spherical earth. The nine states of the filter represent vehicle velocity, attitude, and position. The inputs to the filter are three components of translational acceleration and three components of angular rate. Measurements used to update states include air data, altitude, position, and attitude. Expressions are derived for the elements of filter matrices needed to use air data in a body-fixed frame with filter states expressed in a geographic frame. An expression for the likelihood function of the data is given, along with accurate approximations for the function's gradient and Hessian with respect to unknown parameters. These are used by a numerical quasi-Newton algorithm for maximizing the likelihood function of the data in order to estimate the unknown parameters. The parameter estimation algorithm is useful for processing data from aircraft flight tests or for tuning inertial navigation systems.

  3. Maximum Likelihood Estimation with Emphasis on Aircraft Flight Data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1985-01-01

    Accurate modeling of flexible space structures is an important field that is currently under investigation. Parameter estimation, using methods such as maximum likelihood, is one of the ways that the model can be improved. The maximum likelihood estimator has been used to extract stability and control derivatives from flight data for many years. Most of the literature on aircraft estimation concentrates on new developments and applications, assuming familiarity with basic estimation concepts. Some of these basic concepts are presented. The maximum likelihood estimator and the aircraft equations of motion that the estimator uses are briefly discussed. The basic concepts of minimization and estimation are examined for a simple computed aircraft example. The cost functions that are to be minimized during estimation are defined and discussed. Graphic representations of the cost functions are given to help illustrate the minimization process. Finally, the basic concepts are generalized, and estimation from flight data is discussed. Specific examples of estimation of structural dynamics are included. Some of the major conclusions for the computed example are also developed for the analysis of flight data.

  4. A maximum-likelihood estimation of pairwise relatedness for autopolyploids

    PubMed Central

    Huang, K; Guo, S T; Shattuck, M R; Chen, S T; Qi, X G; Zhang, P; Li, B G

    2015-01-01

    Relatedness between individuals is central to ecological genetics. Multiple methods are available to quantify relatedness from molecular data, including method-of-moment and maximum-likelihood estimators. We describe a maximum-likelihood estimator for autopolyploids, and quantify its statistical performance under a range of biologically relevant conditions. The statistical performances of five additional polyploid estimators of relatedness were also quantified under identical conditions. When comparing truncated estimators, the maximum-likelihood estimator exhibited lower root mean square error under some conditions and was more biased for non-relatives, especially when the number of alleles per locus was low. However, even under these conditions, this bias was reduced to be statistically insignificant with more robust genetic sampling. We also considered ambiguity in polyploid heterozygote genotyping and developed a weighting methodology for candidate genotypes. The statistical performances of three polyploid estimators under both ideal and actual conditions (including inbreeding and double reduction) were compared. The software package POLYRELATEDNESS is available to perform this estimation and supports a maximum ploidy of eight. PMID:25370210

  5. High precision X-ray log N - log S distributions: implications for the obscured AGN population

    NASA Astrophysics Data System (ADS)

    Mateos, S.; Warwick, R. S.; Carrera, F. J.; Stewart, G. C.; Ebrero, J.; Della Ceca, R.; Caccianiga, A.; Gilli, R.; Page, M. J.; Treister, E.; Tedds, J. A.; Watson, M. G.; Lamer, G.; Saxton, R. D.; Brunner, H.; Page, C. G.

    2008-12-01

    Context: Our knowledge of the properties of AGN, especially those of optical type-2 objects, is very incomplete. Because extragalactic source count distributions are dependent on the cosmological and statistical properties of AGN, they provide a direct method of investigating the underlying source populations. Aims: We aim to constrain the extragalactic source count distributions over a broad range of X-ray fluxes and in various energy bands to test whether the predictions from X-ray background synthesis models agree with the observational constraints provided by our measurements. Methods: We have used 1129 XMM-Newton observations at |b|>20° covering a total sky area of 132.3 deg^2 to compile the largest complete samples of X-ray selected objects to date both in the 0.5-1 keV, 1-2 keV, 2-4.5 keV, 4.5-10 keV bands employed in standard XMM-Newton data processing and in the 0.5-2 keV and 2-10 keV energy bands more usually considered in source count studies. Our survey includes in excess of 30 000 sources and spans fluxes from 10^{-15} to 10^{-12} erg cm^{-2} s^{-1} below 2 keV and from 10^{-14} to 10^{-12} erg cm^{-2} s^{-1} above 2 keV where the bulk of the cosmic X-ray background energy density is produced. Results: The very large sample size we obtained means our results are not limited by cosmic variance or low counting statistics. A break in the source count distributions was detected in all energy bands except the 4.5-10 keV band. We find that an analytical model comprising 2 power-law components cannot adequately describe the curvature seen in the source count distributions. The shape of the log N(>S) - log S is strongly dependent on the energy band with a general steepening apparent as we move to higher energies. This is due to the fact that non-AGN populations, comprised mainly of stars and clusters of galaxies, contribute up to 30% of the source population at energies <2 keV and at fluxes ≥10^{-13} erg cm^{-2} s^{-1}, and these populations of objects have significantly flatter

  6. Coal log pipeline research at University of Missouri. 3rd quarterly report for 1995, July 1, 1995--September 30, 1995

    SciTech Connect

    Liu, H.

    1995-12-31

    During this quarter (7/1/95-9/30/95), major progress has been made in the following areas of coal log pipeline research, development and technology transfer: (1) Conceptual design of a test machine based on hydraulic presses to mass-produce 5.4-inch-diameter coal logs for testing in a 6-inch-diameter pipeline has been completed. (2) Conceptual design of a rotary-press machine to produce 1.9-inch-diameter coal logs for testing in a 2-inch-diameter pipeline has also been completed. (3) It has been confirmed through experiments that molds with round-edge exit can make logs as good as those made with tapered exit. (4) Conducted a study to determine the effect of surface condition of mold and lubricants on the quality of coal logs. (5) Completed an evaluation of the effect of fiber (wood pulp) on coal log quality. (6) Prepared an apparatus for testing fast compaction of coal logs -- 2 seconds per log. (7) Compacted coal logs in a 5.3-inch-diameter mold. (8) Completed a preliminary study to assess vacuum and steam heating systems to enhance coal log production and quality. (9) Changed the small-scale-CLP-demo loop from a once-through system to a recirculating system. (10) Completed revision of CLP economic model and revised the 1993 report.

  7. Effects of pitch distance and likelihood on the perceived duration of deviant auditory events.

    PubMed

    Kim, Elisa; McAuley, J Devin

    2013-10-01

    When a deviant (oddball) stimulus is presented within a series of otherwise identical (standard) stimuli, the duration of the oddball tends to be overestimated. Two experiments investigated factors affecting systematic distortions in the perceived duration of oddball stimuli. Both experiments used an auditory oddball paradigm where oddball tones varied in both their pitch distance from the pitch of a standard tone and their likelihood of occurrence. Experiment 1 revealed that (1) far-pitch oddballs were perceived to be longer than near-pitch oddballs, (2) effects of pitch distance were greater in low-likelihood conditions, and (3) oddballs in later serial positions were perceived to be longer than oddballs in earlier serial positions. The above effects held regardless of whether oddballs were higher or lower in pitch than the standard. Experiment 2 revealed a pattern of response times in an oddball detection task that generally paralleled the pattern of data observed in Experiment 1; across conditions, there was a negative correlation between detection times and perceived duration. Taken together, the results suggest that the observed effects of oddball pitch, likelihood, and position on perceived duration are at least partly driven by how quickly individuals are able to initiate timing the oddball following its onset. Implications for different theoretical accounts of the oddball effect are discussed.

  8. The X-ray log N-log S relation. [background radiation in extragalactic media

    NASA Technical Reports Server (NTRS)

    Boldt, Elihu

    1989-01-01

    Results from various surveys are reviewed as regards X-ray source counts at high galactic latitudes and the luminosity functions determined for extragalactic sources. Constraints on the associated log N-log S relation provided by the extragalactic X-ray background are emphasized in terms of its spatial fluctuations and spectrum as well as absolute flux level. The large number of sources required for this background suggests that there is not a sharp boundary in the redshift distribution of visible matter.

  9. Confidence intervals after multiple imputation: combining profile likelihood information from logistic regressions.

    PubMed

    Heinze, Georg; Ploner, Meinhard; Beyea, Jan

    2013-12-20

    In the logistic regression analysis of a small-sized, case-control study on Alzheimer's disease, some of the risk factors exhibited missing values, motivating the use of multiple imputation. Usually, Rubin's rules (RR) for combining point estimates and variances would then be used to estimate (symmetric) confidence intervals (CIs), on the assumption that the regression coefficients were distributed normally. Yet, rarely is this assumption tested, with or without transformation. In analyses of small, sparse, or nearly separated data sets, such symmetric CI may not be reliable. Thus, RR alternatives have been considered, for example, Bayesian sampling methods, but not yet those that combine profile likelihoods, particularly penalized profile likelihoods, which can remove first order biases and guarantee convergence of parameter estimation. To fill the gap, we consider the combination of penalized likelihood profiles (CLIP) by expressing them as posterior cumulative distribution functions (CDFs) obtained via a chi-squared approximation to the penalized likelihood ratio statistic. CDFs from multiple imputations can then easily be averaged into a combined CDF, CDF_c, allowing confidence limits for a parameter β at level 1 - α to be identified as those β* and β** that satisfy CDF_c(β*) = α/2 and CDF_c(β**) = 1 - α/2. We demonstrate that the CLIP method outperforms RR in analyzing both simulated data and data from our motivating example. CLIP can also be useful as a confirmatory tool, should it show that the simpler RR are adequate for extended analysis. We also compare the performance of CLIP to Bayesian sampling methods using Markov chain Monte Carlo. CLIP is available in the R package logistf. PMID:23873477
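
    A minimal sketch of the combining step only (the per-imputation CDFs below are placeholders; in the actual method they come from penalized likelihood profiles via the chi-squared approximation, as implemented in the R package logistf):

    ```python
    # Average per-imputation posterior CDFs of a coefficient beta on a common grid,
    # then invert the combined CDF at alpha/2 and 1 - alpha/2.
    import numpy as np
    from scipy.stats import norm

    alpha = 0.05
    beta_grid = np.linspace(-2.0, 4.0, 2001)

    # Placeholder per-imputation CDFs (normal, with imputation-specific centre/spread).
    imputation_cdfs = [norm.cdf(beta_grid, loc=mu, scale=s)
                       for mu, s in [(0.8, 0.40), (1.1, 0.45), (0.9, 0.38), (1.0, 0.42)]]

    cdf_c = np.mean(imputation_cdfs, axis=0)             # combined CDF
    lower = np.interp(alpha / 2, cdf_c, beta_grid)       # beta*  : CDF_c = alpha/2
    upper = np.interp(1 - alpha / 2, cdf_c, beta_grid)   # beta** : CDF_c = 1 - alpha/2
    print("combined %.0f%% CI for beta: (%.3f, %.3f)" % (100 * (1 - alpha), lower, upper))
    ```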

  10. Log polar image sensor in CMOS technology

    NASA Astrophysics Data System (ADS)

    Scheffer, Danny; Dierickx, Bart; Pardo, Fernando; Vlummens, Jan; Meynants, Guy; Hermans, Lou

    1996-08-01

    We report on the design, design issues, fabrication and performance of a log-polar CMOS image sensor. The sensor is developed for use in a videophone system for deaf and hearing impaired people, who are not capable of communicating through a 'normal' telephone. The system allows 15 detailed images per second to be transmitted over existing telephone lines. This framerate is sufficient for conversations by means of sign language or lip reading. The pixel array of the sensor consists of 76 concentric circles with (up to) 128 pixels per circle, in total 8013 pixels. The pixel pitch is 14 micrometers for the interior pixels, increasing to 250 micrometers at the border. The 8013-pixel image is mapped (log-polar transformation) into an X-Y addressable 76 by 128 array.
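
    The fabricated layout is fixed by the design itself; as an illustration only, the sketch below assumes a geometric growth of the ring pitch from 14 micrometers to 250 micrometers over the 76 rings, with 128 pixels per ring (the real sensor has fewer pixels on its innermost rings, hence 8013 rather than 76 x 128 = 9728 pixels):

    ```python
    # Log-polar sampling grid in the style described above: 76 rings, 128 sectors,
    # pixel pitch growing geometrically from 14 um to 250 um (an assumed growth law).
    import numpy as np

    n_rings, n_sectors = 76, 128
    pitch = np.geomspace(14e-6, 250e-6, n_rings)   # per-ring pixel pitch [m]
    radii = np.cumsum(pitch)                       # ring radii grow with the local pitch
    theta = np.linspace(0.0, 2.0 * np.pi, n_sectors, endpoint=False)

    # The log-polar map sends (ring, sector) indices to Cartesian pixel centres.
    x = radii[:, None] * np.cos(theta[None, :])
    y = radii[:, None] * np.sin(theta[None, :])
    print(x.shape, "outermost ring radius: %.2f mm" % (radii[-1] * 1e3))
    ```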

  11. INSPIRE and SPIRES Log File Analysis

    SciTech Connect

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication data base, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE, it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions in regard to the presumed two-thirds of the users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns are identified, usage between INSPIRE and SPIRES is compared, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought as well as the expected trend of user transition to INSPIRE.

  12. Impacts of intensive logging on the trophic organisation of ant communities in a biodiversity hotspot.

    PubMed

    Woodcock, Paul; Edwards, David P; Newton, Rob J; Vun Khen, Chey; Bottrell, Simon H; Hamer, Keith C

    2013-01-01

    Trophic organisation defines the flow of energy through ecosystems and is a key component of community structure. Widespread and intensifying anthropogenic disturbance threatens to disrupt trophic organisation by altering species composition and relative abundances and by driving shifts in the trophic ecology of species that persist in disturbed ecosystems. We examined how intensive disturbance caused by selective logging affects trophic organisation in the biodiversity hotspot of Sabah, Borneo. Using stable nitrogen isotopes, we quantified the positions in the food web of 159 leaf-litter ant species in unlogged and logged rainforest and tested four predictions: (i) there is a negative relationship between the trophic position of a species in unlogged forest and its change in abundance following logging, (ii) the trophic positions of species are altered by logging, (iii) disturbance alters the frequency distribution of trophic positions within the ant assemblage, and (iv) disturbance reduces food chain length. We found that ant abundance was 30% lower in logged forest than in unlogged forest but changes in abundance of individual species were not related to trophic position, providing no support for prediction (i). However, trophic positions of individual species were significantly higher in logged forest, supporting prediction (ii). Consequently, the frequency distribution of trophic positions differed significantly between unlogged and logged forest, supporting prediction (iii), and food chains were 0.2 trophic levels longer in logged forest, the opposite of prediction (iv). Our results demonstrate that disturbance can alter trophic organisation even without trophically-biased changes in community composition. Nonetheless, the absence of any reduction in food chain length in logged forest suggests that species-rich arthropod food webs do not experience trophic downgrading or a related collapse in trophic organisation despite the disturbance caused by logging

  13. Recording dowel pin position with plastic logs.

    PubMed

    Titus, H W; Draheim, R N

    1986-01-01

    Six readily available plastic materials were evaluated for use as a log for locating the ends of die dowel pins. Seven criteria were divided into two main categories of adherence qualities of the material and useability. Cost was the third category evaluated and compared. Color and availability were also considered. Forty-two sophomore dental students accomplished 84 evaluations. Plasticine and Weatherstrip Caulking Cord were found to meet all criteria best. PMID:3511239

  14. Flow logging applied in fractured rocks

    NASA Astrophysics Data System (ADS)

    Urík, J.; Lukeš, J.; Kobr, M.; Mareš, S.; Williams, J. H.

    2009-04-01

    Fractured zones are usually identified by the basic logging methods, but the volume of permeable fractures intersecting boreholes is often too small for their identification. Fluid logging measurements can help to localize natural flow and determine hydraulic properties of fractures during pumping. Borehole flowmeter measurements or fluid logging measurements are conventional methods used for the determination of quantitative parameters of the flowing fluid in boreholes, with the goal of estimating the hydraulic conductivity of aquifers. With the increasing interest in research on fractured formations and their hydraulic properties, new field techniques based on the repeated measurement of the vertical velocity in an observation borehole during a short hydraulic test (hydraulic pulse) in another borehole at the tested locality have been developed. There is a tendency to increase the lateral range of investigation, that is, to carry out cross-hole tests between boreholes separated by greater distances. Such cross-hole tests require measuring the induced vertical flow in the observation borehole with high-resolution flowmeters able to measure vertical flow velocities in the range of 10^{-4} m/s and flow rates in the range of 10^{-6} m^3/s. A new photometric flowmeter was developed for this purpose. The results of the photometric flowmeter measurements were compared with the fluid logging measurements and with heat-pulse flowmeter measurements. In our contribution, examples of the comparison of the measurements mentioned above in two boreholes situated in a granite massif in the Czech Republic are presented. The boreholes were drilled in the program of the Czech state institution SÚRAO (Radioactive Waste Repository Authority). The research project is financially supported by the Grant Agency of the Czech Republic under grant No. 208/07/0777.

  15. Thermal Properties of Bazhen fm. Sediments from Thermal Core Logging

    NASA Astrophysics Data System (ADS)

    Spasennykh, Mikhail; Popov, Evgeny; Popov, Yury; Chekhonin, Evgeny; Romushkevich, Raisa; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Zhukov, Vladislav; Karpov, Igor; Saveliev, Egor; Gabova, Anastasia

    2016-04-01

    The Bazhen formation (B. fm.) is the largest self-contained source-and-reservoir continuous petroleum system, covering more than 1 million km^2 (West Siberia, Russia). High lithological differentiation in the Bazhen deposits, which are dominated by silicic shales and carbonates and accompanied by extremely high total organic carbon values (of up to 35%), high pyrite content and a brittle mineralogical composition, complicates standard thermal property assessment for low-permeability rocks. Reliable information on the thermal characteristics of such an unconventional system is a necessary part of work such as modelling reservoir processes under thermal EOR to assess their efficiency, developing and optimizing the design of oil recovery methods, interpreting well temperature logging data, and basin petroleum modelling. A unique data set including thermal conductivity, thermal diffusivity, volumetric heat capacity and thermal anisotropy for the B. fm. rocks was obtained from thermal core logging (high-resolution continuous thermal profiling) on more than 4680 core samples (among them 2000 B. fm. samples) along seven wells from four oil fields. Some systematic peculiarities of the relation between the thermal properties of the B. fm. rocks and their mineralogical composition and structural and textural properties were obtained. The high-resolution data were processed jointly with the standard petrophysical logs, which allowed better separation of the formation. The research work was done with the financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).

  16. Rill Erosion in Post Wildfire Forests after Salvage Logging

    NASA Astrophysics Data System (ADS)

    Robichaud, Peter; Wagenbrenner, Joseph; Brown, Robert

    2016-04-01

    Despite the dominance of concentrated flow or rill erosion in the erosion processes, especially in steep forest environments that have been affected by wildfire or management activities, few studies have quantified these effects on rill erosion. This study quantified the effects of wildfire and post-fire timber salvage operations on rill runoff quantity, runoff velocity, and rill erosion. Simulated rill experiments were conducted at various sites in the Western US after wildfire and timber salvage operations. The onsite conditions consisted of burned only, salvage logged, skid or snig trail, or skid trails with extra logging debris added. For each rill experiment, concentrated flow was applied at the top of the plot through an energy dissipater at five inflow rates for 12 min each. Runoff was sampled every 2 min and runoff volume and sediment concentration were determined for each sample. The runoff velocity was measured using a dyed calcium chloride solution and two conductivity probes placed a known distance apart. Runoff volume, runoff velocities, and sediment concentrations increased with increasing levels of disturbance. The burned only plots had lower runoff rates and sediment concentrations than any of the other disturbances. The salvage logged plots had greater responses than the burned only plots and the mitigation treatment had a marginal effect on runoff ratios, runoff velocities and sediment concentrations. These results suggest that additional disturbance after a wildfire can increase the erosional response and that proper erosion control mitigation may be an important consideration for post fire management to reduce onsite erosion.

  17. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.
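
    The quoted log-frequency and scaling ratio are two descriptions of the same discrete scale invariance, so they can be checked against each other:

    ```latex
    % Oscillations periodic in ln(t_c - t) with log-frequency f correspond to a
    % preferred scaling ratio lambda = e^{1/f}.
    \[
      \lambda = e^{1/f}, \qquad e^{1/1.02} \approx 2.66,
    \]
    % consistent with the independently fitted lambda = 2.67 +/- 0.12.
    ```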

  18. Development of geothermal logging systems in the United States

    SciTech Connect

    Lysne, P.

    1994-04-01

    Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications even though data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (1) there is a general lack of vetted, high-temperature instrumentation, and (2) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at a low cost. These tools feature on-board computers that process and store data, and newer systems may be programmed to make decisions. Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimum training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium, and thorium is in the calibration phase, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A fluid sampling tool is in the design phase. All tools are designed for operation at conditions exceeding 400°C, and for deployment in the slim holes produced by mining-coring operations. Partnerships are being formed between the geothermal industry and scientific drilling programs to define and develop inversion algorithms relating raw tool data to more pertinent information. These cooperative efforts depend upon quality guidelines such as those under development within the international Ocean Drilling Program.

  19. An ecosystem model for tropical forest disturbance and selective logging

    NASA Astrophysics Data System (ADS)

    Huang, Maoyi; Asner, Gregory P.; Keller, Michael; Berry, Joseph A.

    2008-03-01

    A new three-dimensional version of the Carnegie-Ames-Stanford Approach (CASA) ecosystem model (CASA-3D) was developed to simulate regional carbon cycling in tropical forest ecosystems after disturbances such as logging. CASA-3D has the following new features: (1) an alternative approach for calculating absorbed photosynthetically active radiation (APAR) using new high-resolution satellite images of forest canopy gap fraction; (2) a pulse disturbance module to modify aboveground carbon pools following forest disturbance; (3) a regrowth module that simulates changes in community composition by considering gap phase regeneration; and (4) a radiative transfer module to simulate the dynamic three-dimensional light environment above the canopy and within gaps after forest disturbance. The model was calibrated with and tested against field observations from experimental logging plots in the Large-scale Biosphere Atmosphere Experiment in Amazonia (LBA) project. The sensitivity of key model parameters was evaluated using Monte Carlo simulations, and the uncertainties in simulated NPP and respiration associated with model parameters and meteorological variables were assessed. We found that selective logging causes changes in forest architecture and composition that result in a cascading set of impacts on the carbon cycling of rainforest ecosystems. Our model sensitivity and uncertainty analyses also highlight the paramount importance of measuring changes in canopy gap fraction from satellite data, as well as canopy light-use efficiency from ecophysiological measurements, to understand the role of forest disturbance on landscape and regional carbon cycling in tropical forests. In sum, our study suggests that CASA-3D may be suitable for regional-scale applications to assess the large-scale effects of selective logging, to provide guidance for forest management, and to understand the role of forest disturbance in regional and global climate studies.

  20. Likelihood ratio and score tests to test the non-inferiority (or equivalence) of the odds ratio in a crossover study with binary outcomes.

    PubMed

    Li, Xiaochun; Li, Huilin; Jin, Man; D Goldberg, Judith

    2016-09-10

    We consider the non-inferiority (or equivalence) test of the odds ratio (OR) in a crossover study with binary outcomes to evaluate the treatment effects of two drugs. To solve this problem, Lui and Chang (2011) proposed both an asymptotic method and a conditional method based on a random effects logit model. Kenward and Jones (1987) proposed a likelihood ratio test (LRT_M) based on a log linear model. These existing methods are all subject to model misspecification. In this paper, we propose a likelihood ratio test (LRT) and a score test that are independent of model specification. Monte Carlo simulation studies show that, in scenarios considered in this paper, both the LRT and the score test have higher power than the asymptotic and conditional methods for the non-inferiority test; the LRT, score, and asymptotic methods have similar power, and they all have higher power than the conditional method for the equivalence test. When data can be well described by a log linear model, the LRT_M has the highest power among all the five methods (LRT_M, LRT, score, asymptotic, and conditional) for both non-inferiority and equivalence tests. However, in scenarios for which a log linear model does not describe the data well, the LRT_M has the lowest power for the non-inferiority test and has inflated type I error rates for the equivalence test. We provide an example from a clinical trial that illustrates our methods. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27095359
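
    Schematically, writing ψ for the odds ratio and ψ0 for the non-inferiority margin, the likelihood ratio statistic compares the freely maximised log-likelihood with the log-likelihood maximised at the margin and is referred (one-sidedly, since the null lies on a boundary) to a chi-squared distribution with one degree of freedom; the crossover-specific likelihood itself is not reproduced here:

    ```latex
    % Generic form of the likelihood ratio statistic for H0: psi = psi_0,
    % with nuisance parameters eta profiled out at each stage.
    \[
      \Lambda \;=\; 2\left[\,\ell(\hat{\psi},\hat{\eta}) - \ell(\psi_0,\tilde{\eta})\,\right],
      \qquad \Lambda \overset{H_0}{\sim} \chi^2_1 .
    \]
    ```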

  1. Likelihood ratio and score tests to test the non-inferiority (or equivalence) of the odds ratio in a crossover study with binary outcomes.

    PubMed

    Li, Xiaochun; Li, Huilin; Jin, Man; D Goldberg, Judith

    2016-09-10

    We consider the non-inferiority (or equivalence) test of the odds ratio (OR) in a crossover study with binary outcomes to evaluate the treatment effects of two drugs. To solve this problem, Lui and Chang (2011) proposed both an asymptotic method and a conditional method based on a random effects logit model. Kenward and Jones (1987) proposed a likelihood ratio test (LRT_M) based on a log linear model. These existing methods are all subject to model misspecification. In this paper, we propose a likelihood ratio test (LRT) and a score test that are independent of model specification. Monte Carlo simulation studies show that, in scenarios considered in this paper, both the LRT and the score test have higher power than the asymptotic and conditional methods for the non-inferiority test; the LRT, score, and asymptotic methods have similar power, and they all have higher power than the conditional method for the equivalence test. When data can be well described by a log linear model, the LRT_M has the highest power among all the five methods (LRT_M, LRT, score, asymptotic, and conditional) for both non-inferiority and equivalence tests. However, in scenarios for which a log linear model does not describe the data well, the LRT_M has the lowest power for the non-inferiority test and has inflated type I error rates for the equivalence test. We provide an example from a clinical trial that illustrates our methods. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Quantitative Literacy: Working with Log Graphs

    NASA Astrophysics Data System (ADS)

    Shawl, S.

    2013-04-01

    The need for working with and understanding different types of graphs is a common occurrence in everyday life. Examples include anything having to do with investments, being an educated juror in a case that involves evidence presented graphically, and understanding many aspects of our current political discourse. Within a science class, graphs play a crucial role in presenting and interpreting data. In astronomy, where the range of graphed values is many orders of magnitude, log-axes must be used and understood. Experience shows that students do not understand how to read and interpret log-axes or how they differ from linear axes. Alters (1996), in a study of college students in an algebra-based physics class, found little understanding of log plotting. The purpose of this poster is to show the method and progression I have developed for use in my “ASTRO 101” class, with the goal being to help students better understand the H-R diagram, mass-luminosity relationship, and digital spectra.
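
    As a concrete classroom illustration of the point above (the mass-luminosity relation is used here as a stand-in; the poster's own exercises are not described in the abstract), a power law spanning many orders of magnitude only looks like a straight line on log-log axes:

    ```python
    # The rough main-sequence scaling L ~ M^3.5 appears as a straight line of
    # slope 3.5 on log-log axes but hugs the axes on linear axes.
    import numpy as np
    import matplotlib.pyplot as plt

    M = np.logspace(-1, 1.5, 100)   # stellar masses, 0.1 to ~30 solar masses
    L = M ** 3.5                    # luminosities in solar units (rough scaling)

    fig, (ax_lin, ax_log) = plt.subplots(1, 2, figsize=(8, 3))
    ax_lin.plot(M, L)
    ax_lin.set_title("linear axes")
    ax_log.loglog(M, L)
    ax_log.set_title("log-log axes: slope 3.5")
    for ax in (ax_lin, ax_log):
        ax.set_xlabel("mass [M_sun]")
        ax.set_ylabel("luminosity [L_sun]")
    plt.tight_layout()
    plt.show()
    ```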

  3. Chiral gravity, log gravity, and extremal CFT

    SciTech Connect

    Maloney, Alexander; Song Wei; Strominger, Andrew

    2010-03-15

    We show that the linearization of all exact solutions of classical chiral gravity around the AdS{sub 3} vacuum have positive energy. Nonchiral and negative-energy solutions of the linearized equations are infrared divergent at second order, and so are removed from the spectrum. In other words, chirality is confined and the equations of motion have linearization instabilities. We prove that the only stationary, axially symmetric solutions of chiral gravity are BTZ black holes, which have positive energy. It is further shown that classical log gravity--the theory with logarithmically relaxed boundary conditions--has finite asymptotic symmetry generators but is not chiral and hence may be dual at the quantum level to a logarithmic conformal field theory (CFT). Moreover, we show that log gravity contains chiral gravity within it as a decoupled charge superselection sector. We formally evaluate the Euclidean sum over geometries of chiral gravity and show that it gives precisely the holomorphic extremal CFT partition function. The modular invariance and integrality of the expansion coefficients of this partition function are consistent with the existence of an exact quantum theory of chiral gravity. We argue that the problem of quantizing chiral gravity is the holographic dual of the problem of constructing an extremal CFT, while quantizing log gravity is dual to the problem of constructing a logarithmic extremal CFT.

  4. Reconciling timber extraction with biodiversity conservation in tropical forests using reduced-impact logging

    PubMed Central

    Bicknell, Jake E; Struebig, Matthew J; Davies, Zoe G; Baraloto, Christopher

    2015-01-01

    Over 20% of the world's tropical forests have been selectively logged, and large expanses are allocated for future timber extraction. Reduced-impact logging (RIL) is being promoted as best practice forestry that increases sustainability and lowers CO2 emissions from logging, by reducing collateral damage associated with timber extraction. RIL is also expected to minimize the impacts of selective logging on biodiversity, although this is yet to be thoroughly tested. We undertake the most comprehensive study to date to investigate the biodiversity impacts of RIL across multiple taxonomic groups. We quantified bird, bat and large mammal assemblage structures, using a before-after control-impact (BACI) design across 20 sample sites over a 5-year period. Faunal surveys utilized point counts, mist nets and line transects and yielded >250 species. We examined assemblage responses to logging, as well as partitions of feeding guild and strata (understorey vs. canopy), and then tested for relationships with logging intensity to assess the primary determinants of community composition. Community analysis revealed little effect of RIL on overall assemblages, as structure and composition were similar before and after logging, and between logging and control sites. Variation in bird assemblages was explained by natural rates of change over time, and not logging intensity. However, when partitioned by feeding guild and strata, the frugivorous and canopy bird ensembles changed as a result of RIL, although the latter was also associated with change over time. Bats exhibited variable changes post-logging that were not related to logging, whereas large mammals showed no change at all. Indicator species analysis and correlations with logging intensities revealed that some species exhibited idiosyncratic responses to RIL, whilst abundance change of most others was associated with time. Synthesis and applications. Our study demonstrates the relatively benign effect of reduced-impact logging.

  5. Can log-periodic power law structures arise from random fluctuations?

    NASA Astrophysics Data System (ADS)

    Wosnitza, Jan Henrik; Leker, Jens

    2014-05-01

    Recent research has established log-periodic power law (LPPL) patterns prior to the detonation of the German stock index (DAX) bubble in 1998. The purpose of this article is to explore whether a Langevin equation extracted from real-world data can generate synthetic time series with comparable LPPL structures. To this end, we first estimate the stochastic process underlying the DAX log-returns during the period from mid-1997 until end-2003. The employed data set contains about 3.93 × 10⁶ intraday DAX quotes at a sampling rate of 15 s. Our results indicate that the DAX log-returns can be described as a Markov process. As a consequence, a Langevin equation is derived. Based on this model equation, we run extensive simulations in order to generate 100 synthetic DAX trajectories, each covering 3000 trading days. We find LPPL behavior in ten artificial time series. Moreover, we can establish a link between LPPL patterns and ensuing bubble bursts in seven synthetic 600-week windows. However, the LPPL components in most synthetic trajectories differ fundamentally from those LPPL structures that have previously been detected in real financial time series. In summary, this paper demonstrates that LPPL structures are not necessarily the signature of imitative behavior among investors but can also stem from noise, even though the likelihood of this is extremely low. Thus, our findings confirm with high statistical confidence that the LPPL structures in the DAX development are rooted deeper than only in the random fluctuations of the German stock market.
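
    To make the simulation step concrete, here is a minimal Euler-Maruyama sketch for integrating a generic Langevin equation dr = D1(r) dt + sqrt(2 D2(r)) dW. The linear drift and constant diffusion used below are placeholder assumptions for illustration, not the coefficients the authors estimated from the DAX data.

      # Euler-Maruyama integration of a generic Langevin equation for log-returns
      # (placeholder drift/diffusion; not the DAX-estimated coefficients).
      import numpy as np

      def simulate_langevin(n_steps, dt, d1, d2, r0=0.0, seed=0):
          """Integrate dr = d1(r) dt + sqrt(2 * d2(r)) dW with the Euler-Maruyama scheme."""
          rng = np.random.default_rng(seed)
          r = np.empty(n_steps)
          r[0] = r0
          for t in range(1, n_steps):
              drift = d1(r[t - 1])
              diffusion = np.sqrt(2.0 * d2(r[t - 1]) * dt)
              r[t] = r[t - 1] + drift * dt + diffusion * rng.standard_normal()
          return r

      # Hypothetical coefficients: mean-reverting drift, constant diffusion.
      returns = simulate_langevin(
          n_steps=10_000, dt=1.0,
          d1=lambda r: -0.05 * r,
          d2=lambda r: 1e-4,
      )
      print(returns[:5])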

  6. Changes in soil nitrogen cycling under Norway spruce logging residues on a clear-cut

    NASA Astrophysics Data System (ADS)

    Smolander, Aino; Lindroos, Antti-Jussi; Kitunen, Veikko

    2016-04-01

    In Europe, forest biomass is increasingly being used as a source of energy to replace fossil fuels. In practice, this means that logging residues, consisting of green branches and stem tops, are more commonly harvested. In 2012 logging residues were harvested from about one third of clear-cuts in Finland. Our aim was to study how logging residues affect soil organic matter quality, in particular soil N cycling processes and the composition of certain groups of plant secondary compounds, tannins and terpenes. Compounds in these groups were of interest because they are abundant in logging residues, and they have been shown to control soil N cycling. In connection with clear-cutting a Norway spruce stand in southern Finland, we established a controlled field experiment by building logging residue piles (40 kg/m²) on study plots. The piles consisted of fresh spruce branches and tops with green foliage. Control plots with no residues were included (0 kg/m²). Changes in soil organic matter properties have now been monitored for three growing seasons. Logging residues affected organic layer properties strongly. For example, they increased net nitrification and nitrate concentrations. There were also increases in the concentrations of certain terpenes and condensed tannins due to the residues. The significance of logging residues for soil processes and properties will be discussed.

  7. Log-periodic route to fractal functions.

    PubMed

    Gluzman, S; Sornette, D

    2002-03-01

    Log-periodic oscillations have been found to decorate the usual power-law behavior found to describe the approach to a critical point, when the continuous scale-invariance symmetry is partially broken into a discrete-scale invariance symmetry. For Ising or Potts spins with ferromagnetic interactions on hierarchical systems, the relative magnitude of the log-periodic corrections is usually very small, of order 10⁻⁵. In growth processes [diffusion limited aggregation (DLA)], rupture, earthquakes, and financial crashes, log-periodic oscillations with amplitudes of the order of 10% have been reported. We suggest a "technical" explanation for this 4-order-of-magnitude difference based on the property of the "regular function" g(x) embodying the effect of the microscopic degrees of freedom summed over in a renormalization group (RG) approach F(x) = g(x) + μ⁻¹F(γx) of an observable F as a function of a control parameter x. For systems for which the RG equation has not been derived, the previous equation can be understood as a Jackson q-integral, which is the natural tool for describing discrete-scale invariance. We classify the "Weierstrass-type" solutions of the RG into two classes characterized by the amplitudes A_n of the power-law series expansion. These two classes are separated by a novel "critical" point. Growth processes (DLA), rupture, earthquakes, and financial crashes thus seem to be characterized by oscillatory or bounded regular microscopic functions that lead to a slow power-law decay of A_n, giving strong log-periodic amplitudes. If, in addition, the phases of A_n are ergodic and mixing, the observable presents self-affine nondifferentiable properties. In contrast, the regular function of statistical physics models with "ferromagnetic"-type interactions at equilibrium involves unbounded logarithms of polynomials of the control variable that lead to a fast exponential decay of A_n, giving weak log-periodic amplitudes and smoothed observables.
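
    As a concrete illustration of the renormalization-group equation quoted above, the sketch below iterates F(x) = g(x) + μ⁻¹F(γx) as a truncated Weierstrass-type sum F(x) ≈ Σ μ⁻ⁿ g(γⁿx). The particular g, μ, and γ are assumptions chosen only to make the discrete-scale-invariant (log-periodic) structure visible; they are not values from the paper.

      # Truncated solution of the RG equation F(x) = g(x) + F(gamma * x) / mu,
      # i.e. F(x) ~ sum_n mu**(-n) * g(gamma**n * x)  (illustrative g, mu, gamma).
      import numpy as np

      def weierstrass_sum(x, g, mu, gamma, n_terms=60):
          """Truncated series solution; converges for |mu| > 1 and bounded g."""
          total = np.zeros_like(x, dtype=float)
          for n in range(n_terms):
              total += g(gamma**n * x) / mu**n
          return total

      x = np.logspace(-4, 0, 2000)
      g = lambda u: np.cos(u)          # a bounded "regular function" (assumption)
      F = weierstrass_sum(x, g, mu=2.0, gamma=3.0)

      # Discrete scale invariance x -> gamma * x (up to the factor mu) makes the
      # structure of F repeat with period log(gamma) in log(x), i.e. log-periodicity.
      print(F[:3])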

  8. Log-periodic route to fractal functions.

    PubMed

    Gluzman, S; Sornette, D

    2002-03-01

    Log-periodic oscillations have been found to decorate the usual power-law behavior found to describe the approach to a critical point, when the continuous scale-invariance symmetry is partially broken into a discrete-scale invariance symmetry. For Ising or Potts spins with ferromagnetic interactions on hierarchical systems, the relative magnitude of the log-periodic corrections is usually very small, of order 10⁻⁵. In growth processes [diffusion limited aggregation (DLA)], rupture, earthquakes, and financial crashes, log-periodic oscillations with amplitudes of the order of 10% have been reported. We suggest a "technical" explanation for this 4-order-of-magnitude difference based on the property of the "regular function" g(x) embodying the effect of the microscopic degrees of freedom summed over in a renormalization group (RG) approach F(x) = g(x) + μ⁻¹F(γx) of an observable F as a function of a control parameter x. For systems for which the RG equation has not been derived, the previous equation can be understood as a Jackson q-integral, which is the natural tool for describing discrete-scale invariance. We classify the "Weierstrass-type" solutions of the RG into two classes characterized by the amplitudes A_n of the power-law series expansion. These two classes are separated by a novel "critical" point. Growth processes (DLA), rupture, earthquakes, and financial crashes thus seem to be characterized by oscillatory or bounded regular microscopic functions that lead to a slow power-law decay of A_n, giving strong log-periodic amplitudes. If, in addition, the phases of A_n are ergodic and mixing, the observable presents self-affine nondifferentiable properties. In contrast, the regular function of statistical physics models with "ferromagnetic"-type interactions at equilibrium involves unbounded logarithms of polynomials of the control variable that lead to a fast exponential decay of A_n, giving weak log-periodic amplitudes and smoothed observables.

  9. On the hydrophobicity of peptides: Comparing empirical predictions of peptide log P values.

    PubMed

    Thompson, Sarah J; Hattotuwagama, Channa K; Holliday, John D; Flower, Darren R

    2006-01-01

    Peptides are of great therapeutic potential as vaccines and drugs. Knowledge of physicochemical descriptors, including the partition coefficient logP, is useful for the development of predictive Quantitative Structure-Activity Relationships (QSARs). We have investigated the accuracy of available programs for the prediction of logP values for peptides with known experimental values obtained from the literature. Eight prediction programs were tested, of which seven were fragment-based methods: XLogP, LogKow, PLogP, ACDLogP, ALogP, Interactive Analysis's LogP (IALogP) and MLogP; one program, QikProp, used a whole-molecule approach. The predictive accuracy of the programs was assessed using r² values, with ALogP being the most effective (r² = 0.822) and MLogP the least (r² = 0.090). We also examined three distinct types of peptide structure: blocked, unblocked, and cyclic. For each study (all peptides, blocked, unblocked and cyclic peptides) the performance of the programs, rated from best to worst, is as follows: all peptides - ALogP, QikProp, PLogP, XLogP, IALogP, LogKow, ACDLogP, and MLogP; blocked peptides - PLogP, XLogP, ACDLogP, IALogP, LogKow, QikProp, ALogP, and MLogP; unblocked peptides - QikProp, IALogP, ALogP, ACDLogP, MLogP, XLogP, LogKow and PLogP; cyclic peptides - LogKow, ALogP, XLogP, MLogP, QikProp, ACDLogP, IALogP. In summary, all programs gave better predictions for blocked peptides, while, in general, logP values for cyclic peptides were under-predicted and those of unblocked peptides were over-predicted. PMID:17597897
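
    For readers who want to reproduce this kind of benchmark on their own data, the sketch below computes the squared Pearson correlation between predicted and experimental logP values; the numbers are made up for illustration and are not from the study.

      # Squared Pearson correlation (r^2) between predicted and experimental logP
      # values (made-up numbers for illustration).
      import numpy as np

      experimental = np.array([-2.1, -0.5, 0.3, 1.2, 2.4, 3.1])
      predicted = np.array([-1.8, -0.9, 0.1, 1.5, 2.0, 3.4])

      r = np.corrcoef(experimental, predicted)[0, 1]
      print(f"r^2 = {r**2:.3f}")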

  10. Risk Assessment Using the Three Dimensions of Probability (Likelihood), Severity, and Level of Control

    NASA Technical Reports Server (NTRS)

    Watson, Clifford C.

    2011-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training.) 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting gives a visual confirmation of the relationship between causes and their controls.
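
    A minimal data-structure sketch of the three-axis idea is given below; the numeric scales and the composite ordering rule are hypothetical choices for illustration, not part of the NASA method described in the abstract.

      # Hypothetical encoding of the three risk dimensions described above.
      from dataclasses import dataclass

      @dataclass
      class Risk:
          hazard: str
          likelihood: int        # 1 (improbable) .. 5 (frequent)  -- assumed scale
          severity: int          # 1 (negligible) .. 5 (catastrophic)  -- assumed scale
          level_of_control: int  # 1 (eliminated by design) .. 6 (PPE only), per the list above

          def score(self) -> int:
              # Hypothetical composite: less-well-controlled hazards rank higher.
              return self.likelihood * self.severity * self.level_of_control

      risks = [
          Risk("pressure vessel rupture", likelihood=2, severity=5, level_of_control=3),
          Risk("slip on wet floor", likelihood=4, severity=2, level_of_control=5),
      ]
      for r in sorted(risks, key=Risk.score, reverse=True):
          print(r.hazard, r.score())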

  11. Risk Assessment Using the Three Dimensions of Probability (Likelihood), Severity, and Level of Control

    NASA Technical Reports Server (NTRS)

    Watson, Clifford

    2010-01-01

    Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training). 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.

  12. Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes

    NASA Astrophysics Data System (ADS)

    Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen

    2016-06-01

    Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique underlying LBS is map building for unknown environments, also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV), with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we propose to match the current scan with the occupancy likelihood map learned from all previous scans at multiple scales, to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, scan distortion unavoidably exists to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. In addition, to reduce as much as possible the effect of dynamic objects, such as walking pedestrians, that often exist in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
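
    The map-update step the authors describe, raising or lowering each cell's occupancy probability after a scan match, is commonly implemented in log-odds form; the sketch below is a generic version of that idea with assumed increment and clamping values, not the paper's exact learning strategy.

      # Generic log-odds occupancy grid update (assumed increments; not the paper's
      # exact learning strategy).
      import numpy as np

      L_OCC, L_FREE = 0.85, -0.4          # log-odds increments (assumptions)
      L_MIN, L_MAX = -4.0, 4.0            # clamping keeps cells from saturating

      def update_grid(log_odds, hit_cells, free_cells):
          """Raise the log-odds of cells a beam ended in, lower those it passed through."""
          for i, j in hit_cells:
              log_odds[i, j] = min(log_odds[i, j] + L_OCC, L_MAX)
          for i, j in free_cells:
              log_odds[i, j] = max(log_odds[i, j] + L_FREE, L_MIN)
          return log_odds

      def occupancy_probability(log_odds):
          return 1.0 - 1.0 / (1.0 + np.exp(log_odds))

      grid = np.zeros((100, 100))          # 0 log-odds corresponds to probability 0.5
      grid = update_grid(grid, hit_cells=[(50, 60)], free_cells=[(50, k) for k in range(60)])
      print(occupancy_probability(grid)[50, 58:61])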

  13. Contribution of logging data to sedimentology and stratigraphy. [Electrofacies

    SciTech Connect

    Serra, O.; Abbott, H.T.

    1982-02-01

    A technique has been developed using multivariate analysis of logging data to give a high-resolution sedimentological description of any sequence of formations. The number of different logs and their range allow determination of many of the physical characteristics of the rock. The definition of a lithofacies has been extended by introducing the concept of an electrofacies, constructed on the basis of all the logging data at any depth interval. Each logging datum is considered a descriptor for purposes of establishing electrofacies in a logged interval. Once established, electrofacies then can be correlated with actual geologic facies, if the logged interval has been cored. 23 refs.

  14. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  15. Progress in rapid compaction of coal logs for freight pipelines

    SciTech Connect

    Gunnink, B.; Li, Wei

    1998-04-01

    The Capsule Pipeline Research Center (CPRC) at the University of Missouri-Columbia is devoted to performing research in capsule pipelines. Since its inception in 1991, the CPRC has focused on research related to the development and rapid commercialization of coal log pipeline technology. Coal log pipelines are freight pipelines that will transport compacted coal through a water-filled pipeline. To fully develop this technology and make it ready for commercial use, it is necessary to investigate means for fabricating coal logs. This paper describes research progress on the rapid compaction of coal logs for coal log pipeline transport. Economic studies conducted by the CPRC indicate that the cost of producing coal logs, and thus the economic competitiveness of coal log pipelines, is directly related to the compaction necessary to make the coal logs. Previous research has demonstrated the ability to make laboratory-scale coal logs with 5-second compaction times. It was also observed that for rapidly compacted coal logs, coal log circulation performance (resistance to abrasion in a commercial pipeline) is maximized if the logs are compacted from a coal mixture at an optimal moisture content. For the bituminous Mettiki coal that has been studied, this optimal moisture content is about 9% if the compaction time is 5 seconds. The practical application of this is that if stockpiled coal is wetter than optimum, it would require drying of the coal to compact it at the optimum moisture content. Obviously, this would affect the cost of coal log fabrication.

  16. Thresholds of logging intensity to maintain tropical forest biodiversity.

    PubMed

    Burivalova, Zuzana; Sekercioğlu, Cağan Hakkı; Koh, Lian Pin

    2014-08-18

    Primary tropical forests are lost at an alarming rate, and much of the remaining forest is being degraded by selective logging. Yet, the impacts of logging on biodiversity remain poorly understood, in part due to the seemingly conflicting findings of case studies: about as many studies have reported increases in biodiversity after selective logging as have reported decreases. Consequently, meta-analytical studies that treat selective logging as a uniform land use tend to conclude that logging has negligible effects on biodiversity. However, selectively logged forests might not all be the same. Through a pantropical meta-analysis and using an information-theoretic approach, we compared and tested alternative hypotheses for key predictors of the richness of tropical forest fauna in logged forest. We found that the species richness of invertebrates, amphibians, and mammals decreases as logging intensity increases and that this effect varies with taxonomic group and continental location. In particular, mammals and amphibians would suffer a halving of species richness at logging intensities of 38 m³ ha⁻¹ and 63 m³ ha⁻¹, respectively. Birds exhibit an opposing trend as their total species richness increases with logging intensity. An analysis of forest bird species, however, suggests that this pattern is largely due to an influx of habitat generalists into heavily logged areas while forest specialist species decline. Our study provides a quantitative analysis of the nuanced responses of species along a gradient of logging intensity, which could help inform evidence-based sustainable logging practices from the perspective of biodiversity conservation.

  17. Field Demonstration of Slim-hole Borehole Nuclear Magnetic Resonance (NMR) Logging Tool for Groundwater Investigations

    NASA Astrophysics Data System (ADS)

    Walsh, D.; Turner, P.; Frid, I.; Shelby, R.; Grunewald, E. D.; Magnuson, E.; Butler, J. J.; Johnson, C. D.; Cannia, J. C.; Woodward, D. A.; Williams, K. H.; Lane, J. W.

    2010-12-01

    Nuclear magnetic resonance (NMR) methods provide estimates of free and bound water content and hydraulic conductivity, which are critically important for groundwater investigations. Borehole NMR tools have been available and widely used in the oil industry for decades, but only recently have been designed for small diameter boreholes typical of groundwater investigations. Field tests of an 89-mm-diameter borehole NMR logging tool are presented. This borehole NMR logging tool was developed for economical NMR logging of 100- to 200-mm-diameter boreholes, and specifically for characterizing hydraulic properties in the top 200 m of the subsurface. The tool has a vertical resolution of 0.5 m, a minimum echo spacing of 2.0 ms, and a radial depth of investigation of 178 to 203 mm, which typically is beyond the annulus of observation wells. It takes about 15 minutes to collect a data sample for each 0.5-m interval. The borehole NMR logging tool was field tested during spring 2010, in PVC-cased wells at sites in East Haddam and Storrs, Connecticut; Cape Cod, Massachusetts; Lexington, Nebraska; Lawrence, Kansas; and Rifle, Colorado. NMR logging yielded estimates of bound water, free water, and total-water content, as well as continuous distributions of water content versus transverse relaxation time (T2) at all depth levels. The derived water-content data were compared to the available ground-truth hydrogeologic data from each well, including drilling logs, neutron and other geophysical logs, and direct measurements of hydraulic conductivity. The results indicate that the borehole NMR logging tool provides information on porosity, pore-size distribution, and estimated hydraulic conductivity that cannot be duplicated by any other single geophysical logging tool.

  18. Likelihood of burrow flow in Canadian agricultural lands

    NASA Astrophysics Data System (ADS)

    Dadfar, Humaira; Allaire, Suzanne E.; van Bochove, Eric; Denault, Jean-Thomas; Thériault, Georges; Charles, Anaïs

    2010-05-01

    Indicators of risk of water contamination (IROWCs) by agricultural contaminants are developed to assess sustainability of agriculture. Burrow flow (BF) is part of the transport hydrology algorithm used in IROWCs since it is a key pathway for sub-surface contaminant transport. The objectives of this study were to develop a methodology for predicting the likelihood of BF occurrence in agricultural soils across Canada at the landscape scale, and to determine its variation over a 25-year period (1981-2006). The BF algorithm considers the influence of climate, soil properties, and soil management on the likely frequency of BF and distribution of burrows (B) made by Lumbricus terrestris L. Nova Scotia, Prince Edward Island, Ontario, Quebec, followed by New Brunswick, had the highest likelihood of BF due to favourable humidity, sufficient heat, medium-textured soils, and strong runoff during the growing season and spring thaw. Alberta and Saskatchewan are too dry to favour BF. Areas with high risk of BF fall within locations of high potential for lateral flow due to shallow soils, or to the presence of tile drainage, which may connect BF pathways to important water bodies such as the Great Lakes and the St-Lawrence River. Sensitivity analyses on threshold values used in the BF algorithm indicated that Manitoba is the most sensitive province to changes in precipitation, Quebec to temperature, Prince Edward Island to soil depth, and Ontario to manure application. The BF algorithm can be used as a simple tool to predict the likelihood of water and contaminant transport along earthworm burrows with data available across Canada. It will be upgraded with new data (e.g. climate change) and with an improved algorithm after statistical analyses and correlations with actual water quality data.

  19. MAXIMUM LIKELIHOOD ESTIMATION FOR PERIODIC AUTOREGRESSIVE MOVING AVERAGE MODELS.

    USGS Publications Warehouse

    Vecchia, A.V.

    1985-01-01

    A useful class of models for seasonal time series that cannot be filtered or standardized to achieve second-order stationarity is that of periodic autoregressive moving average (PARMA) models, which are extensions of ARMA models that allow periodic (seasonal) parameters. An approximation to the exact likelihood for Gaussian PARMA processes is developed, and a straightforward algorithm for its maximization is presented. The algorithm is tested on several periodic ARMA(1, 1) models through simulation studies and is compared to moment estimation via the seasonal Yule-Walker equations. Applicability of the technique is demonstrated through an analysis of a seasonal stream-flow series from the Rio Caroni River in Venezuela.
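
    A stripped-down version of the idea, maximum likelihood for an autoregression whose coefficient and noise variance vary with the season, can be sketched as below. This is a periodic AR(1) fitted by conditional maximum likelihood with a generic optimizer; it is only a stand-in for the full PARMA likelihood approximation and algorithm developed in the paper, and the simulated parameters are assumptions.

      # Conditional maximum likelihood for a periodic AR(1):
      #   x_t = phi[s(t)] * x_{t-1} + e_t,  e_t ~ N(0, sigma2[s(t)]),  s(t) = t mod S
      # (a simplified stand-in for the full PARMA likelihood).
      import numpy as np
      from scipy.optimize import minimize

      def neg_loglik(params, x, S):
          phi = params[:S]
          sigma2 = np.exp(params[S:])             # log-variance parameterization keeps sigma2 > 0
          t = np.arange(1, len(x))
          s = t % S
          resid = x[1:] - phi[s] * x[:-1]
          return 0.5 * np.sum(np.log(2 * np.pi * sigma2[s]) + resid**2 / sigma2[s])

      # Simulate a small example with S = 4 seasons (assumed true parameters).
      rng = np.random.default_rng(1)
      S, n = 4, 2000
      phi_true = np.array([0.6, -0.3, 0.8, 0.1])
      sig_true = np.array([1.0, 0.5, 2.0, 1.5])
      x = np.zeros(n)
      for t in range(1, n):
          s = t % S
          x[t] = phi_true[s] * x[t - 1] + rng.normal(scale=np.sqrt(sig_true[s]))

      fit = minimize(neg_loglik, np.zeros(2 * S), args=(x, S), method="L-BFGS-B")
      print("phi estimates:   ", np.round(fit.x[:S], 2))
      print("sigma2 estimates:", np.round(np.exp(fit.x[S:]), 2))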

  20. STECKMAP: STEllar Content and Kinematics via Maximum A Posteriori likelihood

    NASA Astrophysics Data System (ADS)

    Ocvirk, P.

    2011-08-01

    STECKMAP stands for STEllar Content and Kinematics via Maximum A Posteriori likelihood. It is a tool for interpreting galaxy spectra in terms of their stellar populations through the derivation of their star formation history, age-metallicity relation, kinematics and extinction. The observed spectrum is projected onto a temporal sequence of models of single stellar populations, so as to determine a linear combination of these models that best fits the observed spectrum. The weights of the various components of this linear combination indicate the stellar content of the population. This procedure is regularized using various penalizing functions. The principles of the method are detailed in Ocvirk et al. 2006.
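
    The core projection step, finding non-negative weights of single-stellar-population templates that best reproduce an observed spectrum, can be sketched with a non-negative least-squares solve, as below. The real code additionally handles kinematics, extinction, and the penalizing (MAP) terms, so this is only a toy version with made-up templates and weights.

      # Toy version of the projection step: non-negative weights of SSP templates
      # that best fit an observed spectrum (made-up templates; no regularization,
      # kinematics, or extinction as in the real tool).
      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(2)
      n_wave, n_ages = 500, 6
      templates = np.abs(rng.normal(size=(n_wave, n_ages)))    # columns: SSP spectra (hypothetical)
      true_weights = np.array([0.0, 0.4, 0.0, 0.1, 0.5, 0.0])  # assumed star formation history
      observed = templates @ true_weights + 0.01 * rng.normal(size=n_wave)

      weights, residual_norm = nnls(templates, observed)
      print("recovered weights:", np.round(weights, 2))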

  1. Efficient maximum likelihood parameterization of continuous-time Markov processes

    PubMed Central

    McGibbon, Robert T.; Pande, Vijay S.

    2015-01-01

    Continuous-time Markov processes over finite state-spaces are widely used to model dynamical processes in many fields of natural and social science. Here, we introduce a maximum likelihood estimator for constructing such models from data observed at a finite time interval. This estimator is dramatically more efficient than prior approaches, enables the calculation of deterministic confidence intervals in all model parameters, and can easily enforce important physical constraints on the models such as detailed balance. We demonstrate and discuss the advantages of these models over existing discrete-time Markov models for the analysis of molecular dynamics simulations. PMID:26203016
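
    A bare-bones version of the estimation problem, choosing a rate matrix K so that expm(K*tau) best explains transition counts observed at lag tau, is sketched below. The parameterization and generic optimizer here are my own illustrative choices and do not enforce detailed balance; this is not the efficient estimator introduced in the paper, and the count matrix is hypothetical.

      # Generic maximum likelihood fit of a continuous-time Markov rate matrix K from
      # transition counts C at lag tau: maximize sum_ij C_ij * log( expm(K*tau)_ij ).
      import numpy as np
      from scipy.linalg import expm
      from scipy.optimize import minimize

      def build_rate_matrix(theta, n):
          """Off-diagonal rates exp(theta) >= 0; diagonal set so rows sum to zero."""
          K = np.zeros((n, n))
          K[~np.eye(n, dtype=bool)] = np.exp(theta)
          np.fill_diagonal(K, -K.sum(axis=1))
          return K

      def neg_loglik(theta, counts, tau):
          n = counts.shape[0]
          T = expm(build_rate_matrix(theta, n) * tau)
          return -np.sum(counts * np.log(np.clip(T, 1e-300, None)))

      # Hypothetical observed transition counts between 3 states at lag tau = 0.1.
      counts = np.array([[900.0, 80.0, 20.0],
                         [60.0, 850.0, 90.0],
                         [30.0, 70.0, 900.0]])
      tau, n = 0.1, 3

      fit = minimize(neg_loglik, np.zeros(n * (n - 1)), args=(counts, tau), method="L-BFGS-B")
      print(np.round(build_rate_matrix(fit.x, n), 2))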

  2. Impact of logging on aboveground biomass stocks in lowland rain forest, Papua New Guinea.

    PubMed

    Bryan, Jane; Shearman, Phil; Ash, Julian; Kirkpatrick, J B

    2010-12-01

    Greenhouse-gas emissions resulting from logging are poorly quantified across the tropics. There is a need for robust measurement of rain forest biomass and the impacts of logging from which carbon losses can be reliably estimated at regional and global scales. We used a modified Bitterlich plotless technique to measure aboveground live biomass at six unlogged and six logged rain forest areas (coupes) across two approximately 3000-ha regions at the Makapa concession in lowland Papua New Guinea. "Reduced-impact logging" is practiced at Makapa. We found the mean unlogged aboveground biomass in the two regions to be 192.96 +/- 4.44 Mg/ha and 252.92 +/- 7.00 Mg/ha (mean +/- SE), which was reduced by logging to 146.92 +/- 4.58 Mg/ha and 158.84 +/- 4.16, respectively. Killed biomass was not a fixed proportion, but varied with unlogged biomass, with 24% killed in the lower-biomass region, and 37% in the higher-biomass region. Across the two regions logging resulted in a mean aboveground carbon loss of 35 +/- 2.8 Mg/ha. The plotless technique proved efficient at estimating mean aboveground biomass and logging damage. We conclude that substantial bias is likely to occur within biomass estimates derived from single unreplicated plots. PMID:21265444

  3. Assessment of seal quality for potential storage sites in the Norwegian North Sea from well log data

    NASA Astrophysics Data System (ADS)

    Angeli, M.; Gabrielsen, R. H.; Faleide, J. I.

    2012-04-01

    Fluid migration through caprocks is a crucial process when it comes to evaluating their sealing capacity for underground CO2 storage. Migration mechanisms such as flow through fault systems or along wells are quite easily identified because of their relatively large size and because these features can be monitored using reflection seismic data or well logs. However, microcracks in rocks, which can allegedly cause fluid migration through tight rocks, are difficult to detect from large scale observations and can only be deduced from thorough investigation. The objective of this work is to evaluate the likelihood of microfracture networks in potential seals (shales) through the analysis of well log data. This study focuses on the Upper Jurassic and Lower Cretaceous shale succession in the Norwegian North Sea. The main target of the study is the Draupne Formation (Upper Jurassic) found in the Horda Platform / Viking Graben area. It was deposited syn-rift during the second episode of Viking Graben formation in the Upper Jurassic, and thus has a burial depth ranging from 914 to 4573 m. This formation is identified in well logs by its sharp decrease in ultrasonic velocity and density, and by characteristically high resistivity and gamma ray readings. Other studied shale formations include the rest of the Viking Group (Heather Formation), the Tyne Group in the Central Graben (Farsund, Haugesund and Mandal) and the Boknfjord Group in the Norwegian-Danish Basin (Egersund, Flekkefjord, Sauda and Tau). Public well log data from 104 boreholes in the Norwegian sector of the North Sea have been analyzed, and among them 87 had the complete set of logs necessary for our analysis: ultrasonic velocities, gamma ray, density and resistivity. This study illustrates that the first-order variation of the ultrasonic velocity for the Draupne Formation in the Norwegian North Sea is, as expected, due to depth. Diagenesis, whether mechanical or chemical, stiffens the rock by strengthening the

  4. Integrated analysis of well logs and seismic data to estimate gas hydrate concentrations at Keathley Canyon, Gulf of Mexico

    USGS Publications Warehouse

    Lee, M.W.; Collett, T.S.

    2008-01-01

    Accurately detecting and quantifying gas hydrate or free gas in sediments from seismic data require downhole well-log data to calibrate the physical properties of the gas hydrate-/free gas-bearing sediments. As part of the Gulf of Mexico Joint Industry Program, a series of wells were either cored or drilled in the Gulf of Mexico to characterize the physical properties of gas hydrate-bearing sediments, to calibrate geophysical estimates, and to evaluate source and transport mechanisms for gas within the gas hydrates. Downhole acoustic logs were used sparingly in this study because of degraded log quality due to adverse wellbore conditions. However, reliable logging while drilling (LWD) electrical resistivity and porosity logs were obtained. To tie the well-log information to the available 3-D seismic data in this area, a velocity log was calculated from the available resistivity log at the Keathley Canyon 151-2 well, because the acoustic log or vertical seismic data acquired at the nearby Keathley Canyon 151-3 well were either of poor quality or had limited depth coverage. Based on the gas hydrate saturations estimated from the LWD resistivity log, the modified Biot-Gassmann theory was used to generate a synthetic acoustic log, and a synthetic seismogram was generated that shows fairly good agreement with a seismic profile crossing the well site. Based on the well-log information, a faintly defined bottom-simulating reflection (BSR) in this area is interpreted as a reflection representing gas hydrate-bearing sediments with about 15% saturation overlying partially gas-saturated sediments with 3% saturation. Gas hydrate saturations in excess of 30-40% are estimated from the resistivity log in two distinct intervals at 220-230 and 264-300 m below the sea floor, but gas hydrate was not physically recovered in cores. It is speculated that the poor recovery of cores and gas hydrate morphology are responsible for the lack of physical gas hydrate recovery.
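
    To give a sense of how saturations can be pulled from a resistivity log, the sketch below applies an Archie-type relation, S_h = 1 - (a*Rw / (phi^m * Rt))^(1/n), which is a common choice for this task but not necessarily the specific workflow used at Keathley Canyon. The Archie constants and the log values are illustrative assumptions.

      # Archie-type gas hydrate saturation estimate from a resistivity log
      # (illustrative constants and log values; not the Keathley Canyon workflow).
      import numpy as np

      def hydrate_saturation(rt, porosity, rw, a=1.0, m=2.0, n=2.0):
          """S_h = 1 - S_w, with water saturation S_w from Archie's equation."""
          sw = (a * rw / (porosity**m * rt)) ** (1.0 / n)
          return 1.0 - np.clip(sw, 0.0, 1.0)

      rt = np.array([1.5, 3.0, 8.0, 2.0])            # ohm-m, hypothetical deep resistivity
      porosity = np.array([0.45, 0.42, 0.40, 0.44])  # fractional porosity (hypothetical)
      rw = 0.35                                      # ohm-m, assumed formation water resistivity

      print(np.round(hydrate_saturation(rt, porosity, rw), 2))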

  5. Conceptual and direct replications fail to support the stake-likelihood hypothesis as an explanation for the interdependence of utility and likelihood judgments.

    PubMed

    de Molière, Laura; Harris, Adam J L

    2016-04-01

    Previous research suggests that people systematically overestimate the occurrence of both positive and negative events, compared with neutral future events, and that these biases are due to a misattribution of arousal elicited by utility (stake-likelihood hypothesis; SLH; Vosgerau, 2010). However, extant research has provided only indirect support for these arousal misattribution processes. In the present research, we initially aimed to provide a direct test of the SLH by measuring arousal with galvanic skin responses to examine the mediating role of arousal. We observed no evidence that measured arousal mediated the impact of utility on probability estimates. Given the lack of direct support for the SLH in Experiment 1, Experiments 2-5 aimed to assess the SLH by replicating some of the original findings that provided support for arousal misattribution as a mechanism. Despite our best efforts to create experimental conditions under which we would be able to demonstrate the stake-likelihood effect, we were unable to replicate previous results, with a Bayesian meta-analysis demonstrating support for the null hypothesis. We propose that accounts based on imaginability and loss function asymmetry are currently better candidate explanations for the influence of outcome utility on probability estimates.

  6. The galaxy as the origin of gamma-ray bursts. II - The effect of an intrinsic burst luminosity distribution on log N(>S) versus log S

    NASA Technical Reports Server (NTRS)

    Jennings, M. C.

    1982-01-01

    The effect of intrinsic burst luminosity distributions on log N-log S has been investigated for halo and disk galactic sources, using a power-law burst luminosity distribution characterized by a shape exponent and a luminosity range. Properly chosen burst luminosity distributions are shown to produce log N-log S relations sufficiently different from those obtained with monoluminosity bursts to yield halo and thick-disk geometries that are compatible with the observations. The soft shape of luminosity distributions capable of altering log N-log S has observable consequences such as (1) expected gamma-ray burst repetitions, which first become apparent at lower fluences, and (2) the occurrence of most phenomena at luminosities on the order of 10³⁹ ergs, and correspondingly lower fluences.
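
    The machinery can be illustrated with a deliberately simplified toy model: a source of luminosity L is detectable above fluence S out to a distance sqrt(L / 4*pi*S), and if the population is confined to a bounded spherical volume, N(>S) is the luminosity-function-weighted detectable volume. The geometry, parameter values, and truncated power-law below are assumptions of this sketch, not the halo and disk models of the paper, whose whole point is that realistic geometries modify the simple Euclidean result.

      # Toy log N(> S) for sources with a power-law luminosity function inside a
      # bounded spherical volume (illustrative geometry and parameters).
      import numpy as np

      def N_above_S(S_grid, alpha, L_min, L_max, R_max, n_L=400):
          """N(>S) for density n(L) ~ L**(-alpha), L in [L_min, L_max], uniform within radius R_max."""
          L = np.logspace(np.log10(L_min), np.log10(L_max), n_L)
          dL = np.gradient(L)
          nL = L**(-alpha)
          nL /= np.sum(nL * dL)                          # normalize the luminosity function
          N = np.empty_like(S_grid)
          for i, S in enumerate(S_grid):
              r_det = np.sqrt(L / (4.0 * np.pi * S))     # distance out to which a source of L exceeds S
              vol = (4.0 / 3.0) * np.pi * np.minimum(r_det, R_max)**3
              N[i] = np.sum(nL * vol * dL)
          return N

      S = np.logspace(-8, -4, 50)                        # fluence grid (arbitrary units)
      for alpha in (1.5, 2.5):                           # shape exponent of the luminosity law
          N = N_above_S(S, alpha, L_min=1e37, L_max=1e40, R_max=1e23)
          slope = np.gradient(np.log10(N), np.log10(S))
          print(f"alpha={alpha}: local logN-logS slope ranges {slope.min():.2f} to {slope.max():.2f}")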

  7. Logging technology for high-temperature geothermal boreholes

    SciTech Connect

    Dennis, B.R.

    1984-05-01

    Research in materials, equipment, and instrument development was required in the Hot Dry Rock Energy Extraction Demonstration at Fenton Hill located in northern New Mexico. Results of this extensive development advanced the logging technology in geothermal boreholes to the present state of the art. The new Phase II Energy Extraction System at the Fenton Hill Test Site will consist of two wellbores drilled to a depth of about 4570 m (15,000 ft) and then connected by a series of hydraulic-induced fractures. The first borehole (EE-2) was completed in May of 1980 at a depth of 4633 m (15,200 ft), of which approximately 3960 m (13,000 ft) is in Precambrian granitic rock. Starting at a depth of approximately 2930 m (9600 ft), the borehole was inclined up to 35° from vertical. Bottom-hole temperature in EE-2 is 320°C. The EE-3 borehole was then drilled to a depth of 4236 m (13,900 ft). Its inclined part is positioned directly over the EE-2 wellbore with a vertical separation of about 450 m (1500 ft) between them. Many of the geophysical measurements needed to develop the hot dry rock concept are unique. Most of the routine instruments used in petroleum drilling fail in the hot and abrasive environment. New equipment developed includes not only the downhole sonde that houses the transducer and associated line driving electronics, but modifications also were needed on the entire data retrieval systems and associated data analysis technology. Successful performance of wellbore surveys in the EE-2 and EE-3 boreholes depended upon the capacity of the sensors, instrument sonde, cablehead, and armored logging cable to work in this severe environment. The major areas of materials development for surveying the boreholes in the high-temperature environment were on elastomeric seals, electrical insulation for logging cables, downhole sensors, and associated downhole electronic and electro-mechanical components.

  8. 4. Exterior, detail south elevation, showing jointing of logs on ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Exterior, detail south elevation, showing jointing of logs on later extension. Sept. 12, 1940. Mixon. - Upper Swedish Log Cabin, Darby Creek vicinity, Clifton Heights (Upper Darby Township), Darby, Delaware County, PA

  9. 8. William E. Barrett, Photographer, August 1975. LOG DOCK AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. William E. Barrett, Photographer, August 1975. LOG DOCK AND PARTIALLY DEMOLISHED JACKSLIP USED FOR HAULING LOGS UP TO SAWMILL. - Meadow River Lumber Company, Highway 60, Rainelle, Greenbrier County, WV

  10. Well log evaluation of natural gas hydrates

    SciTech Connect

    Collett, T.S.

    1992-10-01

    Gas hydrates are crystalline substances composed of water and gas, in which a solid water lattice accommodates gas molecules in a cage-like structure. Gas hydrates are globally widespread in permafrost regions and beneath the sea in sediment of outer continental margins. While methane, propane, and other gases can be included in the clathrate structure, methane hydrates appear to be the most common in nature. The amount of methane sequestered in gas hydrates is probably enormous, but estimates are speculative and range over three orders of magnitude from about 100,000 to 270,000,000 trillion cubic feet. The amount of gas in the hydrate reservoirs of the world greatly exceeds the volume of known conventional gas reserves. Gas hydrates also represent a significant drilling and production hazard. A fundamental question linking gas hydrate resource and hazard issues is: What is the volume of gas hydrates and included gas within a given gas hydrate occurrence? Most published gas hydrate resource estimates have, of necessity, been made by broad extrapolation of only general knowledge of local geologic conditions. Gas volumes that may be attributed to gas hydrates are dependent on a number of reservoir parameters, including the areal extent of the gas-hydrate occurrence, reservoir thickness, hydrate number, reservoir porosity, and the degree of gas-hydrate saturation. Two of the most difficult reservoir parameters to determine are porosity and degree of gas hydrate saturation. Well logs often serve as a source of porosity and hydrocarbon saturation data; however, well-log calculations within gas-hydrate-bearing intervals are subject to error. The primary reason for this difficulty is the lack of quantitative laboratory and field studies. The primary purpose of this paper is to review the response of well logs to the presence of gas hydrates.

  11. Well log evaluation of natural gas hydrates

    SciTech Connect

    Collett, T.S.

    1992-10-01

    Gas hydrates are crystalline substances composed of water and gas, in which a solid water lattice accommodates gas molecules in a cage-like structure. Gas hydrates are globally widespread in permafrost regions and beneath the sea in sediment of outer continental margins. While methane, propane, and other gases can be included in the clathrate structure, methane hydrates appear to be the most common in nature. The amount of methane sequestered in gas hydrates is probably enormous, but estimates are speculative and range over three orders of magnitude from about 100,000 to 270,000,000 trillion cubic feet. The amount of gas in the hydrate reservoirs of the world greatly exceeds the volume of known conventional gas reserves. Gas hydrates also represent a significant drilling and production hazard. A fundamental question linking gas hydrate resource and hazard issues is: What is the volume of gas hydrates and included gas within a given gas hydrate occurrence? Most published gas hydrate resource estimates have, of necessity, been made by broad extrapolation of only general knowledge of local geologic conditions. Gas volumes that may be attributed to gas hydrates are dependent on a number of reservoir parameters, including the areal extent of the gas-hydrate occurrence, reservoir thickness, hydrate number, reservoir porosity, and the degree of gas-hydrate saturation. Two of the most difficult reservoir parameters to determine are porosity and degree of gas hydrate saturation. Well logs often serve as a source of porosity and hydrocarbon saturation data; however, well-log calculations within gas-hydrate-bearing intervals are subject to error. The primary reason for this difficulty is the lack of quantitative laboratory and field studies. The primary purpose of this paper is to review the response of well logs to the presence of gas hydrates.

  12. Apparatus for focused electrode induced polarization logging

    SciTech Connect

    Vinegar, H.J.; Waxman, M.H.

    1986-04-15

    An induced polarization logging tool is described for measuring parameters of a formation surrounding a borehole. The logging tool consists of: a non-conductive logging sonde; a plurality of electrodes disposed on the sonde, the electrodes including at least a survey current electrode and guard electrodes disposed on opposite sides of the survey current electrode, a non-polarizing voltage measuring electrode, a non-polarizing voltage reference electrode and a current return electrode, both the voltage reference and current return electrodes being located a greater distance from the survey current electrode than the guard electrodes; means connected to the survey current electrode and the guard electrodes for generating a signal representative of the potential difference in the formation between the survey current electrode and the guard electrodes; first control means directly coupled to the survey current electrode, the first control means controlling the current flow to the survey current electrode in response to the potential difference signal; a second control means directly coupled to the guard electrodes to control the current flow to the guard electrodes in response to the potential difference signal; a source of alternating current located at the surface, one end of the source being coupled to the two control means and the other to the current return electrode, the source supplying alternating current at various discrete frequencies between substantially 0.01 and 100 Hz; measurement means directly coupled to the voltage measurement and survey current electrodes to measure the amplitude and phase of the voltage induced in the formation and the amplitude and phase of the current flow to the survey electrode; and transmission means for transmitting the measurements to the surface.

  13. Log amplifier with pole-zero compensation

    DOEpatents

    Brookshier, William

    1987-01-01

    A logarithmic amplifier circuit provides pole-zero compensation for improved stability and response time over 6-8 decades of input signal frequency. The amplifier circuit includes a first operational amplifier with a first feedback loop which includes a second, inverting operational amplifier in a second feedback loop. The compensated output signal is provided by the second operational amplifier with the log elements, i.e., resistors, and the compensating capacitors in each of the feedback loops having equal values so that each break point or pole is offset by a compensating break point or zero.

  14. Calibration Tests of a German Log Rodmeter

    NASA Technical Reports Server (NTRS)

    Mottard, Elmo J.; Stillman, Everette R.

    1949-01-01

    A German log rodmeter of the pitot static type was calibrated in Langley tank no. 1 at speeds up to 34 knots and angles of yaw from 0 deg to plus or minus 10 3/4 degrees. The dynamic head approximated the theoretical head at 0 degrees yaw but decreased as the yaw was increased. The static head was negative and in general became more negative with increasing speed and yaw. Cavitation occurred at speeds above 31 knots at 0 deg yaw and 21 knots at 10 3/4 deg yaw.

  15. Log amplifier with pole-zero compensation

    DOEpatents

    Brookshier, W.

    1985-02-08

    A logarithmic amplifier circuit provides pole-zero compensation for improved stability and response time over 6-8 decades of input signal frequency. The amplifier circuit includes a first operational amplifier with a first feedback loop which includes a second, inverting operational amplifier in a second feedback loop. The compensated output signal is provided by the second operational amplifier with the log elements, i.e., resistors, and the compensating capacitors in each of the feedback loops having equal values so that each break point or pole is offset by a compensating break point or zero.

  16. Identifying related journals through log analysis

    PubMed Central

    Lu, Zhiyong; Xie, Natalie; Wilbur, W. John

    2009-01-01

    Motivation: With the explosion of biomedical literature and the evolution of online and open access, scientists are reading more articles from a wider variety of journals. Thus, the list of core journals relevant to their research may be less obvious and may often change over time. To help researchers quickly identify appropriate journals to read and publish in, we developed a web application for finding related journals based on the analysis of PubMed log data. Availability: http://www.ncbi.nlm.nih.gov/IRET/Journals Contact: luzh@ncbi.nlm.nih.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19734155

  17. VAFLE: visual analytics of firewall log events

    NASA Astrophysics Data System (ADS)

    Ghoniem, Mohammad; Shurkhovetskyy, Georgiy; Bahey, Ahmed; Otjacques, Benoît.

    2013-12-01

    In this work, we present VAFLE, an interactive network security visualization prototype for the analysis of firewall log events. Keeping it simple yet effective for analysts, we provide multiple coordinated interactive visualizations augmented with clustering capabilities customized to support anomaly detection and cyber situation awareness. We evaluate the usefulness of the prototype in a use case with network traffic datasets from previous VAST Challenges, illustrating its effectiveness at promoting fast and well-informed decisions. We explain how a security analyst may spot suspicious traffic using VAFLE. We further assess its usefulness through a qualitative evaluation involving network security experts, whose feedback is reported and discussed.

  18. Nonlinear filters with log-homotopy

    NASA Astrophysics Data System (ADS)

    Daum, Fred; Huang, Jim

    2007-09-01

    We derive and test a new nonlinear filter that implements Bayes' rule using an ODE rather than with a pointwise multiplication of two functions. This avoids one of the fundamental and well known problems in particle filters, namely "particle collapse" as a result of Bayes' rule. We use a log-homotopy to construct this ODE. Our new algorithm is vastly superior to the classic particle filter, and we do not use any proposal density supplied by an EKF or UKF or other outside source. This paper was written for normal engineers, who do not have homotopy for breakfast.
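
    In symbols (a paraphrase of the general homotopy idea, not a transcription of the paper's equations), with λ the homotopy parameter, g the prior density, h the likelihood, and K(λ) a normalizing constant:

      \log p(x,\lambda) = \log g(x) + \lambda \, \log h(x) - \log K(\lambda), \qquad \lambda: 0 \to 1 .

    At λ = 0 this is the prior and at λ = 1 it is the Bayes posterior; instead of reweighting particles by h(x) in one step, each particle is moved continuously by an ODE in λ whose flow is chosen so that the particle density tracks p(x, λ), which is what avoids particle collapse.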

  19. A Longitudinal Causal Analysis of Impact Made by Collaborative Engagement and Service Receipt on Likelihood of Substantiated Re-Report.

    PubMed

    Cheng, Tyrone C; Lo, Celia C

    2015-11-01

    In a temporal-ordering causal analysis, we explored impacts of parent's receipt of services and caseworker-parent collaborative engagement on substantiated child maltreatment re-report. Our sample of 2,368 parents came from the National Survey of Child and Adolescent Well-Being. Using generalized estimating equations, we found likelihood of substantiated re-report to drop significantly in association with collaborative engagement, length of current in-home spell, and family income. In turn, parental alcohol dependence, initial substantiated sexual maltreatment, and prior in-home spells were linked to increased likelihood. Practice implications are discussed. PMID:26224742

  20. Outcrop gamma-ray logging applied to subsurface petroleum geology

    SciTech Connect

    Slatt, R.M.; Borer, J.M.; Horn, B.W.

    1995-10-01

    Developing a gamma-ray log profile of an outcrop with a hand-held scintillometer has many applications to subsurface petroleum geology. The outcrop gamma-ray log provides a readily understandable bridge between what is observed in outcrop and what is to be interpreted on well logs and seismic records. Several examples are presented in this paper that demonstrate major applications. An outcrop from the Cretaceous Mesaverde Group in Colorado provides an excellent example of the use of outcrop gamma-ray logs to better visualize spatial variability of depositional settings for improved well log correlations. Outcrops from the Cretaceous Almond Formation, Niobrara Formation, and Graneros Shale in Colorado serve as examples of outcrop gamma-ray logging used to correlate outcrops with their subsurface equivalents for improved lithologic and stratigraphic interpretation of well logs. Outcrops of the Cretaceous Sharon Springs Member of the Pierre Shale in Colorado and the Eocene Green River Formation in Wyoming provide examples of the application of outcrop gamma-ray logging to identify and characterize organic-rich shales in outcrops and on well logs. Outcrops of the Pennsylvanian Jackfork Formation in Arkansas demonstrate the use of outcrop logging to yield improved interpretation of reservoir quality on well logs and for one- and two-dimensional seismic modeling. An outcrop of Precambrian and Cambro-Ordovician rocks from Algeria provides an example of outcrop logging to recognize unconformities and other major surfaces on well logs. An outcrop of the Niobrara Formation in Colorado is used as an example for improved understanding of horizontal gamma-ray log response. The example logs presented are all derived with a hand-held scintillometer. This technique is simple, quick, and relatively inexpensive, so it is recommended for any outcrop work that is intended to be applied to subsurface well logs or seismic interpretation.