Sample records for leading log approximation

  1. Performance analysis of MIMO wireless optical communication system with Q-ary PPM over correlated log-normal fading channel

    NASA Astrophysics Data System (ADS)

    Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua

    2018-06-01

    The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over a spatially correlated log-normal fading channel is analyzed in terms of uncoded bit error rate and ergodic channel capacity. The analysis is based on Wilkinson's method, which approximates the distribution of a sum of correlated log-normal random variables by a single log-normal random variable. The analytical and simulation results corroborate that increasing the correlation coefficients among sub-channels degrades system performance. Moreover, receiver diversity is more effective at resisting the channel fading caused by spatial correlation.
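
    Wilkinson's moment-matching step lends itself to a compact sketch. The code below is an illustration with made-up channel parameters (not values from the paper): it matches the first two moments of a sum of correlated log-normals to a single log-normal and checks the fit by Monte Carlo.

        import numpy as np

        def wilkinson_lognormal_fit(mu, cov):
            """Fenton-Wilkinson moment matching: approximate the sum of
            correlated log-normals exp(Y_i), Y ~ N(mu, cov), by a single
            log-normal exp(Z), Z ~ N(m, s2), matching E[S] and E[S^2]."""
            mu = np.asarray(mu, float)
            cov = np.asarray(cov, float)
            var = np.diag(cov)
            m1 = np.sum(np.exp(mu + var / 2.0))             # E[S]
            mm = mu[:, None] + mu[None, :]                  # mu_i + mu_j
            vv = var[:, None] + var[None, :] + 2.0 * cov    # Var(Y_i + Y_j)
            m2 = np.sum(np.exp(mm + vv / 2.0))              # E[S^2]
            s2 = np.log(m2 / m1**2)
            m = np.log(m1) - s2 / 2.0
            return m, s2

        # Two correlated sub-channels (hypothetical covariance).
        rng = np.random.default_rng(0)
        mu = np.array([0.0, 0.0])
        cov = np.array([[0.25, 0.10], [0.10, 0.25]])
        m, s2 = wilkinson_lognormal_fit(mu, cov)
        s = np.exp(rng.multivariate_normal(mu, cov, 200_000)).sum(axis=1)
        print((m, s2), (np.log(s).mean(), np.log(s).var()))  # close, not exact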

  2. A Space-Saving Approximation Algorithm for Grammar-Based Compression

    NASA Astrophysics Data System (ADS)

    Sakamoto, Hiroshi; Maruyama, Shirou; Kida, Takuya; Shimozono, Shinichi

    A space-efficient approximation algorithm for the grammar-based compression problem, which asks, for a given string, to find a smallest context-free grammar deriving the string, is presented. For input length n and optimum CFG size g, the algorithm consumes only O(g log g) space and O(n log* n) time to achieve an O((log* n) log n) approximation ratio to the optimum compression, where log* n is the maximum number of logarithms satisfying log log ... log n > 1. This ratio is thus regarded as almost O(log n), which is the currently best approximation ratio. While g depends on the string, it is known that g = Ω(log n) and g = O(n / log_k n) for strings over a k-letter alphabet [12].
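
    The iterated logarithm log* n is easy to compute directly; note that conventions differ by one depending on the stopping threshold. A minimal sketch (base 2):

        import math

        def log_star(n: float, base: float = 2.0) -> int:
            """Iterated logarithm: how many times log can be applied while the
            running value stays above 1 (cf. log log ... log n > 1)."""
            count = 0
            while n > 1.0:
                n = math.log(n, base)
                count += 1
            return count

        for n in (4, 65536, 1e18):
            print(n, log_star(n))   # grows extremely slowly: 2, 4, 5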

  3. Spin polarized photons from an axially charged plasma at weak coupling: Complete leading order

    DOE PAGES

    Mamo, Kiminad A.; Yee, Ho-Ung

    2016-03-24

    In the presence of (approximately conserved) axial charge in the QCD plasma at finite temperature, the emitted photons are spin aligned, which is a unique P- and CP-odd signature of axial charge in the photon emission observables. We compute this “P-odd photon emission rate” in a weak coupling regime at a high temperature limit to complete leading order in the QCD coupling constant: the leading log as well as the constant under the log. As in the P-even total emission rate in the literature, the computation of the P-odd emission rate at leading order consists of three parts: (1) Compton and pair annihilation processes with hard momentum exchange, (2) soft t- and u-channel contributions with hard thermal loop resummation, (3) Landau-Pomeranchuk-Migdal resummation of collinear bremsstrahlung and pair annihilation. In conclusion, we present analytical and numerical evaluations of these contributions to our P-odd photon emission rate observable.

  4. HOW WELL ARE HYDRAULIC CONDUCTIVITY VARIATIONS APPROXIMATED BY ADDITIVE STABLE PROCESSES? (R826171)

    EPA Science Inventory

    Abstract

    Analysis of the higher statistical moments of a hydraulic conductivity (K) and an intrinsic permeability (k) data set leads to the conclusion that the increments of the data and the logs of the data are not governed by Levy-stable or Gaussian dis...

  5. Measurement of the k_T distribution of particles in jets produced in p-pbar collisions at sqrt(s)=1.96 TeV.

    PubMed

    Aaltonen, T; Adelman, J; Akimoto, T; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzurri, P; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Beringer, J; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burke, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Chwalek, T; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Cordelli, M; Cortiana, G; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Derwent, P F; di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Elagin, A; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; Garfinkel, A F; Genser, K; Gerberich, H; Gerdes, D; Gessler, A; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hays, C; Heck, M; Heijboer, A; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Hussein, M; Husemann, U; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Knuteson, B; Ko, B R; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kubo, T; Kuhr, T; Kulkarni, N P; Kurata, M; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; 
Lazzizzera, I; Lecompte, T; Lee, E; Lee, H S; Lee, S W; Leone, S; Lewis, J D; Lin, C-S; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lovas, L; Lucchesi, D; Luci, C; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Macqueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, F; Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Mathis, M; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Merkel, P; Mesropian, C; Miao, T; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moggi, N; Moon, C S; Moore, R; Morello, M J; Morlok, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Nett, J; Neu, C; Neubauer, M S; Neubauer, S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Pagan Griso, S; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Peiffer, T; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Renton, P; Renz, M; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Safonov, A; Sakumoto, W K; Saltó, O; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spreitzer, T; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Stuart, D; Suh, J S; Sukhanov, A; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thompson, G A; Thomson, E; Tipton, P; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Trovato, M; Tsai, S-Y; Tu, Y; Turini, N; Ukegawa, F; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Veszpremi, V; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Volobouev, I; Volpi, G; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wagner-Kuhr, J; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Weinelt, J; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Würthwein, F; Wynne, S M; Xie, S; Yagil, A; Yamamoto, K; Yamaoka, J; Yang, U 
K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zhang, X; Zheng, Y; Zucchelli, S

    2009-06-12

    We present a measurement of the transverse momentum with respect to the jet axis (k_T) of particles in jets produced in p-pbar collisions at sqrt(s)=1.96 TeV. Results are obtained for charged particles in a cone of 0.5 radians around the jet axis in events with dijet invariant masses between 66 and 737 GeV/c². The experimental data are compared to theoretical predictions obtained for fragmentation partons within the framework of resummed perturbative QCD using the modified leading log and next-to-modified leading log approximations. The comparison shows that the trends in the data are successfully described by the theoretical predictions, indicating that the perturbative QCD stage of jet fragmentation is dominant in shaping basic jet characteristics.

  6. Analyzing indicator microorganisms, antibiotic resistant Escherichia coli, and regrowth potential of foodborne pathogens in various organic fertilizers.

    PubMed

    Miller, Cortney; Heringa, Spencer; Kim, Jinkyung; Jiang, Xiuping

    2013-06-01

    This study analyzed various organic fertilizers for indicator microorganisms, pathogens, and antibiotic-resistant Escherichia coli, and evaluated the growth potential of E. coli O157:H7 and Salmonella in fertilizers. A microbiological survey was conducted on 103 organic fertilizers from across the United States. Moisture content ranged from approximately 1% to 86.4%, and the average pH was 7.77. The total aerobic mesophiles ranged from approximately 3 to 9 log colony-forming units (CFU)/g. Enterobacteriaceae populations were in the range of <1 to approximately 7 log CFU/g, while coliform levels varied from <1 to approximately 6 log CFU/g. Thirty samples (29%) were positive for E. coli, with levels reaching approximately 6 log CFU/g. There were no confirmed positives for E. coli O157:H7, Salmonella, or Listeria monocytogenes. The majority of E. coli isolates (n=73), confirmed by glutamate decarboxylase (gad) PCR, were from group B1 (48%) and group A (32%). Resistance to 16 antibiotics was examined for 73 E. coli isolates, with 11 isolates having resistance to at least one antibiotic, 5 isolates to ≥ 2 antibiotics, and 2 isolates to ≥ 10 antibiotics. In the presence of high levels of background aerobic mesophiles, Salmonella and E. coli O157:H7 grew approximately 1 log CFU/g within 1 day of incubation in plant-based compost and fish emulsion-based compost, respectively. With low levels of background aerobic mesophiles, Salmonella grew approximately 2.6, 3.0, 3.0, and 3.2 log CFU/g in blood, bone, and feather meals and the mixed-source fertilizer, respectively, whereas E. coli O157:H7 grew approximately 4.6, 4.0, 4.0, and 4.8 log CFU/g, respectively. Our results revealed that the microbiological quality of organic fertilizers varies greatly, with some fertilizers containing antibiotic resistant E. coli and a few supporting the growth of foodborne pathogens after reintroduction into the fertilizer.

  7. Accurate computation of survival statistics in genome-wide studies.

    PubMed

    Vandin, Fabio; Papoutsaki, Alexandra; Raphael, Benjamin J; Upfal, Eli

    2015-05-01

    A key challenge in genomics is to identify genetic variants that distinguish patients with different survival times following diagnosis or treatment. While the log-rank test is widely used for this purpose, nearly all implementations of the log-rank test rely on an asymptotic approximation that is not appropriate in many genomics applications, because the two populations determined by a genetic variant may have very different sizes, and because the evaluation of many possible variants demands highly accurate computation of very small p-values. We demonstrate this problem for cancer genomics data, where the standard log-rank test leads to many false positive associations between somatic mutations and survival time. We develop and analyze a novel algorithm, the Exact Log-rank Test (ExaLT), that accurately computes the p-value of the log-rank statistic under an exact distribution that is appropriate for populations of any size. We demonstrate the advantages of ExaLT on data from published cancer genomics studies, finding significant differences from the reported p-values. We analyze somatic mutations in six cancer types from The Cancer Genome Atlas (TCGA), finding mutations with known associations to survival as well as several novel associations. In contrast, standard implementations of the log-rank test report dozens to hundreds of likely false positive associations as more significant than these known associations.
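
    The statistic itself is simple; what the paper addresses is its reference distribution. The sketch below implements the two-sample log-rank statistic and, purely to illustrate moving away from the asymptotic chi-square approximation, estimates a p-value by permuting group labels. This is not ExaLT's exact algorithm, and the data are made up.

        import numpy as np

        def logrank_stat(time, event, group):
            """Two-sample log-rank statistic (O - E)^2 / V."""
            time = np.asarray(time, float)
            event = np.asarray(event, bool)
            group = np.asarray(group, bool)     # True = group 1
            o_minus_e, var = 0.0, 0.0
            for t in np.unique(time[event]):
                at_risk = time >= t
                n, n1 = at_risk.sum(), (at_risk & group).sum()
                d = (event & (time == t)).sum()
                d1 = (event & (time == t) & group).sum()
                o_minus_e += d1 - d * n1 / n
                if n > 1:
                    var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
            return o_minus_e**2 / var

        def permutation_pvalue(time, event, group, n_perm=2000, seed=1):
            rng = np.random.default_rng(seed)
            obs = logrank_stat(time, event, group)
            group = np.asarray(group, bool)
            hits = sum(logrank_stat(time, event, rng.permutation(group)) >= obs
                       for _ in range(n_perm))
            return (hits + 1) / (n_perm + 1)

        time  = [5, 8, 12, 14, 20, 21, 28, 33, 40, 48]
        event = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]
        group = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
        print(permutation_pvalue(time, event, group))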

  8. Accurate Computation of Survival Statistics in Genome-Wide Studies

    PubMed Central

    Vandin, Fabio; Papoutsaki, Alexandra; Raphael, Benjamin J.; Upfal, Eli

    2015-01-01

    A key challenge in genomics is to identify genetic variants that distinguish patients with different survival times following diagnosis or treatment. While the log-rank test is widely used for this purpose, nearly all implementations of the log-rank test rely on an asymptotic approximation that is not appropriate in many genomics applications, because the two populations determined by a genetic variant may have very different sizes, and because the evaluation of many possible variants demands highly accurate computation of very small p-values. We demonstrate this problem for cancer genomics data, where the standard log-rank test leads to many false positive associations between somatic mutations and survival time. We develop and analyze a novel algorithm, the Exact Log-rank Test (ExaLT), that accurately computes the p-value of the log-rank statistic under an exact distribution that is appropriate for populations of any size. We demonstrate the advantages of ExaLT on data from published cancer genomics studies, finding significant differences from the reported p-values. We analyze somatic mutations in six cancer types from The Cancer Genome Atlas (TCGA), finding mutations with known associations to survival as well as several novel associations. In contrast, standard implementations of the log-rank test report dozens to hundreds of likely false positive associations as more significant than these known associations. PMID:25950620

  9. Speech Enhancement, Gain, and Noise Spectrum Adaptation Using Approximate Bayesian Estimation

    PubMed Central

    Hao, Jiucang; Attias, Hagai; Nagarajan, Srikantan; Lee, Te-Won; Sejnowski, Terrence J.

    2010-01-01

    This paper presents a new approximate Bayesian estimator for enhancing a noisy speech signal. The speech model is assumed to be a Gaussian mixture model (GMM) in the log-spectral domain, in contrast to most current models, which work in the frequency domain. Exact signal estimation is a computationally intractable problem, so we derive three approximations to enhance the efficiency of signal estimation. The Gaussian approximation transforms the log-spectral domain GMM into the frequency domain using a minimal Kullback-Leibler (KL) divergence criterion. The frequency-domain Laplace method computes the maximum a posteriori (MAP) estimator for the spectral amplitude; correspondingly, the log-spectral domain Laplace method computes the MAP estimator for the log-spectral amplitude. Further, gain and noise spectrum adaptation are implemented using the expectation-maximization (EM) algorithm within the GMM under the Gaussian approximation. The proposed algorithms are evaluated by applying them to enhance speech corrupted by speech-shaped noise (SSN). The experimental results demonstrate that the proposed algorithms offer improved signal-to-noise ratio, a lower word recognition error rate, and less spectral distortion. PMID:20428253
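
    Of the three approximations, the Laplace method is the most generic: replace the posterior by a Gaussian around its mode. A minimal one-dimensional sketch (not the paper's log-spectral implementation; the integrand below is hypothetical):

        import numpy as np
        from scipy.optimize import minimize_scalar

        def laplace_evidence(neg_log_post):
            """Laplace's method: approximate integral of exp(-f(x)) dx by
            exp(-f(x*)) * sqrt(2*pi / f''(x*)), where x* is the mode (the MAP
            point) and f'' is estimated by finite differences."""
            res = minimize_scalar(neg_log_post)
            xs, fs = res.x, res.fun
            h = 1e-4
            f2 = (neg_log_post(xs + h) - 2 * fs + neg_log_post(xs - h)) / h**2
            return np.exp(-fs) * np.sqrt(2.0 * np.pi / f2), xs

        # Exact for a Gaussian integrand: integral = sqrt(2*pi) ~ 2.5066.
        print(laplace_evidence(lambda x: 0.5 * x**2))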

  10. Consumption of lead-shot cervid meat and blood lead concentrations in a group of adult Norwegians.

    PubMed

    Meltzer, H M; Dahl, H; Brantsæter, A L; Birgisdottir, B E; Knutsen, H K; Bernhoft, A; Oftedal, B; Lande, U S; Alexander, J; Haugen, M; Ydersbond, T A

    2013-11-01

    Several recent investigations have reported high concentrations of lead in samples of minced cervid meat. This paper describes findings from a Norwegian study performed in 2012 among 147 adults with a wide range of cervid game consumption. The main aim was to assess whether high consumption of lead-shot cervid meat is associated with increased concentrations of lead in blood. A second aim was to investigate to what extent factors apart from game consumption explain the observed variability in blood lead levels. The median (5th and 95th percentile) blood lead concentration was 16.6 µg/L (7.5 and 39 µg/L). An optimal multivariate linear regression model for log-transformed blood lead indicated that cervid game meat consumption once a month or more was associated with an approximately 31% increase in blood lead concentrations. The increase seemed to be mostly associated with consumption of minced cervid meat, particularly purchased minced meat. However, many participants with high and long-lasting game meat intake had low blood lead concentrations. Cervid meat, together with the number of bullet shots per year, years with game consumption, self-assembly of bullets, wine consumption and smoking, jointly accounted for approximately 25% of the variation in blood lead concentrations, while age and sex accounted for 27% of the variance. Blood lead concentrations increased approximately 18% per decade of age, and men had on average 30% higher blood lead concentrations than women. Hunters who assembled their own ammunition had 52% higher blood lead concentrations than persons not making ammunition. In conjunction with minced cervid meat, wine intake was significantly associated with increased blood lead. Our results indicate that hunting practices such as the use of lead-based ammunition, self-assembly of lead-containing bullets and inclusion of lead-contaminated meat for mincing to a large extent determine the exposure to lead from cervid game consumption. © 2013 Elsevier Inc. All rights reserved.
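
    The percentage effects quoted above follow from coefficients in a regression on log-transformed blood lead: a coefficient b corresponds to a 100·(exp(b) - 1) percent change in the outcome. A quick check with illustrative coefficient values (not taken from the study):

        import numpy as np

        def pct_change_from_log_coef(b):
            """Percent change in the outcome implied by coefficient b in a
            regression with a log-transformed outcome."""
            return 100.0 * (np.exp(b) - 1.0)

        print(pct_change_from_log_coef(0.27))   # ~31%, as for monthly game meat
        print(pct_change_from_log_coef(0.17))   # ~18%, as per decade of age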

  11. Novel bayes factors that capture expert uncertainty in prior density specification in genetic association studies.

    PubMed

    Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin

    2015-05-01

    Bayes factors (BFs) are becoming increasingly important tools in genetic association studies, partly because they provide a natural framework for including prior information. The Wakefield BF (WBF) approximation is easy to calculate and assumes a normal prior on the log odds ratio (logOR) with a mean of zero. However, the prior variance (W) must be specified. Because of the potentially high sensitivity of the WBF to the choice of W, we propose several new BF approximations with logOR ∼N(0,W), but allow W to take a probability distribution rather than a fixed value. We provide several prior distributions for W which lead to BFs that can be calculated easily in freely available software packages. These priors allow a wide range of densities for W and provide considerable flexibility. We examine some properties of the priors and BFs and show how to determine the most appropriate prior based on elicited quantiles of the prior odds ratio (OR). We show by simulation that our novel BFs have superior true-positive rates at low false-positive rates compared to those from both P-value and WBF analyses across a range of sample sizes and ORs. We give an example of utilizing our BFs to fine-map the CASP8 region using genotype data on approximately 46,000 breast cancer case and 43,000 healthy control samples from the Collaborative Oncological Gene-environment Study (COGS) Consortium, and compare the single-nucleotide polymorphism ranks to those obtained using WBFs and P-values from univariate logistic regression. © 2015 The Authors. *Genetic Epidemiology published by Wiley Periodicals, Inc.
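
    Wakefield's approximation has a simple closed form, and putting a distribution on W amounts to averaging the alternative-model marginal likelihood over W. A sketch under stated assumptions: the BF convention below is H0 relative to H1, and the gamma prior on W is an arbitrary illustration, not one of the paper's elicited priors.

        import numpy as np

        def wakefield_abf(beta_hat, se, W):
            """Wakefield's approximate BF comparing H0 (logOR = 0) against
            H1 (logOR ~ N(0, W)), from the estimated logOR and its SE."""
            V = se ** 2
            z2 = (beta_hat / se) ** 2
            return np.sqrt((V + W) / V) * np.exp(-0.5 * z2 * W / (V + W))

        def marginal_bf(beta_hat, se, w_samples):
            """BF when W has a prior: the H1 likelihood is averaged over W, so
            BF = 1 / E_W[1 / ABF(W)]; estimated here by Monte Carlo (the paper
            instead derives closed forms for particular priors)."""
            inv = 1.0 / np.array([wakefield_abf(beta_hat, se, w) for w in w_samples])
            return 1.0 / inv.mean()

        rng = np.random.default_rng(0)
        w_draws = rng.gamma(shape=2.0, scale=0.1, size=20_000)  # hypothetical prior
        print(wakefield_abf(0.25, 0.08, W=0.2))
        print(marginal_bf(0.25, 0.08, w_draws))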

  12. Veneer-log production and receipts in the Southeast, 1988

    Treesearch

    Cecil C. Hutchins

    1990-01-01

    In 1988, almost 1.4 billion board feet of veneer logs were harvested in the Southeast, and the region's veneer mills processed approximately 1.5 billion board feet of logs. Almost 78 percent of veneer-log production and 76 percent of veneer-log receipts were softwood. There were 79 veneer mills operating in 1988. Softwood plywood was the major product. Almost all...

  13. Minimax rational approximation of the Fermi-Dirac distribution.

    PubMed

    Moussa, Jonathan E

    2016-10-28

    Accurate rational approximations of the Fermi-Dirac distribution are a useful component in many numerical algorithms for electronic structure calculations. The best known approximations use O(log(βΔ) log(ϵ^-1)) poles to achieve an error tolerance ϵ at temperature β^-1 over an energy interval Δ. We apply minimax approximation to reduce the number of poles by a factor of four and replace Δ with Δ_occ, the occupied energy interval. This is particularly beneficial when Δ ≫ Δ_occ, such as in electronic structure calculations that use a large basis set.
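
    The paper's minimax construction is beyond a short sketch, but the pole structure is visible in the simplest rational approximation, the truncated Matsubara expansion, whose slow convergence is exactly what optimized pole placement improves on:

        import numpy as np

        def fermi(x):
            """Fermi-Dirac distribution f(x) = 1 / (1 + exp(x)), x = beta*(E - mu)."""
            return 1.0 / (1.0 + np.exp(x))

        def fermi_matsubara(x, n_poles):
            """Truncated pole expansion
            f(x) = 1/2 - sum_{k>=1} 2x / (x^2 + ((2k-1)*pi)^2),
            a simple rational approximation with poles on the imaginary axis."""
            k = np.arange(1, n_poles + 1)
            poles = ((2 * k - 1) * np.pi) ** 2
            x = np.atleast_1d(np.asarray(x, float))
            return 0.5 - np.sum(2.0 * x[:, None] / (x[:, None] ** 2 + poles), axis=1)

        x = np.linspace(-50.0, 50.0, 1001)
        for n in (10, 100, 1000):
            print(n, np.max(np.abs(fermi(x) - fermi_matsubara(x, n))))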

  14. Minimax rational approximation of the Fermi-Dirac distribution

    NASA Astrophysics Data System (ADS)

    Moussa, Jonathan E.

    2016-10-01

    Accurate rational approximations of the Fermi-Dirac distribution are a useful component in many numerical algorithms for electronic structure calculations. The best known approximations use O(log(βΔ) log(ɛ^-1)) poles to achieve an error tolerance ɛ at temperature β^-1 over an energy interval Δ. We apply minimax approximation to reduce the number of poles by a factor of four and replace Δ with Δ_occ, the occupied energy interval. This is particularly beneficial when Δ ≫ Δ_occ, such as in electronic structure calculations that use a large basis set.

  15. Forest canopy damage and recovery in reduced-impact and conventional selective logging in eastern Para, Brazil.

    Treesearch

    Rodrigo Pereira Jr.; Johan Zweede; Gregory P. Asner; Michael Keller

    2002-01-01

    We investigated ground and canopy damage and recovery following conventional logging and reduced-impact logging (RIL) of moist tropical forest in the eastern Amazon of Brazil. Paired conventional and RIL blocks were selectively logged with a harvest intensity of approximately 23 m³ ha⁻¹...

  16. Log-Concavity and Strong Log-Concavity: a review

    PubMed Central

    Saumard, Adrien; Wellner, Jon A.

    2016-01-01

    We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
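
    In the discrete case the definition reduces to c_k² ≥ c_{k-1} c_{k+1}, and preservation under convolution can be checked directly; a small numerical illustration:

        import numpy as np
        from math import comb

        def is_log_concave(c):
            """A positive sequence (c_k) is log-concave iff
            c_k^2 >= c_{k-1} * c_{k+1} for all interior k."""
            c = np.asarray(c, float)
            return bool(np.all(c[1:-1] ** 2 * (1 + 1e-12) >= c[:-2] * c[2:]))

        # Binomial coefficients are log-concave, and the convolution of two
        # log-concave sequences is again log-concave (the preservation result).
        a = np.array([comb(6, k) for k in range(7)], float)
        b = np.array([comb(4, k) for k in range(5)], float)
        print(is_log_concave(a), is_log_concave(b),
              is_log_concave(np.convolve(a, b)))   # True True True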

  17. Minimax rational approximation of the Fermi-Dirac distribution

    DOE PAGES

    Moussa, Jonathan E.

    2016-10-27

    Accurate rational approximations of the Fermi-Dirac distribution are a useful component in many numerical algorithms for electronic structure calculations. The best known approximations use O(log(βΔ) log(ϵ^-1)) poles to achieve an error tolerance ϵ at temperature β^-1 over an energy interval Δ. We apply minimax approximation to reduce the number of poles by a factor of four and replace Δ with Δ_occ, the occupied energy interval. Furthermore, this is particularly beneficial when Δ ≫ Δ_occ, such as in electronic structure calculations that use a large basis set.

  18. Log-Linear Models for Gene Association

    PubMed Central

    Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.

    2009-01-01

    We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032
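
    The likelihood ratio statistic underlying those approximations is, for a contingency table, the familiar G² statistic; a minimal sketch for a two-way table under independence (the counts are hypothetical, standing in for rank-discretized gene components):

        import numpy as np
        from scipy.stats import chi2

        def g2_independence(table):
            """Likelihood ratio statistic G^2 = 2 * sum O * log(O / E) for
            independence in a two-way table, with E taken from the margins."""
            O = np.asarray(table, float)
            E = O.sum(axis=1, keepdims=True) * O.sum(axis=0, keepdims=True) / O.sum()
            mask = O > 0                      # 0 * log 0 = 0 by convention
            g2 = 2.0 * np.sum(O[mask] * np.log(O[mask] / E[mask]))
            dof = (O.shape[0] - 1) * (O.shape[1] - 1)
            return g2, chi2.sf(g2, dof)

        print(g2_independence([[30, 10], [12, 28]]))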

  19. Maternal blood lead level during pregnancy in South Central Los Angeles.

    PubMed

    Rothenberg, S J; Manalo, M; Jiang, J; Khan, F; Cuellar, R; Reyes, S; Sanchez, M; Reynoso, B; Aguilar, A; Diaz, M; Acosta, S; Jauregui, M; Johnson, C

    1999-01-01

    Twenty-five years of public health efforts produced a striking reduction in lead exposure; the blood lead average in the United States has decreased to less than 20% of levels measured in the 1970s. However, poor minority groups that live in large urban centers are still at high risk for elevated lead levels. In this study, our data showed that pregnant immigrants (n = 1,428) who live in South Central Los Angeles--one of the most economically depressed regions of California--have significantly higher (p < .0001) blood lead levels (geometric mean = 2.3 microg/dl [0.11 micromol/l]) than 504 pregnant nonimmigrants (geometric mean = 1.9 microg/dl [0.09 micromol/l]). The most important factors associated with lower blood lead levels in both groups were younger age; more-recent date of blood sampling (i.e., decreasing secular trend); and blood sampling in mid-autumn, instead of mid-spring (i.e., seasonal trend). Blood lead levels of immigrants were strongly dependent on the time elapsed since immigration to the United States; each natural log increase in years of residence was associated with an approximately 19% decrease in blood lead levels. Although blood lead means for both groups were almost the same as the estimated national average, 25 of the 30 cases of elevated blood lead (i.e., > or = 10 microg/dl [0.48 micromol/l]) occurred in the immigrant group. The odds ratio (95% confidence interval in parentheses) for having elevated blood lead levels (a) was 9.3 (1.9, 45.8) if the immigrant engaged in pica; (b) was 3.8 (1.4, 10.5) if the immigrant had low dietary calcium intake during pregnancy; and (c) was 0.65 (0.43, 0.98) for every natural log unit increase of years of residence in the United States. The control of pica and dietary calcium intake may offer a means of reducing lead exposure in immigrants.

  20. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population.

    PubMed

    Tomitaka, Shinichiro; Kawasaki, Yohei; Ide, Kazuki; Akutagawa, Maiko; Yamada, Hiroshi; Furukawa, Toshiaki A; Ono, Yutaka

    2016-01-01

    Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution by each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item threshold. To verify this hypothesis, we investigated the boundary curves of the distribution of total depressive symptom scores in a general population. Data collected from 21,040 subjects who had completed the Center for Epidemiologic Studies Depression Scale (CES-D) questionnaire as part of a national Japanese survey were analyzed. The CES-D consists of 20 items (16 negative items and four positive items). The boundary curves of adjacent item scores in the distribution of total depressive symptom scores for the 16 negative items were analyzed using log-normal scales and curve fitting. The boundary curves of adjacent item scores for a given symptom approximated a common linear pattern on a log-normal scale. Curve fitting showed that an exponential fit had a markedly higher coefficient of determination than either linear or quadratic fits. With negative affect items, the gap between the total score curve and the boundary curve continuously increased with increasing total depressive symptom scores on a log-normal scale, whereas the boundary curves of positive affect items, which are not considered manifest variables of the latent trait, did not exhibit such increases in this gap. The results of the present study support the hypothesis that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores commonly follow the predicted mathematical model, which was verified to approximate an exponential mathematical pattern.
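
    The model comparison described above, an exponential fit judged by its coefficient of determination against linear and quadratic alternatives, can be sketched in a few lines; an exponential fit is a straight line on a log scale. The frequencies below are simulated stand-ins, not the CES-D data.

        import numpy as np

        def r2(y, yhat):
            """Coefficient of determination."""
            return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

        # Hypothetical boundary-curve frequencies versus total score.
        rng = np.random.default_rng(0)
        x = np.arange(1, 16, dtype=float)
        y = 2000.0 * np.exp(-0.45 * x + rng.normal(0, 0.05, x.size))

        # Exponential fit: least squares on log(y), i.e. a line on a log scale.
        b, log_a = np.polyfit(x, np.log(y), 1)
        y_exp = np.exp(log_a + b * x)

        # Linear and quadratic fits on the raw scale, for comparison.
        y_lin = np.polyval(np.polyfit(x, y, 1), x)
        y_quad = np.polyval(np.polyfit(x, y, 2), x)
        print(r2(y, y_exp), r2(y, y_lin), r2(y, y_quad))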

  21. Boundary curves of individual items in the distribution of total depressive symptom scores approximate an exponential pattern in a general population

    PubMed Central

    Kawasaki, Yohei; Akutagawa, Maiko; Yamada, Hiroshi; Furukawa, Toshiaki A.; Ono, Yutaka

    2016-01-01

    Background Previously, we proposed a model for ordinal scale scoring in which individual thresholds for each item constitute a distribution by each item. This led us to hypothesize that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores follow a common mathematical model, which is expressed as the product of the frequency of the total depressive symptom scores and the probability of the cumulative distribution function of each item threshold. To verify this hypothesis, we investigated the boundary curves of the distribution of total depressive symptom scores in a general population. Methods Data collected from 21,040 subjects who had completed the Center for Epidemiologic Studies Depression Scale (CES-D) questionnaire as part of a national Japanese survey were analyzed. The CES-D consists of 20 items (16 negative items and four positive items). The boundary curves of adjacent item scores in the distribution of total depressive symptom scores for the 16 negative items were analyzed using log-normal scales and curve fitting. Results The boundary curves of adjacent item scores for a given symptom approximated a common linear pattern on a log-normal scale. Curve fitting showed that an exponential fit had a markedly higher coefficient of determination than either linear or quadratic fits. With negative affect items, the gap between the total score curve and the boundary curve continuously increased with increasing total depressive symptom scores on a log-normal scale, whereas the boundary curves of positive affect items, which are not considered manifest variables of the latent trait, did not exhibit such increases in this gap. Discussion The results of the present study support the hypothesis that the boundary curves of each depressive symptom score in the distribution of total depressive symptom scores commonly follow the predicted mathematical model, which was verified to approximate an exponential mathematical pattern. PMID:27761346

  22. On Nash-Equilibria of Approximation-Stable Games

    NASA Astrophysics Data System (ADS)

    Awasthi, Pranjal; Balcan, Maria-Florina; Blum, Avrim; Sheffet, Or; Vempala, Santosh

    One reason for wanting to compute an (approximate) Nash equilibrium of a game is to predict how players will play. However, if the game has multiple equilibria that are far apart, or ɛ-equilibria that are far in variation distance from the true Nash equilibrium strategies, then this prediction may not be possible even in principle. Motivated by this consideration, in this paper we define the notion of games that are approximation stable, meaning that all ɛ-approximate equilibria are contained inside a small ball of radius Δ around a true equilibrium, and investigate a number of their properties. Many natural small games such as matching pennies and rock-paper-scissors are indeed approximation stable. We furthermore show that there exist 2-player n-by-n approximation-stable games in which the Nash equilibrium and all approximate equilibria have support Ω(log n). On the other hand, we show that all (ɛ,Δ) approximation-stable games must have an ɛ-equilibrium of support O((Δ^{2-o(1)}/ɛ²) log n), yielding an immediate n^{O((Δ^{2-o(1)}/ɛ²) log n)}-time algorithm, improving over the bound of [11] for games satisfying this condition. We in addition give a polynomial-time algorithm for the case that Δ and ɛ are sufficiently close together. We also consider an inverse property, namely that all non-approximate equilibria are far from some true equilibrium, and give an efficient algorithm for games satisfying that condition.
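
    The central quantity in these definitions, the smallest ɛ for which a mixed profile is an ɛ-approximate equilibrium, takes one line per player to compute in a bimatrix game; a sketch using matching pennies (payoffs chosen for illustration):

        import numpy as np

        def eps_of_profile(A, B, p, q):
            """Smallest eps such that (p, q) is an eps-approximate Nash
            equilibrium of the bimatrix game (A, B): neither player can gain
            more than eps by deviating to a best pure strategy."""
            row_gain = np.max(A @ q) - p @ A @ q   # row player's best deviation
            col_gain = np.max(p @ B) - p @ B @ q   # column player's best deviation
            return max(row_gain, col_gain)

        A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # matching pennies
        B = -A
        u = np.array([0.5, 0.5])
        print(eps_of_profile(A, B, u, u))                      # 0.0: exact equilibrium
        print(eps_of_profile(A, B, np.array([0.6, 0.4]), u))   # column player can gain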

  23. Effect of curve sawing on lumber recovery and warp of short cherry logs containing sweep

    Treesearch

    Brian H. Bond; Philip Araman

    2008-01-01

    It has been estimated that approximately one-third of hardwood sawlogs have a significant amount of sweep and that 7 to nearly 40 percent of the yield is lost from logs that have greater than 1 inch of sweep. While decreased yield is important, for hardwood logs the loss of lumber value is likely more significant. A method that produced lumber while accounting for log...

  24. Lipophilicity affects the pharmacokinetics and toxicity of local anaesthetic agents administered by caudal block.

    PubMed

    Nava-Ocampo, Alejandro A; Bello-Ramírez, Angélica M

    2004-01-01

    1. Drugs administered into the epidural space by caudal block are cleared by means of a process potentially affected by the lipophilic character of the compounds. 2. In the present study, we examined the relationship between the octanol-water partition coefficient (log Poct) and the time to reach the maximum plasma drug concentration (tmax) of lignocaine, bupivacaine and ropivacaine administered by caudal block in paediatric patients. We also examined the relationship between log Poct and the toxicity of these local anaesthetic agents in experimental models. The tmax and toxicity data were obtained from the literature. 3. Ropivacaine, with a log Poct of 2.9, exhibited a tmax of 61.6 min. The tmax values of lignocaine, with a log Poct of 2.4, and bupivacaine, with a log Poct of 3.4, were approximately 50% shorter than that of ropivacaine. At a log Poct of approximately 3.0, the toxicity of these local anaesthetic agents was substantially increased. The relationship between log Poct and the convulsive effect in dogs was similar to the relationship between log Poct and the lethal dose in sheep. 4. With local anaesthetic agents, it appears that the relationship between log Poct and drug transfer from the epidural space to the blood stream is parabolic, with the slowest rate of transfer at a log Poct of approximately 3.0. Toxicity, due to the plasma availability of these local anaesthetic agents, seems to be increased at log Poct equal to or higher than 3.0, secondary to the greatest transfer from plasma into the central nervous system.

  25. Analysis of Effects of Sensor Multithreading to Generate Local System Event Timelines

    DTIC Science & Technology

    2014-03-27

    works on logs highlights the importance of logs [17, 18]. The two aforementioned works both reference the same 2009 Data Breach Investigations Report...of the data breaches reported on, the logs contained evidence of events leading up to 82% of those data breaches. This means that preventing 82% of the data...report states that of the data breaches reported on, the logs contained evidence of events leading up to 66% of those data breaches. • The 2010 DBIR

  26. Veneer recovery from Douglas-fir logs.

    Treesearch

    E.H. Clarke; A.C. Knauss

    1957-01-01

    During 1956, the Pacific Northwest Forest and Range Experiment Station made a series of six veneer-recovery studies in the Douglas-fir region of Oregon and Washington. The net volume of logs involved totaled approximately 777 M board-feet. Purpose of these studies was to determine volume recovery, by grade of veneer, from the four principal grades of Douglas-fir logs...

  27. Assessing the Utility of a Daily Log for Measuring Principal Leadership Practice

    ERIC Educational Resources Information Center

    Camburn, Eric M.; Spillane, James P.; Sebastian, James

    2010-01-01

    Purpose: This study examines the feasibility and utility of a daily log for measuring principal leadership practice. Setting and Sample: The study was conducted in an urban district with approximately 50 principals. Approach: The log was assessed against two criteria: (a) Is it feasible to induce strong cooperation and high response rates among…

  28. Logging truck noise near nesting northern goshawks

    Treesearch

    Teryl G. Grubb; Larry L. Pater; David K. Delaney

    1998-01-01

    We measured noise levels of four logging trucks as the trucks passed within approximately 500 m of two active northern goshawk (Accipiter gentilis) nests on the Kaibab Plateau in northern Arizona in 1997. Neither a brooding adult female nor a lone juvenile exhibited any discernable behavioral response to logging truck noise, which peaked at 53.4 and...

  29. Hardwood Veneer Timber Volume In Upper Michigan

    Treesearch

    E.W. Fobes; Gary R. Lindell

    1969-01-01

    Forests in Upper Michigan contain approximately 1.5 billion board feet of veneer logs of which three-fourths is hard maple and yellow birch. About 14 percent of the hardwood sawtimber is suitable for veneer logs.

  30. Minimum nonuniform graph partitioning with unrelated weights

    NASA Astrophysics Data System (ADS)

    Makarychev, K. S.; Makarychev, Yu S.

    2017-12-01

    We give a bi-criteria approximation algorithm for the Minimum Nonuniform Graph Partitioning problem, recently introduced by Krauthgamer, Naor, Schwartz and Talwar. In this problem, we are given a graph G=(V,E) and k numbers ρ_1, ..., ρ_k. The goal is to partition V into k disjoint sets (bins) P_1, ..., P_k satisfying |P_i| ≤ ρ_i |V| for all i, so as to minimize the number of edges cut by the partition. Our bi-criteria algorithm gives an O(√(log |V| · log k)) approximation for the objective function in general graphs and an O(1) approximation in graphs excluding a fixed minor. The approximate solution satisfies the relaxed capacity constraints |P_i| ≤ (5+ε) ρ_i |V|. This algorithm is an improvement upon the O(log |V|)-approximation algorithm by Krauthgamer, Naor, Schwartz and Talwar. We extend our results to the case of 'unrelated weights' and to the case of 'unrelated d-dimensional weights'. A preliminary version of this work was presented at the 41st International Colloquium on Automata, Languages and Programming (ICALP 2014). Bibliography: 7 titles.
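
    The objective and the relaxed capacity constraints are straightforward to evaluate for a candidate partition; a minimal sketch on a hypothetical 4-vertex graph:

        import numpy as np

        def cut_and_feasible(adj, part, rho, slack=1.0):
            """Edges cut by a partition of V, plus a check of the capacity
            constraints |P_i| <= slack * rho_i * |V| (slack = 5 + eps for the
            paper's relaxed guarantee)."""
            n = len(part)
            cut = sum(1 for u in range(n) for v in range(u + 1, n)
                      if adj[u][v] and part[u] != part[v])
            sizes = np.bincount(part, minlength=len(rho))
            ok = all(sizes[i] <= slack * rho[i] * n for i in range(len(rho)))
            return cut, ok

        # Triangle plus a pendant vertex, two bins with rho = (0.75, 0.5).
        adj = [[0, 1, 1, 0],
               [1, 0, 1, 0],
               [1, 1, 0, 1],
               [0, 0, 1, 0]]
        print(cut_and_feasible(adj, np.array([0, 0, 0, 1]), rho=[0.75, 0.5]))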

  31. The yield of Douglas-fir in the Pacific Northwest measured by international 1/4-inch kerf log rule.

    Treesearch

    Philip A. Briegleb

    1948-01-01

    The international log rule is little used in the Douglas-fir region today, but it is likely to find wider use here in the future. This is the opinion of a number of foresters preparing plans for the management of forest properties in the region. An advantage of the International 1/4-inch kerf log rule is that log scale by this measure approximates the volume of green...

  32. Role of photoacoustics in optogalvanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, D.; McGlynn, S.P.

    1990-09-15

    Time-resolved laser optogalvanic (LOG) signals have been induced by pulsed laser excitation (1s_j → 2p_k, Paschen notation) of a ~30 MHz radio-frequency (rf) discharge in neon at ~5 torr. Dramatic changes of the shape/polarity of certain parts of the LOG signals occur when the rf excitation frequency is scanned over the electrical resonance peak of the plasma and the associated driving/detecting circuits. These effects are attributed to ionization rate changes (i.e., laser-induced alterations of the plasma conductivity), with concomitant variations in the plasma resonance characteristics. In addition to ionization rate changes, it is shown that photoacoustic (PA) effects also play a significant role in the generation of the LOG signal. Those parts of the LOG signal that are invariant with respect to the rf frequency are attributed to a PA effect. The similarity of LOG signal shapes from both rf and dc discharges suggests that photoacoustics play a similar role in the LOG effect in dc discharges. Contrary to common belief, most reported LOG signal profiles, ones produced by excitation to levels that do not lie close to the ionization threshold, appear to be totally mediated by the PA effect.

  33. LBA-ECO TG-07 Trace Gas Fluxes, Undisturbed and Logged Sites, Para, Brazil: 2000-2002

    Treesearch

    M.M. Keller; R.K. Varner; J.D. Dias; H.S. Silva; P.M. Crill; Jr. de Oliveira; G.P. Asner

    2009-01-01

    Trace gas fluxes of carbon dioxide, methane, nitrous oxide, and nitric oxide were measured manually at undisturbed and logged forest sites in the Tapajos National Forest, near Santarem, Para, Brazil. Manual measurements were made approximately weekly at both the undisturbed and logged sites. Fluxes from clay and sand soils were completed at the undisturbed sites....

  34. A New Closed Form Approximation for BER for Optical Wireless Systems in Weak Atmospheric Turbulence

    NASA Astrophysics Data System (ADS)

    Kaushik, Rahul; Khandelwal, Vineet; Jain, R. C.

    2018-04-01

    The weak atmospheric turbulence condition in optical wireless communication (OWC) is captured by the log-normal distribution. The analytical evaluation of the average bit error rate (BER) of an OWC system under weak turbulence is intractable, as it involves statistically averaging the Gaussian Q-function over the log-normal distribution. In this paper, a simple closed-form approximation for the BER of an OWC system under weak turbulence is given. Computation of the BER for various modulation schemes is carried out using the proposed expression. The results obtained using the proposed expression compare favorably with those obtained using the Gauss-Hermite quadrature approximation and Monte Carlo simulations.
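
    Both baselines named above are easy to reproduce for the generic average E[Q(c·h)] with a log-normal fading gain h; the sketch below compares Gauss-Hermite quadrature against Monte Carlo. The constant c and the log-amplitude standard deviation are arbitrary illustrative values, and mu = 0 is a simplifying assumption rather than the paper's normalization.

        import numpy as np
        from numpy.polynomial.hermite import hermgauss
        from scipy.special import erfc

        def qfunc(x):
            """Gaussian Q-function, Q(x) = 0.5 * erfc(x / sqrt(2))."""
            return 0.5 * erfc(x / np.sqrt(2.0))

        def ber_gauss_hermite(c, sigma, mu=0.0, n=32):
            """E[Q(c*h)], ln h ~ N(mu, sigma^2), by Gauss-Hermite quadrature:
            substitute x = mu + sqrt(2)*sigma*t so the weight becomes exp(-t^2)."""
            t, w = hermgauss(n)
            h = np.exp(mu + np.sqrt(2.0) * sigma * t)
            return np.sum(w * qfunc(c * h)) / np.sqrt(np.pi)

        def ber_monte_carlo(c, sigma, mu=0.0, n=1_000_000, seed=0):
            h = np.exp(np.random.default_rng(seed).normal(mu, sigma, n))
            return np.mean(qfunc(c * h))

        print(ber_gauss_hermite(2.0, 0.3), ber_monte_carlo(2.0, 0.3))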

  35. Probing the galactic disk and halo. 2: Hot interstellar gas toward the inner galaxy star HD 156359

    NASA Technical Reports Server (NTRS)

    Sembach, Kenneth R.; Savage, Blair D.; Lu, Limin

    1995-01-01

    We present Goddard High Resolution Spectrograph intermediate-resolution measurements of the 1233-1256 A spectral region of HD 156359, a halo star at l = 328.7 deg, b = -14.5 deg in the inner Galaxy with a line-of-sight distance of 11.1 kpc and a z-distance of -2.8 kpc. The data have a resolution of 18 km/s Full Width at Half Maximum (FWHM) and a signal-to-noise ratio of approximately 50:1. We detect interstellar lines of Mg II, Si II, S II, Ge II, and N V and determine log N(Mg II) = 15.78 +0.25/-0.27, log N(Si II) greater than 13.70, log N(S II) greater than 15.76, log N(Ge II) = 12.20 +0.09/-0.11, and log N(N V) = 14.06 +/- 0.02. Assuming solar reference abundances, the diffuse clouds containing Mg, S, and Ge along the sight line have average logarithmic depletions D(Mg) = -0.6 +/- 0.3 dex, D(S) greater than -0.2 dex, and D(Ge) = -0.2 +/- 0.2 dex. The Mg and Ge depletions are approximately 2 times smaller than is typical of diffuse clouds in the solar vicinity. Galactic rotational modeling of the N V profiles indicates that the highly ionized gas traced by this ion has a scale height of approximately 1 kpc if gas at large z-distances corotates with the underlying disk gas. Rotational modeling of the Si IV and C IV profiles measured by the IUE satellite yields similar scale height estimates. The scale height results contrast with previous studies of highly ionized gas in the outer Milky Way that reveal a more extended gas distribution with h approximately equal to 3-4 kpc. We detect a high-velocity feature in N V and Si II (v_LSR approximately equal to +125 km/s) that is probably created in an interface between warm and hot gas.

  36. Bioaccumulation and enantioselectivity of type I and type II pyrethroid pesticides in earthworm.

    PubMed

    Chang, Jing; Wang, Yinghuan; Wang, Huili; Li, Jianzhong; Xu, Peng

    2016-02-01

    In this study, the bioavailability and enantioselectivity differences between bifenthrin (BF, type I pyrethroid) and lambda-cyhalothrin (LCT, type II pyrethroid) in the earthworm (Eisenia fetida) were investigated. The bio-soil accumulation factor (BSAF) of BF was about 4 times greater than that of LCT. LCT was degraded faster than BF in soil but was eliminated more slowly from earthworm samples. Compound sorption plays an important role in bioavailability to earthworms, and the soil-adsorption coefficients (K_oc) of BF and LCT were 22,442 and 42,578, respectively. The metabolic capacity of earthworms for LCT was further studied, as no significant difference in the accumulation of LCT between the high- and low-dose experiments was found. 3-Phenoxybenzoic acid (PBCOOH), a metabolite of LCT produced by earthworms, was detected in soil. The concentration of PBCOOH at the high-dose exposure was about 4.7 times greater than that at the low-dose level on the fifth day. The bioaccumulation of BF and LCT was enantioselective in earthworms. The enantiomer factors of BF and LCT in earthworm were approximately 0.12 and 0.65, respectively. The more toxic enantiomers ((+)-BF and (-)-LCT) were preferentially degraded in earthworms, which led to lower toxicity to earthworms under racemate exposure. In combination with other studies, a linear relationship between log BSAF(S) and log K_ow was observed, and log BSAF(S) decreased with increasing log K_ow. Copyright © 2015 Elsevier Ltd. All rights reserved.

  37. A multidisciplinary approach to reservoir subdivision of the Maastrichtian chalk in the Dan field, Danish North Sea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kristensen, L.; Dons, T.; Schioler, P.

    1995-11-01

    Correlation of wireline log data from the North Sea chalk reservoirs is frequently hampered by rather subtle log patterns in the chalk section due to the apparent monotonous nature of the chalk sediments, which may lead to ambiguous correlations. This study deals with a correlation technique based on an integration of biostratigraphic data, seismic interpretation, and wireline log correlation; this technique aims at producing a consistent reservoir subdivision that honors both the well data and the seismic data. This multidisciplinary approach has been used to subdivide and correlate the Maastrichtian chalk in the Dan field. The biostratigraphic subdivision is based on a new detailed dinoflagellate study of core samples from eight wells. Integrating the biostratigraphic results with three-dimensional seismic data allows recognition of four stratigraphic units within the Maastrichtian, bounded by assumed chronostratigraphic horizons. This subdivision is further refined by adding a seismic horizon and four horizons from wireline log correlations, establishing a total of nine reservoir units. The approximate chronostratigraphic nature of these units provides an improved interpretation of the depositional and structural patterns in this area. The three upper reservoir units pinch out and disappear in a northeasterly direction across the field. We interpret this stratal pattern as reflecting a relative sea level fall or regional basinal subsidence during the latest Maastrichtian, possibly combined with local synsedimentary uplift due to salt tectonics. Isochore maps indicate that the underlying six non-wedging units are unaffected by salt tectonics.

  38. Speech Enhancement Using Gaussian Scale Mixture Models

    PubMed Central

    Hao, Jiucang; Lee, Te-Won; Sejnowski, Terrence J.

    2011-01-01

    This paper presents a novel probabilistic approach to speech enhancement. Instead of a deterministic logarithmic relationship, we assume a probabilistic relationship between the frequency coefficients and the log-spectra. The speech model in the log-spectral domain is a Gaussian mixture model (GMM). The frequency coefficients obey a zero-mean Gaussian whose covariance equals the exponential of the log-spectra. This results in a Gaussian scale mixture model (GSMM) for the speech signal in the frequency domain, since the log-spectra can be regarded as scaling factors. The probabilistic relation between frequency coefficients and log-spectra allows these to be treated as two random variables, both to be estimated from the noisy signals. Expectation-maximization (EM) was used to train the GSMM and Bayesian inference was used to compute the posterior signal distribution. Because exact inference of this full probabilistic model is computationally intractable, we developed two approaches to enhance the efficiency: the Laplace method and a variational approximation. The proposed methods were applied to enhance speech corrupted by Gaussian noise and speech-shaped noise (SSN). For both approximations, signals reconstructed from the estimated frequency coefficients provided a higher signal-to-noise ratio (SNR), and those reconstructed from the estimated log-spectra produced a lower word recognition error rate because the log-spectra fit the inputs to the recognizer better. Our algorithms effectively reduced the SSN, which algorithms based on spectral analysis were not able to suppress. PMID:21359139

  39. Optimal Search for Moving Targets in Continuous Time and Space using Consistent Approximations

    DTIC Science & Technology

    2011-09-01

    [The indexed excerpt for this record consists of garbled equation fragments (IV.52)-(IV.53) from a convergence argument, in which terms involving b^(-n log c̄ / log b) vanish as b → ∞, together with reference-list text (Burden 1993; Ghabcheloo et al. 2009); no abstract is recoverable.]

  40. Laser scanning measurements on trees for logging harvesting operations.

    PubMed

    Zheng, Yili; Liu, Jinhao; Wang, Dian; Yang, Ruixi

    2012-01-01

    Logging harvesters represent a set of high-performance modern forestry machinery that can carry out a series of continuous operations, such as felling, delimbing, peeling and bucking, under human control. It was found by experiment that, during the alignment of the harvesting head to capture the trunk, the operator needs a great deal of observation, judgment and repeated operation, which leads to time and fuel losses. In order to improve operating efficiency and reduce operating costs, point clouds of standing trees are collected with a low-cost 2D laser scanner. A cluster-extracting algorithm and a filtering algorithm are used to separate each trunk from the point cloud. On the assumption that every cross section of the target trunk approximates a circle, and combining information from an Attitude and Heading Reference System, the radii and center locations of the trunks in the scanning range are calculated by the Fletcher-Reeves conjugate gradient algorithm. The method is validated through experiments in an aspen forest, and the optimized calculation time is compared with the previous work of other researchers. Moreover, the use of the calculation results for automatic capture of trunks by the harvesting head during logging operations is discussed in particular.
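
    The per-trunk estimation step described above reduces to a least-squares circle fit on 2D scan points; the sketch below minimizes the radial residuals with scipy's nonlinear conjugate gradient routine ('CG', a Polak-Ribiere-type method, whereas the paper specifies Fletcher-Reeves) on a synthetic noisy arc.

        import numpy as np
        from scipy.optimize import minimize

        def fit_circle(points):
            """Least-squares circle: minimize sum_i (|p_i - c| - r)^2 over the
            center c = (cx, cy) and radius r with a conjugate gradient method."""
            pts = np.asarray(points, float)
            x0 = np.r_[pts.mean(axis=0), pts.std()]   # rough center/radius guess

            def cost(params):
                cx, cy, r = params
                d = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)
                return np.sum((d - r) ** 2)

            return minimize(cost, x0, method="CG").x   # (cx, cy, r)

        # Noisy partial arc of a 0.2 m radius trunk seen from one side.
        rng = np.random.default_rng(0)
        theta = np.linspace(-0.6 * np.pi, 0.6 * np.pi, 80)
        pts = np.c_[1.0 + 0.2 * np.cos(theta), 0.5 + 0.2 * np.sin(theta)]
        pts += rng.normal(0.0, 0.003, pts.shape)
        print(fit_circle(pts))   # close to (1.0, 0.5, 0.2)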

  41. Leading logarithmic corrections to the muonium hyperfine splitting and to the hydrogen Lamb shift

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karshenboim, S.G.

    1994-12-31

    The main leading corrections to the muonium hyperfine splitting, with the recoil logarithm log(M/m) and the low-energy logarithm log(Zα), are discussed. The logarithmic corrections have magnitudes of 0.1-0.3 kHz. Non-leading higher order corrections are expected to be no larger than 0.1 kHz. The leading logarithmic correction to the hydrogen Lamb shift is also obtained.

  42. Transfer of surface-dried Listeria monocytogenes from stainless steel knife blades to roast turkey breast.

    PubMed

    Keskinen, Lindsey A; Todd, Ewen C D; Ryser, Elliot T

    2008-01-01

    Listeria contamination of food contact surfaces can lead to cross-contamination of ready-to-eat foods in delicatessens. Recognizing that variations in Listeria biofilm-forming ability exist, the goal of this study was to determine whether these differences in biofilm formation would affect the Listeria transfer rate during slicing of delicatessen turkey meat. In this study, six previously identified strong and weak biofilm-forming strains of Listeria monocytogenes were grown at 22 degrees C for 48 h on Trypticase soy agar containing 0.6% yeast extract and harvested in 0.1% peptone. Thereafter, the strains were combined to obtain two 3-strain cocktails, resuspended in turkey slurry, and inoculated onto flame-sterilized AISI grade 304 stainless steel knife blades that were subjected to 6 and 24 h of ambient storage at approximately 78% relative humidity. After mounting on an Instron Universal Testing Machine, these blades were used to obtain 16 slices of retail roast turkey breast. Based on an analysis of the slices by direct plating, Listeria populations decreased 3 to 5 log CFU per slice after 16 slices. Overall, total transfer to turkey was significantly greater for strong (4.4 log CFU total) as opposed to weak (3.5 log CFU total; P < 0.05) biofilm formers. In addition, significantly more cells were transferred at 6 (4.6 log CFU total) than at 24 h (3.3 log CFU total; P < 0.05) with Listeria quantifiable to the 16th slice, regardless of the inoculation level. Increased survival by the strong biofilm formers, as evidenced by viability staining, suggests that these strains are better adapted to survive stressful conditions than their weak biofilm-forming counterparts.

  3. Selective logging in the Brazilian Amazon.

    PubMed

    Asner, Gregory P; Knapp, David E; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Silva, Jose N

    2005-10-21

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of approximately 0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.

  4. Tanana River Monitoring and Research Program: Relationships Among Bank Recession, Vegetation, Soils, Sediments and Permafrost on the Tanana River Near Fairbanks, Alaska.

    DTIC Science & Technology

    1984-07-01

    field book for scale). Figure 2 (cont’d). Figure 3. Upstream portion of reach 2, 9 May 1980; USGS gauging station (A) and the approximate location...eral information was taken from maps, and site-specific data were obtained from the logs of wells drilled by the Corps of Engineers. The well log data...were drilled along or near this route, which runs approximately parallel to the bank, but not near the riverbank at most locations (Fig. 1). The

  5. The Approximability of Learning and Constraint Satisfaction Problems

    DTIC Science & Technology

    2010-10-07

    further improved this result to NP ⊆ naPCP1,3/4+ε(O(log(n)),3). Around the same time, Zwick [141] showed that naPCP1,5/8(O(log(n)),3) ⊆ BPP by giving a...randomized polynomial-time 5/8-approximation algorithm for satisfiable 3CSP. Therefore unless NP ⊆ BPP, the best s must be bigger than 5/8. Zwick... BPP [141]. We think that Question 5.1.2 addresses an important missing part in understanding the 3-query PCP systems. In addition, as is mentioned the

  6. Testing the dose-response specification in epidemiology: public health and policy consequences for lead.

    PubMed

    Rothenberg, Stephen J; Rothenberg, Jesse C

    2005-09-01

    Statistical evaluation of the dose-response function in lead epidemiology is rarely attempted. Economic evaluation of health benefits of lead reduction usually assumes a linear dose-response function, regardless of the outcome measure used. We reanalyzed a previously published study, an international pooled data set combining data from seven prospective lead studies examining contemporaneous blood lead effect on IQ (intelligence quotient) of 7-year-old children (n = 1,333). We constructed alternative linear multiple regression models with linear blood lead terms (linear-linear dose response) and natural-log-transformed blood lead terms (log-linear dose response). We tested the two lead specifications for nonlinearity in the models, compared the two lead specifications for significantly better fit to the data, and examined the effects of possible residual confounding on the functional form of the dose-response relationship. We found that a log-linear lead-IQ relationship was a significantly better fit than was a linear-linear relationship for IQ (p = 0.009), with little evidence of residual confounding of included model variables. We substituted the log-linear lead-IQ effect in a previously published health benefits model and found that the economic savings due to U.S. population lead decrease between 1976 and 1999 (from 17.1 microg/dL to 2.0 microg/dL) was 2.2 times (319 billion dollars) that calculated using a linear-linear dose-response function (149 billion dollars). The Centers for Disease Control and Prevention action limit of 10 microg/dL for children fails to protect against most damage and economic cost attributable to lead exposure.
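
    A minimal sketch of the model comparison described above, run on synthetic data (the pooled-study data are not reproduced here): regress IQ on blood lead once with a linear term and once with a natural-log term, and compare residual sums of squares; the log-linear fit wins when the underlying relationship is log-linear.

    import numpy as np

    rng = np.random.default_rng(1)
    pb = rng.uniform(1, 30, 500)                    # blood lead, ug/dL (synthetic)
    iq = 100 - 2.7 * np.log(pb) + rng.normal(0, 5, pb.size)   # log-linear truth

    def rss(x, y):
        coef = np.polyfit(x, y, 1)                  # ordinary least squares, 1 predictor
        return np.sum((y - np.polyval(coef, x)) ** 2)

    print("RSS, linear-linear:", round(rss(pb, iq), 1))
    print("RSS, log-linear   :", round(rss(np.log(pb), iq), 1))   # smaller is better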

  7. Phage Inactivation of Listeria monocytogenes on San Daniele Dry-Cured Ham and Elimination of Biofilms from Equipment and Working Environments

    PubMed Central

    Iacumin, Lucilla; Manzano, Marisa; Comi, Giuseppe

    2016-01-01

    The anti-listerial activity of the generally recognized as safe (GRAS) bacteriophage Listex P100 (phage P100) was demonstrated in broths and on the surface of slices of dry-cured ham against 5 strains or serotypes (i.e., Scott A, 1/2a, 1/2b, and 4b) of Listeria monocytogenes. In a broth model system, phage P100 at a concentration equal to or greater than 7 log PFU/mL completely inhibited growth of L. monocytogenes inoculated at 2 log CFU/cm2 or 3 log CFU/cm2 at 30 °C. The temperature (4, 10, 20 °C) seemed to influence P100 activity; the best results were obtained at 4 °C. On dry-cured ham slices, a P100 concentration ranging from 5 to 8 log PFU/cm2 was required to obtain a significant reduction in L. monocytogenes. At 4, 10, and 20 °C, an inoculum of 8 log PFU/cm2 was required to completely eliminate 2 log L. monocytogenes/cm2 and to achieve absence in 25 g of product according to USA food law. Conversely, it was impossible to completely eradicate L. monocytogenes with an inoculum of approximately 3.0 to 4.0 log CFU/cm2 and a P100 inoculum ranging from 1 to 7 log PFU/cm2. P100 remained stable on dry-cured ham slices over a 14-day storage period, with only a marginal loss of 0.2 log PFU/cm2 from an initial phage treatment of approximately 8 log PFU/cm2. Moreover, phage P100 eliminated free L. monocytogenes cells and biofilms on the machinery surfaces used for dry-cured ham production. These findings demonstrate that the GRAS bacteriophage Listex P100 at a level of 8 log PFU/cm2 is listericidal and useful for reducing the L. monocytogenes concentration or eradicating the bacteria from dry-cured ham. PMID:27681898

  8. Detection of contaminant plumes by borehole geophysical logging

    USGS Publications Warehouse

    Mack, Thomas J.

    1993-01-01

    Two borehole geophysical methods—electromagnetic induction and natural gamma radiation logs—were used to vertically delineate landfill leachate plumes in a glacial aquifer. Geophysical logs of monitoring wells near two landfills in a glacial aquifer in west-central Vermont show that borehole geophysical methods can aid in interpretation of geologic logs and placement of monitoring well screens to sample landfill leachate plumes. Zones of high electrical conductance were delineated from the electromagnetic log in wells near two landfills. Some of these zones were found to correlate with silt and clay units on the basis of drilling and gamma logs. Monitoring wells were screened specifically in zones of high electrical conductivity that did not correlate to a silt or clay unit. Zones of high electrical conductivity that did not correlate to a silt or clay unit were caused by the presence of ground water with a high specific conductance, generally from 1000 to 2370 μS/cm (microsiemens per centimeter at 25 degrees Celsius). Ambient ground water in the study area has a specific conductance of approximately 200 to 400 μS/cm. Landfill leachate plumes were found to be approximately 5 to 20 feet thick and to be near the water table surface.

  9. Evaluation of the use of salivary lead levels as a surrogate of blood lead or plasma lead levels in lead exposed subjects.

    PubMed

    Barbosa, Fernando; Corrêa Rodrigues, Maria Heloísa; Buzalaf, Maria R; Krug, Francisco J; Gerlach, Raquel F; Tanus-Santos, José Eduardo

    2006-10-01

    We conducted a study to evaluate the use of parotid salivary lead (Pb-saliva) levels as a surrogate of blood lead (Pb-B) or plasma lead (Pb-P) levels for diagnosing lead exposure. The relationship between these biomarkers was assessed in a lead-exposed population. Pb-saliva and Pb-P were determined by inductively coupled plasma mass spectrometry, while lead in whole blood was determined by graphite furnace atomic absorption spectrometry. We studied 88 adults (31 men and 57 women) from 18 to 60 years old. Pb-saliva levels varied from 0.05 to 4.4 microg/l, with a mean of 0.85 microg/l. Blood lead levels varied from 32.0 to 428.0 microg/l in men (mean 112.3 microg/l) and from 25.0 to 263.0 microg/l (mean 63.5 microg/l) in women. Corresponding Pb-Ps were 0.02-2.50 microg/l (mean 0.77 microg/l) and 0.03-1.6 microg/l (mean 0.42 microg/l) in men and women, respectively. A weak correlation was found between log Pb-saliva and log Pb-B (r=0.277, P<0.008), and between log Pb-saliva and log Pb-P (r=0.280, P=0.006). The Pb-saliva/Pb-P ratio ranged from 0.20 to 18.0. Neither age nor gender affected Pb-saliva levels or the Pb-saliva/Pb-P ratio. Taken together, these results suggest that salivary lead may not be suitable as a biomarker for diagnosing lead exposure nor as a surrogate of plasma lead levels, at least for low to moderately lead-exposed populations.

  10. Challenges in converting among log scaling methods.

    Treesearch

    Henry Spelter

    2003-01-01

    The traditional method of measuring log volume in North America is the board foot log scale, which uses simple assumptions about how much of a log's volume is recoverable. This underestimates the true recovery potential and leads to difficulties in comparing volumes measured with the traditional board foot system and those measured with the cubic scaling systems...
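
    A hedged illustration of why such conversions are hard: the Doyle board-foot rule, BF = ((D - 4)/4)^2 × L, against Smalian's cubic-foot formula for the same log. The board-foot/cubic-foot ratio drifts with diameter, so no single conversion factor works. Example dimensions are made up.

    import math

    def doyle_bf(d_small_in, length_ft):
        """Doyle rule: D is small-end diameter in inches, L in feet."""
        return ((d_small_in - 4) / 4) ** 2 * length_ft

    def smalian_cuft(d_small_in, d_large_in, length_ft):
        """Smalian's formula: average of the two end areas times length."""
        area = lambda d: math.pi * (d / 24) ** 2    # end area in square feet
        return length_ft * (area(d_small_in) + area(d_large_in)) / 2

    for d in (8, 12, 20, 30):                       # small-end diameters, inches
        bf = doyle_bf(d, 16)
        cf = smalian_cuft(d, d + 2, 16)             # assume 2 in of taper
        print(f"D={d:2d} in: Doyle {bf:6.1f} bf, cubic {cf:6.2f} cf, "
              f"ratio {bf / cf:4.1f} bf/cf")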

  11. An Unusual Log-splitter Injury Leading to Radial Artery Thrombosis, Ulnar Artery Laceration, and Scapholunate Dissociation

    PubMed Central

    Spock, Christopher R.; Salomon, Jeffrey C.; Narayan, Deepak

    2008-01-01

    A log splitter is a gasoline- or diesel-powered machine that uses a hydraulic-powered cutting wedge to do the work of an axe. Log-splitter injuries that do not result in amputation of digits or limbs are uncommon and not well described in the literature. We present a unique case of a patient who sustained a log-splitter injury that resulted in thrombosis of the radial artery and avulsion laceration of the ulnar artery leading to acute hand ischemia, in addition to scapholunate ligament disruption leading to a DISI deformity. In this case, thrombolytic therapy was contraindicated and surgical revascularization was the best possible treatment option. Our case illustrates the pitfalls of using this modality in a crush injury, since the use of thrombolytics in this instance would have resulted in severe hemorrhage. An important clinical caveat is the potentially misleading arteriographic diagnosis of thrombosis and/or spasm. PMID:18827886

  12. The Effect of Temperature on the Survival of Microorganisms in a Deep Space Vacuum

    NASA Technical Reports Server (NTRS)

    Hagen, C. A.; Godfrey, J. F.; Green, R. H.

    1971-01-01

    A space molecular sink research facility (Molsink) was used to evaluate the ability of microorganisms to survive the vacuum of outer space. This facility could be programmed to simulate flight spacecraft vacuum environments at pressures in the 0.1-nanotorr range and thermal gradients (30 to 60 C) closely associated with surface temperatures of in-flight spacecraft. Initial populations of Staphylococcus epidermidis and a Micrococcus sp. were reduced approximately 1 log while exposed to -105 and 34 C, and approximately 2 logs while exposed to 59 C, for 14 days in the vacuum environment. Spores of Bacillus subtilis var. niger were less affected by the environment. Initial spore populations were reduced 0.2, 0.3, and 0.8 log during the 14-day vacuum exposure at -124, 34, and 59 C, respectively.

  13. Feeding and Feedback in the Powerful Radio Galaxy 3C 120

    NASA Technical Reports Server (NTRS)

    Tombesi, F.; Mushotzky, R. F.; Reynolds, C. S.; Kallman, T.; Reeves, J. N.; Braito, V.; Ueda, Y.; Leutenegger, M. A.; Williams, B. J.; Stawarz, L.; hide

    2017-01-01

    We present a spectral analysis of a 200-kilosecond observation of the broad-line radio galaxy 3C 120, performed with the high-energy transmission grating spectrometer on board the Chandra X-Ray Observatory. We find (i) a neutral absorption component intrinsic to the source with a column density of log(N_H/cm^-2) = 20.67 ± 0.05; (ii) no evidence for a warm absorber (WA), with an upper limit on the column density of just log(N_H/cm^-2) < 19.7, assuming the typical ionization parameter log ξ ≈ 2.5 erg cm s^-1; the WA may instead be replaced by (iii) a hot emitting gas with a temperature kT ≈ 0.7 keV, observed as soft X-ray emission from ionized Fe L-shell lines, which may originate from a kiloparsec-scale shocked bubble inflated by the active galactic nucleus (AGN) wind or jet, with a shock velocity of about 1000 km s^-1 determined from the emission-line width; (iv) a neutral Fe Kα line and accompanying emission lines indicative of a Compton-thick cold reflector with a low reflection fraction R ≈ 0.2, suggesting a large opening angle of the torus; (v) a highly ionized Fe XXV emission feature indicative of photoionized gas with an ionization parameter log ξ = 3.75 (+0.38/-0.27) erg cm s^-1 and a column density of log(N_H/cm^-2) > 22, localized within approximately 2 pc of the X-ray source; and (vi) possible signatures of a highly ionized disk wind. Together with previous evidence for intense molecular line emission, these results indicate that 3C 120 is likely a late-stage merger undergoing strong AGN feedback.

  14. Top down electroweak dipole operators

    NASA Astrophysics Data System (ADS)

    Fuyuto, Kaori; Ramsey-Musolf, Michael

    2018-06-01

    We derive present constraints on, and prospective sensitivity to, the electric dipole moment (EDM) of the top quark (d_t) implied by searches for the EDMs of the electron and nucleons. Above the electroweak scale v, the d_t arises from two gauge-invariant operators generated at a scale Λ ≫ v that also mix with the light-fermion EDMs under renormalization-group evolution at two-loop order. Bounds on the EDMs of first-generation fermion systems thus imply bounds on |d_t|. Working in the leading log-squared approximation, we find that the present upper bound on |d_t| is 10^-19 e cm for Λ = 1 TeV, except in regions of finely tuned cancellations that allow |d_t| to be up to fifty times larger. Future d_e and d_n probes may yield an order-of-magnitude increase in d_t sensitivity, while inclusion of a prospective proton EDM search may lead to an additional increase in reach.

  15. Isotropic, anisotropic, and borehole washout analyses in Gulf of Mexico Gas Hydrate Joint Industry Project Leg II, Alaminos Canyon well 21-A

    USGS Publications Warehouse

    Lee, Myung W.

    2012-01-01

    Through the use of three-dimensional seismic amplitude mapping, several gas hydrate prospects were identified in the Alaminos Canyon area of the Gulf of Mexico. Two of the prospects were drilled as part of the Gulf of Mexico Gas Hydrate Joint Industry Program Leg II in May 2009, and a suite of logging-while-drilling logs was acquired at each well site. Logging-while-drilling logs at the Alaminos Canyon 21–A site indicate that resistivities of approximately 2 ohm-meter and P-wave velocities of approximately 1.9 kilometers per second were measured in a possible gas-hydrate-bearing target sand interval between 540 and 632 feet below the sea floor. These values are slightly elevated relative to those measured in the hydrate-free sediment surrounding the sands. The initial well log analysis is inconclusive in determining the presence of gas hydrate in the logged sand interval, mainly because large washouts in the target interval degraded well log measurements. To assess gas-hydrate saturations, a method of compensating for the effect of washouts on the resistivity and acoustic velocities is required. To meet this need, a method is presented that models the washed-out portion of the borehole as a vertical layer filled with seawater (drilling fluid). Owing to the anisotropic nature of this geometry, the apparent anisotropic resistivities and velocities caused by the vertical layer are used to correct measured log values. By incorporating the conventional marine seismic data into the well log analysis of the washout-corrected well logs, the gas-hydrate saturation at well site AC21–A was estimated to be in the range of 13 percent. Because gas hydrates in the vertical fractures were observed, anisotropic rock physics models were also applied to estimate gas-hydrate saturations.
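
    A sketch of the standard Archie-equation step used in gas-hydrate well-log analysis (the paper's washout correction itself is more involved and is not reproduced here). All constants below (a, m, n, Rw, porosity) are illustrative assumptions, not values from the study.

    def hydrate_saturation(rt, rw=0.2, phi=0.35, a=1.0, m=2.0, n=2.0):
        """Water saturation from Archie's law; hydrate is assumed to fill the rest."""
        sw = ((a * rw) / (phi ** m * rt)) ** (1.0 / n)
        return 1.0 - min(sw, 1.0)

    for rt in (1.0, 2.0, 4.0):                      # formation resistivity, ohm-m
        print(f"Rt = {rt} ohm-m -> hydrate saturation ~ {hydrate_saturation(rt):.0%}")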

  16. Lead dust in Broken Hill homes: effect of remediation on indoor lead levels.

    PubMed

    Boreland, F; Lyle, D M

    2006-02-01

    This study was undertaken to determine whether home remediation effectively reduced indoor lead levels in Broken Hill, a long-established silver-lead-zinc mining town in outback Australia. A before-after study of the effect of home remediation on indoor lead levels was embedded into a randomized controlled trial of the effectiveness of remediation for reducing elevated blood lead levels in young children. Moist towelettes were used to measure lead loading (microg/m2) on internal windowsills and internal and entry floors of 98 homes; samples were collected before, immediately after, and 2, 4, 6, 8, and 10 months after remediation. Data were log10-transformed for the analysis. Remediation reduced average indoor lead levels by approximately 50%, and lead levels remained low for the duration of the follow-up period (10 months). The greatest gains were made in homes with the highest initial lead levels; homes with low preremediation lead levels showed little or no benefit. Before remediation, homes located in areas with high soil lead levels or with "poor" dust proofing had higher lead levels than those in areas with lower soil lead levels or with "medium" or "good" dust proofing; these relative differences remained after remediation. There was no evidence that lead loading was reduced by an increased opportunity to become aware of lead issues. We conclude that remediation is an effective strategy for reducing the lead exposure of children living in homes with high indoor lead levels.

  17. Effects of Milk vs Dark Chocolate Consumption on Visual Acuity and Contrast Sensitivity Within 2 Hours: A Randomized Clinical Trial.

    PubMed

    Rabin, Jeff C; Karunathilake, Nirmani; Patrizi, Korey

    2018-04-26

    Consumption of dark chocolate can improve blood flow, mood, and cognition in the short term, but little is known about the possible effects of dark chocolate on visual performance. To compare the short-term effects of consumption of dark chocolate with those of milk chocolate on visual acuity and large- and small-letter contrast sensitivity. A randomized, single-masked crossover design was used to assess short-term visual performance after consumption of a dark or a milk chocolate bar. Thirty participants without pathologic eye disease each consumed dark and milk chocolate in separate sessions, and within-participant paired comparisons were used to assess outcomes. Testing was conducted at the Rosenberg School of Optometry from June 25 to August 15, 2017. Visual acuity (in logMAR units) and large- and small-letter contrast sensitivity (in the log of the inverse of the minimum detectable contrast [logCS units]) were measured 1.75 hours after consumption of dark and milk chocolate bars. Among the 30 participants (9 men and 21 women; mean [SD] age, 26 [5] years), small-letter contrast sensitivity was significantly higher after consumption of dark chocolate (mean [SE], 1.45 [0.04] logCS) vs milk chocolate (mean [SE], 1.30 [0.05] logCS; mean improvement, 0.15 logCS [95% CI, 0.08-0.22 logCS]; P < .001). Large-letter contrast sensitivity was slightly higher after consumption of dark chocolate (mean [SE], 2.05 [0.02] logCS) vs milk chocolate (mean [SE], 2.00 [0.02] logCS; mean improvement, 0.05 logCS [95% CI, 0.00-0.10 logCS]; P = .07). Visual acuity improved slightly after consumption of dark chocolate (mean [SE], -0.22 [0.01] logMAR; visual acuity, approximately 20/12) and milk chocolate (mean [SE], -0.18 [0.01] logMAR; visual acuity, approximately 20/15; mean improvement, 0.04 logMAR [95% CI, 0.02-0.06 logMAR]; P = .05). Composite scores combining results from all tests showed significant improvement after consumption of dark compared with milk chocolate (mean improvement, 0.20 log U [95% CI, 0.10-0.30 log U]; P < .001). Contrast sensitivity and visual acuity were significantly higher 2 hours after consumption of a dark chocolate bar compared with a milk chocolate bar, but the duration of these effects and their influence in real-world performance await further testing. clinicaltrials.gov Identifier: NCT03326934.
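
    Worked arithmetic for the acuity values quoted above: logMAR is log10 of the minimum angle of resolution in arcminutes, so the Snellen denominator is 20 × 10^logMAR.

    for logmar in (-0.22, -0.18):
        denom = 20 * 10 ** logmar
        print(f"logMAR {logmar:+.2f} -> Snellen ~ 20/{denom:.0f}")
    # -0.22 -> about 20/12; -0.18 -> about 20/13 (quoted as roughly 20/15 above)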

  18. Estimating the Aqueous Solubility of Pharmaceutical Hydrates

    PubMed Central

    Franklin, Stephen J.; Younis, Usir S.; Myrdal, Paul B.

    2016-01-01

    Estimation of crystalline solute solubility is well documented throughout the literature. However, the anhydrous crystal form is typically considered with these models, which is not always the most stable crystal form in water. In this study an equation which predicts the aqueous solubility of a hydrate is presented. This research attempts to extend the utility of the ideal solubility equation by incorporating desolvation energetics of the hydrated crystal. Similar to the ideal solubility equation, which accounts for the energetics of melting, this model approximates the energy of dehydration to the entropy of vaporization for water. Aqueous solubilities, dehydration and melting temperatures, and log P values were collected experimentally and from the literature. The data set includes different hydrate types and a range of log P values. Three models are evaluated, the most accurate model approximates the entropy of dehydration (ΔSd) by the entropy of vaporization (ΔSvap) for water, and utilizes onset dehydration and melting temperatures in combination with log P. With this model, the average absolute error for the prediction of solubility of 14 compounds was 0.32 log units. PMID:27238488
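
    One plausible rendering of the model class described above, not the paper's exact fitted equation: an ideal-solubility-style melting term (Walden's rule entropy, ~56.5 J/mol/K), a dehydration term that borrows the entropy of vaporization of water (~109 J/mol/K), and a -log P partitioning term with a 0.5 intercept as in the general solubility equation. The example compound is hypothetical.

    R = 8.314                 # J/(mol K)
    DS_VAP_WATER = 109.0      # J/(mol K), entropy of vaporization of water

    def log_s_hydrate(tm_c, td_c, log_p, t_c=25.0, ds_melt=56.5):
        """Rough estimated aqueous log solubility of a hydrate at t_c Celsius."""
        t = t_c + 273.15
        melt = -ds_melt * (tm_c - t_c) / (2.303 * R * t) if tm_c > t_c else 0.0
        dehyd = -DS_VAP_WATER * (td_c - t_c) / (2.303 * R * t) if td_c > t_c else 0.0
        return melt + dehyd - log_p + 0.5

    # hypothetical monohydrate: melts at 180 C, dehydrates at 95 C, log P = 1.5
    print(round(log_s_hydrate(180, 95, 1.5), 2))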

  19. Application of edible coating with starch and carvacrol in minimally processed pumpkin.

    PubMed

    Santos, Adriele R; da Silva, Alex F; Amaral, Viviane C S; Ribeiro, Alessandra B; de Abreu Filho, Benicio A; Mikcha, Jane M G

    2016-04-01

    The present study evaluated the effect of an edible coating of cassava starch and carvacrol in minimally processed pumpkin (MPP). The minimal inhibitory concentration (MIC) of carvacrol against Escherichia coli, Salmonella enterica serotype Typhimurium, Aeromonas hydrophila, and Staphylococcus aureus was determined. The edible coating that contained carvacrol at the MIC and 2 × MIC was applied to MPP, and effects were evaluated with regard to the survival of experimentally inoculated bacteria and autochthonous microflora in MPP. Total titratable acidity, pH, weight loss, and soluble solids over 7 days of storage under refrigeration were also analyzed. The MIC of carvacrol was 312 μg/ml. Carvacrol at the MIC reduced the counts of E. coli and S. Typhimurium by approximately 5 log CFU/g. A. hydrophila was reduced by approximately 8 log CFU/g, and S. aureus was reduced by approximately 2 log CFU/g on the seventh day of storage. Carvacrol at the 2 × MIC completely inhibited all isolates on the first day of storage. Coliforms at 35 °C and 45 °C were not detected (< 3 MPN/g) with either treatment on all days of shelf life. The treatment groups exhibited a reduction of approximately 2 log CFU/g in psychrotrophic counts compared with controls on the last day of storage. Yeast and mold were not detected with either treatment over the same period. The addition of carvacrol did not affect total titratable acidity, pH, or soluble solids and improved weight loss. The edible coating of cassava starch with carvacrol may be an interesting approach to improve the safety and microbiological quality of MPP.

  20. Efficacy of chlorine dioxide against Listeria monocytogenes in brine chilling solutions.

    PubMed

    Valderrama, W B; Mills, E W; Cutter, C N

    2009-11-01

    Chilled brine solutions are used by the food industry to rapidly cool ready-to-eat meat products after cooking and before packaging. Chlorine dioxide (ClO(2)) was investigated as an antimicrobial additive to eliminate Listeria monocytogenes. Several experiments were performed using brine solutions made of sodium chloride (NaCl) and calcium chloride (CaCl(2)) inoculated with L. monocytogenes and/or treated with 3 ppm of ClO(2). First, 10 and 20% CaCl(2) and NaCl solutions (pH 7.0) were inoculated with a five-strain cocktail of L. monocytogenes to obtain approximately 7 log CFU/ml and incubated 8 h at 0 degrees C. The results demonstrated that L. monocytogenes survived in 10% CaCl(2), 10 and 20% NaCl, and pure water. L. monocytogenes levels were reduced approximately 1.2 log CFU/ml in 20% CaCl(2). Second, treating inoculated (approximately 7 log CFU/ml) brine solutions (10 and 20% NaCl and 10% CaCl(2)) with 3 ppm of ClO(2) resulted in an approximately 4-log reduction of the pathogen within 90 s. The same was not observed in a solution of 20% CaCl(2); further investigation demonstrated that high levels of divalent cations interfere with the disinfectant. Spent brine solutions from hot dog and ham chilling were treated with ClO(2) at concentrations of 3 or 30 ppm. At these concentrations, ClO(2) did not reduce L. monocytogenes. Removal of divalent cations and organic material in brine solutions prior to disinfection with ClO(2) should be investigated to improve the efficacy of the compound against L. monocytogenes. The information from this study may be useful to processing establishments and researchers who are investigating antimicrobials in chilling brine solutions.

  1. Behavior of shiga toxin-producing Escherichia coli, enteroinvasive E. coli, enteropathogenic E. coli and enterotoxigenic E. coli strains on whole and sliced jalapeño and serrano peppers.

    PubMed

    Gómez-Aldapa, Carlos A; Rangel-Vargas, Esmeralda; Gordillo-Martínez, Alberto J; Castro-Rosas, Javier

    2014-06-01

    The behavior of enterotoxigenic Escherichia coli (ETEC), enteropathogenic E. coli (EPEC), enteroinvasive E. coli (EIEC) and non-O157 shiga toxin-producing E. coli (non-O157-STEC) on whole and sliced jalapeño and serrano peppers, as well as in blended sauce, at 25 ± 2 °C and 3 ± 2 °C was investigated. Chili peppers were collected from markets in Pachuca city, Hidalgo, Mexico. On whole serrano and jalapeño peppers stored at 25 ± 2 °C or 3 ± 2 °C, no growth was observed for the rifampicin-resistant EPEC, ETEC, EIEC and non-O157-STEC strains. After twelve days at 25 ± 2 °C, all diarrheagenic E. coli pathotype (DEP) strains had decreased by a total of approximately 3.7 log on serrano peppers and approximately 2.8 log on jalapeño peppers; at 3 ± 2 °C they decreased by approximately 2.5 and 2.2 log on serrano and jalapeño peppers, respectively. All E. coli pathotypes grew on sliced chili peppers and in blended sauce: after 24 h at 25 ± 2 °C, all pathotypes had grown to approximately 3 and 4 log CFU on pepper slices and sauce, respectively. At 3 ± 2 °C bacterial growth was inhibited. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Statistical distributions of ultra-low dose CT sinograms and their fundamental limits

    NASA Astrophysics Data System (ADS)

    Lee, Tzu-Cheng; Zhang, Ruoqiao; Alessio, Adam M.; Fu, Lin; De Man, Bruno; Kinahan, Paul E.

    2017-03-01

    Low-dose CT imaging is typically constrained to be diagnostic. However, there are applications for even lower-dose CT imaging, including image registration across multi-frame CT images and attenuation correction for PET/CT imaging. We define this as the ultra-low-dose (ULD) CT regime, where the exposure level is a factor of 10 lower than current low-dose CT technique levels. In the ULD regime it is possible to use statistically principled image reconstruction methods that make full use of the raw data information. Since most statistical iterative reconstruction methods assume that the post-log noise distribution is close to Poisson or Gaussian, our goal is to understand the statistical distribution of ULD CT data with different non-positivity correction methods, and to understand when iterative reconstruction methods may be effective in producing images that are useful for image registration or attenuation correction in PET/CT imaging. We first used phantom measurements and calibrated simulation to reveal how the noise distribution deviates from the normal assumption in the ULD CT flux environment. In summary, our results indicate that there are three general regimes: (1) diagnostic CT, where post-log data are well modeled by a normal distribution; (2) low-dose CT, where the normal distribution remains a reasonable approximation and statistically principled (post-log) methods that assume a normal distribution have an advantage; and (3) a ULD regime that is photon-starved, where the quadratic approximation is no longer effective. For instance, a total integral density of 4.8 (the ideal pi for 24 cm of water) at 120 kVp and 0.5 mAs is the maximum pi value for which a definitive maximum-likelihood estimate could be found. This leads to fundamental limits in the estimation of ULD CT data when using a standard data-processing stream.
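
    A small simulation in the spirit of the three regimes described above: transmission counts are Poisson, and after the log transform the noise is near-Gaussian at diagnostic flux but strongly non-Gaussian and biased when photon-starved. The incident photon counts and the non-positivity correction below are illustrative, not the paper's technique settings.

    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(2)
    pi_line = 4.8                                  # target line integral (about 24 cm of water)
    for n0 in (1e5, 1e3, 30):                      # incident photons per detector element
        counts = rng.poisson(n0 * np.exp(-pi_line), 200_000)
        counts = np.maximum(counts, 1)             # crude non-positivity correction
        post_log = -np.log(counts / n0)            # post-log line-integral estimate
        print(f"N0 = {n0:8.0f}: mean estimate = {post_log.mean():.2f} "
              f"(target 4.80), skewness = {skew(post_log):.2f}")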

  3. Experimental test of postfire management in pine forests: impact of salvage logging versus partial cutting and nonintervention on bird-species assemblages.

    PubMed

    Castro, Jorge; Moreno-Rueda, Gregorio; Hódar, José A

    2010-06-01

    There is an intense debate about the effects of postfire salvage logging versus nonintervention policies on regeneration of forest communities, but scant information from experimental studies is available. We manipulated a burned forest area on a Mediterranean mountain to experimentally analyze the effect of salvage logging on bird-species abundance, diversity, and assemblage composition. We used a randomized block design with three plots of approximately 25 ha each, established along an elevational gradient in a recently burned area in Sierra Nevada Natural and National Park (southeastern Spain). Three replicates of three treatments differing in postfire burned-wood management were established per plot: salvage logging, nonintervention, and an intermediate degree of intervention (felling and lopping most of the trees but leaving all the biomass). Starting 1 year after the fire, we used point sampling to monitor bird abundance in each treatment for 2 consecutive years during the breeding and winter seasons (720 censuses total). Postfire burned-wood management altered species assemblages. Salvage-logged areas had species typical of open- and early-successional habitats. Bird species that inhabit forests were still present in the unsalvaged treatments even though trees were burned, but were almost absent in salvage-logged areas. Indeed, the main dispersers of mid- and late-successional shrubs and trees, such as thrushes (Turdus spp.) and the European Jay (Garrulus glandarius), were almost restricted to unsalvaged treatments. Salvage logging might thus hamper the natural regeneration of the forest through its impact on assemblages of bird species. Moreover, salvage logging reduced species abundance by approximately 50% and species richness by approximately 40%. The highest diversity at the landscape level (gamma diversity) resulted from a combination of all treatments. Salvage logging may be positive for bird conservation if combined in a mosaic with other, less-aggressive postfire management, but stand-wide management with harvest operations has undesirable conservation effects.

  4. Morphometric analyses of hominoid crania, probabilities of conspecificity and an approximation of a biological species constant.

    PubMed

    Thackeray, J F; Dykes, S

    2016-02-01

    Thackeray has previously explored the possibility of using a morphometric approach to quantify the "amount" of variation within species and to assess probabilities of conspecificity when two fossil specimens are compared, instead of "pigeon-holing" them into discrete species. In an attempt to obtain a statistical (probabilistic) definition of a species, Thackeray has recognized an approximation of a biological species constant (T = -1.61) based on the log-transformed standard error of the coefficient m (log sem) in regression analyses of cranial and other data from pairs of specimens of conspecific extant species. These analyses use regression equations of the form y = mx + c, where m is the slope and c is the intercept, fitted to measurements of any specimen A (x axis) against any specimen B of the same species (y axis). The log-transformed standard error of the coefficient m (log sem) is a measure of the degree of similarity between pairs of specimens, and in this study shows a central tendency around a mean value of -1.61 with standard deviation 0.10 for modern conspecific specimens. In this paper we focus attention on the need to take into account the range of difference in log sem values (Δlog sem, or "delta log sem") obtained when specimen A (x axis) is compared to B (y axis) and, secondly, when specimen A (y axis) is compared to B (x axis). Thackeray's approach can be refined to focus on high probabilities of conspecificity for pairs of specimens for which log sem is less than -1.61 and Δlog sem is less than 0.03. We appeal for the adoption of a concept here called "sigma taxonomy" (as opposed to "alpha taxonomy"), recognizing that boundaries between species are not always well defined. Copyright © 2015 Elsevier GmbH. All rights reserved.
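
    A minimal sketch of the log sem statistic described above: regress the measurements of specimen B on those of specimen A, take the standard error of the slope m, and log10-transform it; Δlog sem compares the two regression directions. The measurement vectors are synthetic stand-ins for cranial data.

    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(3)
    a = rng.uniform(50, 150, 12)                   # 12 measurements of specimen A
    b = a * rng.normal(1.0, 0.03, a.size)          # conspecific-like specimen B

    def log_sem(x, y):
        return np.log10(linregress(x, y).stderr)   # stderr of the slope m

    print("log sem (A on x):", round(log_sem(a, b), 2))
    print("log sem (B on x):", round(log_sem(b, a), 2))
    print("delta log sem   :", round(abs(log_sem(a, b) - log_sem(b, a)), 3))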

  5. Main controlling factors and forecasting models of lead accumulation in earthworms based on low-level lead-contaminated soils.

    PubMed

    Tang, Ronggui; Ding, Changfeng; Ma, Yibing; Wan, Mengxue; Zhang, Taolin; Wang, Xingxiang

    2018-06-02

    To explore the main controlling factors in soil and to build a predictive model relating lead concentrations in earthworms (Pb_earthworm) to soil physicochemical parameters, 13 soils with low levels of lead contamination were used in earthworm toxicity experiments. The results indicated that relatively high bioaccumulation factors appeared in soils with low pH values. The lead concentrations in earthworms and soils, after log transformation, were significantly positively correlated (R² = 0.46, P < 0.0001, n = 39). Stepwise multiple linear regression yielded a fitted empirical model between Pb_earthworm and the soil physicochemical properties: log(Pb_earthworm) = 0.96 log(Pb_soil) - 0.74 log(OC) - 0.22 pH + 0.95 (R² = 0.66, n = 39). Furthermore, path analysis confirmed that the Pb concentration in soil (Pb_soil), soil pH, and soil organic carbon (OC) were the primary controlling factors of Pb_earthworm, with high path coefficients (0.71, -0.51, and -0.49, respectively). The predictive model, based on Pb_earthworm in a nationwide range of soils with low-level lead contamination, could provide a reference for establishing safety thresholds in Pb-contaminated soils from the perspective of soil-animal systems.
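
    The fitted empirical model quoted above, wrapped as a function. Base-10 logarithms are assumed (the abstract does not restate the base), and the input units (soil Pb in mg/kg, organic carbon in percent) are our reading of the study; check the paper before using it quantitatively.

    import math

    def pb_earthworm(pb_soil, oc, ph):
        """Predicted earthworm Pb from soil Pb, organic carbon, and pH."""
        log_pb = (0.96 * math.log10(pb_soil) - 0.74 * math.log10(oc)
                  - 0.22 * ph + 0.95)
        return 10 ** log_pb

    print(round(pb_earthworm(pb_soil=100, oc=1.5, ph=6.0), 1))  # hypothetical soil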

  6. Logging impacts on avian species richness and composition differ across latitudes relative to foraging and breeding habitat preferences

    USGS Publications Warehouse

    LaManna, Joseph A.; Martin, Thomas E.

    2017-01-01

    Understanding the causes underlying changes in species diversity is a fundamental pursuit of ecology. Animal species richness and composition often change with decreased forest structural complexity associated with logging. Yet differences in latitude and forest type may strongly influence how species diversity responds to logging. We performed a meta-analysis of logging effects on local species richness and composition of birds across the world and assessed responses by different guilds (nesting strata, foraging strata, diet, and body size). This approach allowed identification of species attributes that might underlie responses to this anthropogenic disturbance. We only examined studies that allowed forests to regrow naturally following logging, and accounted for logging intensity, spatial extent, successional regrowth after logging, and the change in species composition expected due to random assembly from regional species pools. Selective logging in the tropics and clearcut logging in temperate latitudes caused loss of species from nearly all forest strata (ground to canopy), leading to substantial declines in species richness (up to 27% of species). Few species were lost or gained following any intensity of logging in lower-latitude temperate forests, but the relative abundances of these species changed substantially. Selective logging at higher-temperate latitudes generally replaced late-successional specialists with early-successional specialists, leading to no net changes in species richness but large changes in species composition. Removing less basal area during logging mitigated the loss of avian species from all forests and, in some cases, increased diversity in temperate forests. This meta-analysis provides insights into the important role of habitat specialization in determining differential responses of animal communities to logging across tropical and temperate latitudes.

  7. Logging impacts on avian species richness and composition differ across latitudes and foraging and breeding habitat preferences.

    PubMed

    LaManna, Joseph A; Martin, Thomas E

    2017-08-01

    Understanding the causes underlying changes in species diversity is a fundamental pursuit of ecology. Animal species richness and composition often change with decreased forest structural complexity associated with logging. Yet differences in latitude and forest type may strongly influence how species diversity responds to logging. We performed a meta-analysis of logging effects on local species richness and composition of birds across the world and assessed responses by different guilds (nesting strata, foraging strata, diet, and body size). This approach allowed identification of species attributes that might underlie responses to this anthropogenic disturbance. We only examined studies that allowed forests to regrow naturally following logging, and accounted for logging intensity, spatial extent, successional regrowth after logging, and the change in species composition expected due to random assembly from regional species pools. Selective logging in the tropics and clearcut logging in temperate latitudes caused loss of species from nearly all forest strata (ground to canopy), leading to substantial declines in species richness (up to 27% of species). Few species were lost or gained following any intensity of logging in lower-latitude temperate forests, but the relative abundances of these species changed substantially. Selective logging at higher-temperate latitudes generally replaced late-successional specialists with early-successional specialists, leading to no net changes in species richness but large changes in species composition. Removing less basal area during logging mitigated the loss of avian species from all forests and, in some cases, increased diversity in temperate forests. This meta-analysis provides insights into the important role of habitat specialization in determining differential responses of animal communities to logging across tropical and temperate latitudes. © 2016 Cambridge Philosophical Society.

  8. Environmental management in resource-rich Alberta, Canada: first world jurisdiction, Third World analogue?

    PubMed

    Timoney, K; Lee, P

    2001-12-01

    Economic growth is frequently touted as a cure for environmental ills, particularly for those in Third World countries. Here we examine that paradigm in a case study of Alberta, Canada, a wealthy, resource-rich province within a wealthy nation. Through provincial-scale datasets, we examine the increasing pressures of the forest, petroleum, and agricultural industries upon the ecosystems of Alberta within management, economic, and political contexts. We advance the thesis that economic activity leads to environmental degradation unless ecosystem-based management is integrated into economic decision making. Agricultural lands cover 31.7%, and forest management areas leased to industry cover 33.4% of Alberta; both continue to increase in extent. The rate of logging (focused on old-growth by government policy) continues a decades-long exponential rise. Current Alberta annual petroleum production is 52.5 million m3 crude oil and 117 billion m3 of gas. As of early 1999, there were approximately 199,025 oil and gas wells and a conservative total of approximately 1.5-1.8 million km of seismic lines in Alberta. Fire occurrence data indicate no downward trends in annual area burned by wildfire, which may be characterized as driven by climate and inherently variable. When logging and wildfire are combined, the annual allowable cut in Alberta is unsustainable, even when only timber supply is considered and the effects of expanding agriculture and oil and gas activities are ignored. Ecosystem degradation in Alberta is pervasive and contrasts prominently with a high standard of living. A wealth of ecological data exists that indicates current resource-based economic activities are non-sustainable and destructive of ecosystem health yet these data are not considered within the economic decision making process. Given the complex, compounded, and increasing ecosystem perturbations, a future of unpleasant ecological surprises is likely. We conclude with tentative predictions as to where current trends in Alberta may lead if decisions biased against ecosystems continue.

  9. Testing and analysis of internal hardwood log defect prediction models

    Treesearch

    R. Edward Thomas

    2011-01-01

    The severity and location of internal defects determine the quality and value of lumber sawn from hardwood logs. Models have been developed to predict the size and position of internal defects based on external defect indicator measurements. These models were shown to predict approximately 80% of all internal knots based on external knot indicators. However, the size...

  10. Study of weak corrections to Drell-Yan, top-quark pair, and dijet production at high energies with MCFM

    DOE PAGES

    Campbell, John M.; Wackeroth, Doreen; Zhou, Jia

    2016-11-29

    Electroweak (EW) corrections can be enhanced at high energies due to the soft or collinear radiation of virtual and real W and Z bosons, which results in Sudakov-like corrections of the form α_W^l log^n(Q²/M²_{W,Z}), where α_W = α/(4π sin²θ_W) and n ≤ 2l-1. The inclusion of EW corrections in predictions for hadron colliders is therefore especially important when searching for signals of possible new physics in distributions probing the kinematic regime Q² ≫ M²_V. Next-to-leading order (NLO) EW corrections should also be taken into account when their size [O(α)] is comparable to that of QCD corrections at next-to-next-to-leading order (NNLO) [O(α_s²)]. To this end, we have implemented the NLO weak corrections to the neutral-current Drell-Yan process, top-quark pair production, and dijet production in the parton-level Monte Carlo program MCFM. This enables a combined study with the corresponding QCD corrections at NLO and NNLO. We provide both the full NLO weak corrections and their Sudakov approximation, since the latter is often used for a fast evaluation of weak effects at high energies and can be extended to higher orders. Finally, with both the exact and approximate results at hand, the validity of the Sudakov approximation can be readily quantified.
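
    Illustrative arithmetic only: the size of the leading Sudakov double logarithm α_W log²(Q²/M_W²) at TeV energies, using the abstract's normalization of α_W and standard electroweak-scale values; this is why such corrections matter in high-Q² tails.

    import math
    alpha = 1 / 128.0                    # QED coupling near the EW scale
    sin2w = 0.231                        # sin^2(theta_W)
    m_w = 80.4                           # W mass, GeV
    alpha_w = alpha / (4 * math.pi * sin2w)
    for q in (500.0, 1000.0, 2000.0):    # GeV
        dl = alpha_w * math.log(q**2 / m_w**2) ** 2
        print(f"Q = {q:6.0f} GeV: alpha_W * log^2 ~ {dl:.1%}")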

  11. Avian responses to selective logging shaped by species traits and logging practices

    PubMed Central

    Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S.; Koh, Lian Pin

    2015-01-01

    Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673

  12. SU-F-T-177: Impacts of Gantry Angle Dependent Scanning Beam Properties for Proton Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y; Clasie, B; Lu, H

    Purpose: In pencil beam scanning (PBS), the delivered spot MU, position, and size differ slightly at different gantry angles. We investigated the level of delivery uncertainty at different gantry angles through a log file analysis. Methods: 34 PBS fields covering the full 360-degree range of gantry angles were collected retrospectively from 28 patients treated at our institution. All fields were delivered at zero gantry angle and at the prescribed gantry angle, and measured at isocenter with the MatriXX 2D array detector at the prescribed gantry angle. The machine log files were analyzed to extract the delivered MU per spot and the beam position from the strip ionization chambers in the treatment nozzle. The beam size was separately measured as a function of gantry angle and beam energy. Using this information, the dose was calculated in a water phantom at both gantry angles and compared to the measurement using the 3D γ-index at 2 mm/2%. Results: The spot-by-spot difference between the beam positions in the log files from the deliveries at the two gantry angles has a mean of 0.3 and 0.4 mm and a standard deviation of 0.6 and 0.7 mm for the x and y directions, respectively. Similarly, the spot-by-spot difference between the MU in the log files from the deliveries at the two gantry angles has a mean of 0.01% and a standard deviation of 0.7%. These small deviations lead to excellent agreement in the dose calculations, with an average γ pass rate across all fields of approximately 99.7%. When each calculation is compared to the measurement, a high correlation in γ is also found. Conclusion: Using machine log files, we verified that deviations in PBS beam delivery at different gantry angles are sufficiently small relative to the planned spot positions and MU. This study brings us one step closer to simplifying our patient-specific QA.
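
    A simplified one-dimensional gamma-index sketch (the study uses a full 3D gamma at 2 mm/2%): each reference point searches the evaluated profile for the best combined distance-to-agreement and dose-difference score, and a point passes when gamma ≤ 1. The dose profiles are toys.

    import numpy as np

    def gamma_1d(x, dose_ref, dose_eval, dta_mm=2.0, dd_frac=0.02):
        dmax = dose_ref.max()                      # global normalization
        g = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dist2 = ((x - xi) / dta_mm) ** 2
            dd2 = ((dose_eval - di) / (dd_frac * dmax)) ** 2
            g[i] = np.sqrt(np.min(dist2 + dd2))
        return g

    x = np.linspace(0, 100, 201)                   # positions in mm
    ref = np.exp(-((x - 50) / 15) ** 2)            # toy reference dose profile
    ev = 1.01 * np.exp(-((x - 50.5) / 15) ** 2)    # shifted, rescaled copy
    g = gamma_1d(x, ref, ev)
    print(f"gamma pass rate (g <= 1): {np.mean(g <= 1):.1%}")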

  13. Singular and combined effects of blowdown, salvage logging, and wildfire on forest floor and soil mercury pools.

    PubMed

    Mitchell, Carl P J; Kolka, Randall K; Fraver, Shawn

    2012-08-07

    A number of factors influence the amount of mercury (Hg) in forest floors and soils, including deposition, volatile emission, leaching, and disturbances such as fire. Currently the impact on soil Hg pools from other widespread forest disturbances such as blowdown and management practices like salvage logging are unknown. Moreover, ecological and biogeochemical responses to disturbances are generally investigated within a single-disturbance context, with little currently known about the impact of multiple disturbances occurring in rapid succession. In this study we capitalize on a combination of blowdown, salvage logging and fire events in the sub-boreal region of northern Minnesota to assess both the singular and combined effects of these disturbances on forest floor and soil total Hg concentrations and pools. Although none of the disturbance combinations affected Hg in mineral soil, we did observe significant effects on both Hg concentrations and pools in the forest floor. Blowdown increased the mean Hg pool in the forest floor by 0.76 mg Hg m(-2) (223%). Salvage logging following blowdown created conditions leading to a significantly more severe forest floor burn during wildfire, which significantly enhanced Hg emission. This sequence of combined events resulted in a mean loss of approximately 0.42 mg Hg m(-2) (68% of pool) from the forest floor, after conservatively accounting for potential losses via enhanced soil leaching and volatile emissions between the disturbance and sampling dates. Fire alone or blowdown followed by fire did not significantly affect the total Hg concentrations or pools in the forest floor. Overall, unexpected consequences for soil Hg accumulation and by extension, atmospheric Hg emission and risk to aquatic biota, may result when combined impacts are considered in addition to singular forest floor and soil disturbances.

  14. Earth Observations

    NASA Image and Video Library

    2010-07-25

    ISS024-E-009526 (25 July 2010) --- The Dominic Point Fire in Montana is featured in this image photographed by an Expedition 24 crew member on the International Space Station. Lightning strikes in the forested mountains of the western United States, as well as human activities, can spark wild fires during the summer dry season. The Dominic Point Fire was first reported near 3:00 p.m. local time on July 25, 2010. Approximately one hour later, the space station crew photographed the fire's large smoke plume, already extending at least eight kilometers to the east, from orbit as they passed almost directly overhead. Forest Service fire crews, slurry bombers and helicopters were on the scene by that evening. The fire may have been started by a lightning strike, as there are no trails leading into the fire area located approximately 22 kilometers northeast of Hamilton, MT (according to local reports). As of July 26, 2010 the fire had burned approximately 283-405 hectares of the Bitterroot National Forest in western Montana. The fire is thought to have expanded quickly due to high temperatures, low humidity, and favorable winds, with an abundance of deadfall (dead trees and logs that provide readily combustible fuels) in the area.

  15. Wane detection on rough lumber using surface approximation

    Treesearch

    Sang-Mook Lee; A. Lynn Abbott; Daniel L. Schmoldt

    2000-01-01

    The initial breakdown of hardwood logs into lumber produces boards with rough surfaces. These boards contain wane (missing wood due to the curved log exterior) that is removed by edge and trim cuts prior to sale. Because hardwood lumber value is determined using a combination of board size and quality, knowledge of wane position and defects is essential for selecting...

  16. 77 FR 55459 - Takes of Marine Mammals Incidental to Specified Activities; Piling and Fill Removal in Woodard...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-10

    ... and are expected to occur for approximately 20 days over the course of this work window. Other work... log boom haul-outs located in the action area. Other potential disturbance could result from the..., primarily by flushing seals off log booms, or by causing short-term avoidance of the area or similar short...

  17. Estimating the Aqueous Solubility of Pharmaceutical Hydrates.

    PubMed

    Franklin, Stephen J; Younis, Usir S; Myrdal, Paul B

    2016-06-01

    Estimation of crystalline solute solubility is well documented throughout the literature. However, the anhydrous crystal form is typically considered with these models, which is not always the most stable crystal form in water. In this study, an equation which predicts the aqueous solubility of a hydrate is presented. This research attempts to extend the utility of the ideal solubility equation by incorporating desolvation energetics of the hydrated crystal. Similar to the ideal solubility equation, which accounts for the energetics of melting, this model approximates the energy of dehydration to the entropy of vaporization for water. Aqueous solubilities, dehydration and melting temperatures, and log P values were collected experimentally and from the literature. The data set includes different hydrate types and a range of log P values. Three models are evaluated, the most accurate model approximates the entropy of dehydration (ΔSd) by the entropy of vaporization (ΔSvap) for water, and utilizes onset dehydration and melting temperatures in combination with log P. With this model, the average absolute error for the prediction of solubility of 14 compounds was 0.32 log units. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  18. Shear viscosity of the quark-gluon plasma in a weak magnetic field in perturbative QCD: Leading log

    NASA Astrophysics Data System (ADS)

    Li, Shiyong; Yee, Ho-Ung

    2018-03-01

    We compute the shear viscosity of a two-flavor QCD plasma in an external magnetic field in perturbative QCD at leading-log order, assuming that the magnetic field is weak or soft: eB ~ g^4 log(1/g) T^2. We work under the assumption that the magnetic field is homogeneous and static, and that the electrodynamics is nondynamical, in a formal limit e → 0 while eB is kept fixed. We show that the shear viscosity takes the form η = η̄(B̄) T^3/(g^4 log(1/g)), with a dimensionless function η̄(B̄) of the dimensionless variable B̄ = (eB)/(g^4 log(1/g) T^2). The variable B̄ corresponds to the relative strength of the effect of cyclotron motions compared to QCD collisions: B̄ ~ l_mfp/l_cyclo. We provide a full numerical result for the scaled shear viscosity η̄(B̄).

  19. Log-Normal Turbulence Dissipation in Global Ocean Models

    NASA Astrophysics Data System (ADS)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality—robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
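
    A small check in the spirit of the analysis above: draw log-normal "dissipation" samples and confirm that the skewness and excess kurtosis of log(eps) sit near zero, their Gaussian values; departures of these moments are what quantify deviations from log-normality. Parameters are illustrative.

    import numpy as np
    from scipy.stats import skew, kurtosis

    rng = np.random.default_rng(4)
    eps = rng.lognormal(mean=-20.0, sigma=2.0, size=100_000)   # dissipation proxy
    log_eps = np.log(eps)
    print("skewness of log(eps)       :", round(skew(log_eps), 3))      # ~0
    print("excess kurtosis of log(eps):", round(kurtosis(log_eps), 3))  # ~0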

  20. The fate of verocytotoxigenic Escherichia coli C600φ3538(Δvtx2::cat) and its vtx2 prophage during grass silage preparation.

    PubMed

    Nyambe, S; Burgess, C; Whyte, P; O'Kiely, P; Bolton, D

    2017-05-01

    Silage is grass, preserved by fermentation and used as winter feed for cattle. The impact of a range of current grass silage preparation practices on the survival of Escherichia coli C600φ3538(Δvtx2::cat) and on the induction, release and infectivity of free phage was investigated. Wilted and fresh grass samples, from plots with and without slurry application, were ensiled with or without formic acid. Each treatment combination was inoculated with approximately 6 log10 CFU per g E. coli C600φ3538(Δvtx2::cat) (donor strain) and E. coli C600::kanamycinR (recipient strain) in test-tube model silos and incubated in the dark at 15°C. The physico-chemical (pH, ammonia, ethanol, lactic acid and volatile fatty acids) and microbiological (total viable counts, TVC; total Enterobacteriaceae counts, TEC; E. coli counts, ECC; and lactic acid bacteria, LAB) properties of each fermentation were monitored throughout the experiment, as were the concentrations of E. coli C600φ3538(Δvtx2::cat), E. coli C600::kanamycinR, free phage and transductants, using culture- and PCR-based methods. Over the course of the experiment the pH of the grass samples typically decreased by 2 pH units. TVC, TEC and ECC decreased by up to 2·3, 6·4 and 6·2 log10 CFU per g, respectively, while the LAB counts remained relatively stable at 5·2-7·1 log10 CFU per g. Both donor and recipient strains decreased by approximately 5 log10 CFU per g. Free phages were detected in all treatments, and transductants were detected and confirmed by PCR in the silo containing wilted grass pretreated with slurry and ensiled without formic acid. Verocytotoxigenic E. coli may survive the ensiling process, and the conditions encountered are sufficient to induce the vtx2 bacteriophage, leading to low levels of phage-mediated vtx2 gene transfer. These studies suggest that the ensiling of grass may create an environment which facilitates the emergence of new verocytotoxigenic E. coli. © 2017 The Society for Applied Microbiology.

  1. Opacity, metallicity, and Cepheid period ratios in the galaxy and Magellanic Clouds

    NASA Technical Reports Server (NTRS)

    Simon, Norman R.; Kanbur, Shashi M.

    1994-01-01

    Linear pulsation calculations are employed to reproduce the bump Cepheid resonance (P(sub 2)/P(sub 0) = 0.5 at P(sub 0) approximately equal to 10 days) and to model, individually, the P(sub 1)/P(sub 0) period ratios for the dozen known Galactic beat Cepheids. Convection is ignored. The results point to a range of metallicity among the Cepheids, perhaps as large as 0.01 approximately less than Z approximately less than 0.02, with no evidence for any star exceeding Z = 0.02. We find masses and luminosities which range from M approximately less than 4 solar mass, log(base 10) L approximately less than 3.0 at P(sub 0) approximately equal to 3 days to M approximately less than 6 solar mass, log(base 10) L approximately greater than 3.5 at P(sub 0) approximately equal to 10 days. Similar parameters are indicated for the P(sub 0) approximately equal to 10 days Cepheids in the LMC and SMC, provided that the resonance for these stars occurs at a slightly longer period, as has been suggested in the literature. Our calculations were performed mainly using OPAL opacities, but also with new opacities from the Opacity Project (OP). Only small differences were found between the OPAL results and those from OP. Finally, some suggestions are made for possible future work, including evolution and pulsation calculations, and more precise observations of Cepheids in the Magellanic Clouds.

  2. Efficacy of acidified sodium chlorite treatments in reducing Escherichia coli O157:H7 on Chinese cabbage.

    PubMed

    Inatsu, Yasuhiro; Bari, Md Latiful; Kawasaki, Susumu; Isshiki, Kenji; Kawamoto, Shinichi

    2005-02-01

    The efficacy of acidified sodium chlorite for reducing the population of Escherichia coli O157:H7 on Chinese cabbage leaves was evaluated. Washing leaves with distilled water could reduce the population of E. coli O157:H7 by approximately 1.0 log CFU/g, whereas treating with acidified sodium chlorite solution could reduce the population by 3.0 log CFU/g without changing the leaf color. A similar level of reduction was achieved by washing with sodium chlorite solution containing various organic acids. However, acidified sodium chlorite in combination with a mild heat treatment reduced the population by approximately 4.0 log CFU/g without affecting the color, but it softened the leaves. Moreover, the efficacy of the washing treatment was similar at low (4 degrees C) and room (25 degrees C) temperatures, indicating that acidified sodium chlorite solution could be useful as a sanitizer for surface washing of fresh produce.

  3. Self assembly of rectangular shapes on concentration programming and probabilistic tile assembly models

    PubMed Central

    Rajasekaran, Sanguthevar

    2013-01-01

    Efficient tile sets for self-assembling rectilinear shapes are of critical importance in algorithmic self-assembly. A lower bound on the tile complexity of any deterministic self-assembly system for an n × n square is Ω(log(n)/log(log(n))) (inferred from the Kolmogorov complexity). Deterministic self-assembly systems with an optimal tile complexity have been designed for squares and related shapes in the past. However, designing Θ(log(n)/log(log(n))) unique tiles specific to a shape is still an intensive task in the laboratory. On the other hand, copies of a tile can be made rapidly using PCR (polymerase chain reaction) experiments. This led to the study of self-assembly on tile concentration programming models. We present two major results in this paper on the concentration programming model. First we show how to self-assemble rectangles with a fixed aspect ratio (α:β), with high probability, using Θ(α + β) tiles. This result is much stronger than the existing results by Kao et al. (Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008) and Doty (Randomized self-assembly for exact shapes. In: proceedings of the 50th annual IEEE symposium on foundations of computer science (FOCS), IEEE, Atlanta, pp 85-94, 2009)—which can only self-assemble squares and rely on tiles which perform binary arithmetic. Our result is instead based on a technique called staircase sampling. This technique eliminates the need for sub-tiles which perform binary arithmetic, reduces the constant in the asymptotic bound, and eliminates the need for approximate frames (Kao et al. Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008). Our second result applies staircase sampling on the equimolar concentration programming model (The tile complexity of linear assemblies. In: proceedings of the 36th international colloquium on automata, languages and programming: Part I, ICALP '09, Springer-Verlag, pp 235-253, 2009), to self-assemble rectangles (of fixed aspect ratio) with high probability. The tile complexity of our algorithm is Θ(log(n)) and is optimal on the probabilistic tile assembly model (PTAM)—n being an upper bound on the dimensions of a rectangle. PMID:24311993

  4. Structural lumber laminated from 1/4-inch rotary-peeled southern pine veneer

    Treesearch

    Peter Koch

    1972-01-01

    By the lamination process evaluated, 60 percent of total log volume ended as kiln-dry, end-trimmed, sized, salable 2 by 4's - approximately 50 percent more than that achieved by conventional bandsawing of matched logs. Moreover, modulus of elasticity of the laminated 2 by 4's (adjusted to 12 percent moisture content) averaged 1,950,000 psi compared to 1,790,...

  5. Development of Stable, Low-Resistance Solder Joints for Space-Flight HTS Lead Assemblies

    NASA Technical Reports Server (NTRS)

    Canavan, Edgar R.; Chiao, Meng; Panashchenko, Lyudmyla; Sampson, Michael

    2017-01-01

    The solder joints in spaceflight high temperature superconductor (HTS) lead assemblies for certain astrophysics missions have strict constraints on size and power dissipation. In addition, the joints must tolerate years of storage at room temperature, many thermal cycles, and several vibration tests between their manufacture and their final operation on orbit. As reported previously, solder joints between REBCO coated conductors and normal metal traces for the Astro-H mission showed low-temperature joint resistance that grew approximately as log time over the course of months. Although the assemblies worked without issue in orbit, for the upcoming X-ray Astrophysics Recovery Mission we are attempting to improve our solder process to give lower, more stable, and more consistent joint resistance. We produce numerous sample joints and measure time- and thermal-cycle-dependent resistance, and characterize the joints using x-ray and other analysis tools. For a subset of the joints, we use SEM-EDS to try to understand the physical and chemical processes that affect joint behavior.

  6. Towards next-to-next-to-leading-log accuracy for the width difference in the B_s-B̄_s system: fermionic contributions to order (m_c/m_b)^0 and (m_c/m_b)^1

    NASA Astrophysics Data System (ADS)

    Asatrian, H. M.; Hovhannisyan, A.; Nierste, U.; Yeghiazaryan, A.

    2017-10-01

    We calculate a class of three-loop Feynman diagrams which contribute to the next-to-next-to-leading logarithmic approximation for the width difference ΔΓ_s in the B_s-B̄_s system. The considered diagrams contain a closed fermion loop in a gluon propagator and constitute the order α_s^2 N_f, where N_f is the number of light quarks. Our results entail a considerable correction in that order, if ΔΓ_s is expressed in terms of the pole mass of the bottom quark. If the MS-bar scheme is used instead, the correction is much smaller. As a result, we find a decrease of the scheme dependence. Our result also indicates that the usually quoted value of the NLO renormalization scale dependence underestimates the perturbative error.

  7. Spatial resolution properties of motion-compensated tomographic image reconstruction methods.

    PubMed

    Chun, Se Young; Fessler, Jeffrey A

    2012-07-01

    Many motion-compensated image reconstruction (MCIR) methods have been proposed to correct for subject motion in medical imaging. MCIR methods incorporate motion models to improve image quality by reducing motion artifacts and noise. This paper analyzes the spatial resolution properties of MCIR methods and shows that nonrigid local motion can lead to nonuniform and anisotropic spatial resolution for conventional quadratic regularizers. This undesirable property is akin to the known effects of interactions between heteroscedastic log-likelihoods (e.g., Poisson likelihood) and quadratic regularizers. This effect may lead to quantification errors in small or narrow structures (such as small lesions or rings) of reconstructed images. This paper proposes novel spatial regularization design methods for three different MCIR methods that account for known nonrigid motion. We develop MCIR regularization designs that provide approximately uniform and isotropic spatial resolution and that match a user-specified target spatial resolution. Two-dimensional PET simulations demonstrate the performance and benefits of the proposed spatial regularization design methods.

  8. Estimation of norovirus infection risks to consumers of wastewater-irrigated food crops eaten raw.

    PubMed

    Mara, Duncan; Sleigh, Andrew

    2010-03-01

    A quantitative microbial risk analysis-Monte Carlo method was used to estimate norovirus infection risks to consumers of wastewater-irrigated lettuce. Using the same assumptions as used in the 2006 WHO guidelines for the safe use of wastewater in agriculture, a norovirus reduction of 6 log units was required to achieve a norovirus infection risk of approximately 10^-3 per person per year (pppy), but for a lower consumption of lettuce (40-48 g per week vs. 350 g per week) the required reduction was 5 log units. If the tolerable additional disease burden is increased from a DALY (disability-adjusted life year) loss of 10^-6 pppy (the value used in the WHO guidelines) to 10^-5 pppy, the required pathogen reduction is one order of magnitude lower. Reductions of 4-6 log units can be achieved by very simple partial treatment (principally settling, to achieve a 1-log unit reduction) supplemented by very reliable post-treatment health-protection control measures such as pathogen die-off (1-2 log units), produce washing in cold water (1 log unit) and produce disinfection (3 log units).
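
    The QMRA-Monte Carlo chain summarized above (concentration, log reduction, dose, per-serving risk, annual risk) can be sketched as follows; every numerical value below, including the beta-Poisson dose-response constants, is an illustrative placeholder rather than an input of the paper or of the WHO guidelines:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        virus_per_L   = 10 ** rng.normal(5.0, 1.0, n)  # norovirus in raw wastewater (hypothetical)
        log_reduction = 6.0                            # total treatment + post-treatment reduction
        water_on_crop = 0.011                          # L of water retained per serving (hypothetical)
        servings_yr   = 52                             # one serving per week

        dose   = virus_per_L * 10**(-log_reduction) * water_on_crop
        a, b   = 0.1, 10.0                             # hypothetical beta-Poisson parameters
        p_meal = 1.0 - (1.0 + dose / b) ** (-a)        # per-serving infection probability
        p_year = 1.0 - (1.0 - p_meal) ** servings_yr   # annualized risk per consumer

        print(f"median annual infection risk: {np.median(p_year):.2e}")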

  9. Integrating Multiple Subsurface Exploration Technologies in Slope Hydrogeologic Investigation: A Case Study in Taiwan

    NASA Astrophysics Data System (ADS)

    Lo, H.-C.; Hsu, S.-M.; Jeng, D.-I.; Ku, C.-Y.

    2009-04-01

    Taiwan is an island located at a tectonically active collision zone between the Eurasian Plate and the Pacific Plate. The island also lies in the subtropical climate region, with frequent typhoon events that are always accompanied by intense rainfall within a short period of time. These seismic and climatic elements frequently trigger, directly or indirectly, natural disasters such as landslides, with casualties and property damage. Prompted by the urge to minimize the detrimental effects of such natural disasters, the Taiwan government has initiated and funded a series of investigations and studies aimed at better understanding the causes of these natural disasters, which may lead to the formulation of more effective disaster contingency plans and possibly forecasting systems. The hydrogeology of a landslide site can help unveil the detention of storm water entering the aquifer system of the slope, as well as its groundwater condition, which in turn plays a critical role in slope stability. In this study, a hydrogeologic investigation employing a series of subsurface exploration technologies was conducted at an active landslide site in the vicinity of Hwa Yuan Village in northern Taiwan. The site, which covers an area of approximately 0.14 km2 (35 acres) and generally ranges between 25 and 36 degrees in slope, was initially investigated with ground resistivity image profiling (RIP) and electrical logging in order to determine the lithology and possibly the water-bearing capacity of the geologic units beneath the slope surface. Subsequently, both acoustic and optical borehole logging were applied to identify potentially significant fracture features at depth and their hydrogeologic implications. In addition, flowmeter logging and hydraulic packer tests were conducted to further characterize the hydrogeologic system of the site and to quantitatively determine the hydraulic properties of major hydrogeologic units. According to the ground resistivity profiles combined with rock core data, the geologic units can be primarily categorized into colluvium and weathered rock at depths of 4-23 m and 23-80 m, respectively. An approximately 20 m shear zone at depths of 45-65 m was found, based on a detected zone of low electrical resistivity. According to the borehole electrical logging, sandstone layers were identified in the intervals of 48-59 m and 68.5-74 m and showed low water-bearing capacity, while a shale layer was identified in the interval of 59-68.5 m, which possessed a high water-bearing capacity. The velocity profile along the borehole was obtained from the flowmeter logging. A relatively high velocity zone (1.36-2.23 m/min) was measured in the sandstone intervals and a relatively low velocity zone (0.12-0.78 m/min) was measured in the shale interval, similar to what was found in the electrical logging. Moreover, 198 discontinuity planes were identified from the borehole image logging. The orientations of all discontinuities were calculated and compiled to draw a stereographic projection diagram. Judging from the discontinuity clusters on the stereographic projection diagram, a plane failure may possibly occur based on Hoek and Brown's criteria. This is a good demonstration that slope failure geometry and type can be determined by stereographic projection analysis. The borehole images also clearly showed the structures of discontinuities at depth. They not only helped to characterize the results of the above investigation technologies but also provided useful guidance in selecting specific geologic intervals for packer tests. The packer tests were conducted on intervals isolated based on the results of the borehole and flowmeter logging. They indicated that the hydraulic conductivities of the shale and sandstone intervals are respectively 1.37×10^-8 m/sec and 2.68×10^-5 to 3.76×10^-5 m/sec, in good accordance with the hydraulic characteristics inferred from flowmeter logging. The aforementioned investigation results, including the geologic units and water-bearing capacity categorized by RIP and electrical logging, the velocity and hydraulic conductivity obtained from flowmeter logging and packer tests, and the discontinuity structures recorded by borehole image logging, were used to clarify the complexity of the subsurface environment and to establish the hydrogeologic conceptual model of the landslide site.

  10. Attenuation of ground-motion spectral amplitudes in southeastern Australia

    USGS Publications Warehouse

    Allen, T.I.; Cummins, P.R.; Dhu, T.; Schneider, J.F.

    2007-01-01

    A dataset comprising some 1200 weak- and strong-motion records from 84 earthquakes is compiled to develop a regional ground-motion model for southeastern Australia (SEA). Events were recorded from 1993 to 2004 and range in size from moment magnitude 2.0 ≤ M ≤ 4.7. The decay of vertical-component Fourier spectral amplitudes is modeled by trilinear geometrical spreading. The decay of low-frequency spectral amplitudes can be approximated by the coefficient of R^-1.3 (where R is hypocentral distance) within 90 km of the seismic source. From approximately 90 to 160 km, we observe a transition zone in which the seismic coda are affected by postcritical reflections from midcrustal and Moho discontinuities. In this hypocentral distance range, geometrical spreading is approximately R^+0.1. Beyond 160 km, low-frequency seismic energy attenuates rapidly with source-receiver distance, having a geometrical spreading coefficient of R^-1.6. The associated regional seismic-quality factor can be expressed by the polynomial log Q(f) = 3.66 - 1.44 log f + 0.768 (log f)^2 + 0.058 (log f)^3 for frequencies 0.78 ≤ f ≤ 19.9 Hz. Fourier spectral amplitudes, corrected for geometrical spreading and anelastic attenuation, are regressed with M to obtain quadratic source scaling coefficients. Modeled vertical-component displacement spectra fit the observed data well. Amplitude residuals are, on average, relatively small and do not vary with hypocentral distance. Predicted source spectra (i.e., at R = 1 km) are consistent with eastern North American (ENA) models at low frequencies (f less than approximately 2 Hz), indicating that moment magnitudes calculated for SEA earthquakes are consistent with moment magnitude scales used in ENA over the observed magnitude range. The models presented represent the first spectral ground-motion prediction equations developed for the southeastern Australian region. This work provides a useful framework for the development of regional ground-motion relations for earthquake hazard and risk assessment in SEA.
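
    The quality-factor polynomial and the trilinear geometrical spreading quoted above translate directly into code; note that the continuity normalization at the 90 km and 160 km hinge points is an assumption of this sketch:

        import math

        def q_factor(f_hz):
            # log Q(f) = 3.66 - 1.44 log f + 0.768 (log f)^2 + 0.058 (log f)^3,
            # valid for 0.78 <= f <= 19.9 Hz.
            lf = math.log10(f_hz)
            return 10 ** (3.66 - 1.44 * lf + 0.768 * lf**2 + 0.058 * lf**3)

        def geometric_spreading(R_km):
            # Trilinear spreading: R^-1.3 within 90 km, R^+0.1 from 90 to 160 km,
            # R^-1.6 beyond 160 km; segments joined continuously (our choice).
            if R_km <= 90.0:
                return R_km ** -1.3
            g90 = 90.0 ** -1.3
            if R_km <= 160.0:
                return g90 * (R_km / 90.0) ** 0.1
            g160 = g90 * (160.0 / 90.0) ** 0.1
            return g160 * (R_km / 160.0) ** -1.6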

  11. Iowa Saw-Log Production and Sawmill Industry, 1969

    Treesearch

    James E. Blyth

    1972-01-01

    Iowa loggers harvested nearly 47 million board feet of saw logs in 1969. Leading species were soft maple, elm, red oak, and cottonwood. Three-fifths of the wood residue generated at 63 Iowa sawmills was not used.

  12. Comparison of formation and fluid-column logs in a heterogeneous basalt aquifer

    USGS Publications Warehouse

    Paillet, F.L.; Williams, J.H.; Oki, D.S.; Knutson, K.D.

    2002-01-01

    Deep observation boreholes in the vicinity of active production wells in Honolulu, Hawaii, exhibit the anomalous condition that fluid-column electrical conductivity logs and apparent profiles of pore-water electrical conductivity derived from induction conductivity logs are nearly identical if a formation factor of 12.5 is assumed. This condition is documented in three boreholes where fluid-column logs clearly indicate the presence of strong borehole flow induced by withdrawal from partially penetrating water-supply wells. This result appears to contradict the basic principles of conductivity-log interpretation. Flow conditions in one of these boreholes were investigated in detail by obtaining flow profiles under two water production conditions using the electromagnetic flowmeter. The flow-log interpretation demonstrates that the fluid-column log resembles the induction log because the amount of inflow to the borehole increases systematically upward through the transition zone between deeper salt water and shallower fresh water. This condition allows the properties of the fluid column to approximate the properties of water entering the borehole as soon as the upflow stream encounters that producing zone. Because this condition occurs in all three boreholes investigated, the similarity of induction and fluid-column logs is probably not a coincidence, and may relate to aquifer response under the influence of pumping from production wells.

  14. Opportunities to use bark polyphenols in specialty chemical markets

    Treesearch

    Richard W. Hemingway

    1998-01-01

    Current forestry practice in North America is to transport pulpwood and logs from the harvest site to the mill with the bark on the wood. Approximately 18 percent of the weight of logs from conifers such as southern pine is bark. The majority of this bark is burned as hog fuel, but its fuel value is low. When compared with natural gas at an average of $2.50/MBTU or...

  15. Analytical expressions for the log-amplitude correlation function of a plane wave through anisotropic atmospheric refractive turbulence.

    PubMed

    Gudimetla, V S Rao; Holmes, Richard B; Smith, Carey; Needham, Gregory

    2012-05-01

    The effect of anisotropic Kolmogorov turbulence on the log-amplitude correlation function for plane-wave fields is investigated using analysis, numerical integration, and simulation. A new analytical expression for the log-amplitude correlation function is derived for anisotropic Kolmogorov turbulence. The analytic results, based on the Rytov approximation, agree well with a more general wave-optics simulation based on the Fresnel approximation as well as with numerical evaluations, for low and moderate strengths of turbulence. The new expression reduces correctly to previously published analytic expressions for isotropic turbulence. The final results indicate that, as asymmetry becomes greater, the Rytov variance deviates from that given by the standard formula. This deviation becomes greater with stronger turbulence, up to moderate turbulence strengths. The anisotropic effects on the log-amplitude correlation function are dominant when the separation of the points is within the Fresnel length. In the direction of stronger turbulence, there is an enhanced dip in the correlation function at a separation close to the Fresnel length. The dip is diminished in the weak-turbulence axis, suggesting that energy redistribution via focusing and defocusing is dominated by the strong-turbulence axis. The new analytical expression is useful when anisotropy is observed in relevant experiments. © 2012 Optical Society of America
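
    For context, the isotropic weak-turbulence baseline against which such anisotropic results are usually compared is the plane-wave log-amplitude variance σ_χ² = 0.307 C_n² k^(7/6) L^(11/6). A sketch of that baseline only (Python; the paper's anisotropic expression is not reproduced here):

        import math

        def log_amplitude_variance(Cn2, wavelength_m, path_m):
            # Plane-wave log-amplitude (Rytov) variance for isotropic Kolmogorov
            # turbulence in the weak-fluctuation regime:
            #   sigma_chi^2 = 0.307 * Cn^2 * k^(7/6) * L^(11/6)
            k = 2.0 * math.pi / wavelength_m
            return 0.307 * Cn2 * k ** (7.0 / 6.0) * path_m ** (11.0 / 6.0)

        print(log_amplitude_variance(Cn2=1e-15, wavelength_m=1.55e-6, path_m=2000.0))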

  16. Log Truck-Weighing System

    NASA Technical Reports Server (NTRS)

    1977-01-01

    ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner. ELDEC then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000 but it pays for itself in a single year. It allows operators to use a truck's hauling capacity more efficiently since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.

  17. Geophysical Log Database for the Mississippi Embayment Regional Aquifer Study (MERAS)

    USGS Publications Warehouse

    Hart, Rheannon M.; Clark, Brian R.

    2008-01-01

    The Mississippi Embayment Regional Aquifer Study (MERAS) is an investigation of ground-water availability and sustainability within the Mississippi embayment as part of the U.S. Geological Survey Ground-Water Resources Program. The MERAS area consists of approximately 70,000 square miles and encompasses parts of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. More than 2,600 geophysical logs of test holes and wells within the MERAS area were compiled into a database and were used to develop a digital hydrogeologic framework from land surface to the top of the Midway Group of upper Paleocene age. The purpose of this report is to document, present, and summarize the geophysical log database, as well as to preserve the geophysical logs in a digital image format for online access.

  18. Incidence and behavior of Salmonella and Escherichia coli on whole and sliced zucchini squash (Cucurbita pepo) fruit.

    PubMed

    Castro-Rosas, Javier; Santos López, Eva María; Gómez-Aldapa, Carlos Alberto; González Ramírez, Cesar Abelardo; Villagomez-Ibarra, José Roberto; Gordillo-Martínez, Alberto José; López, Angélica Villarruel; del Refugio Torres-Vitela, M

    2010-08-01

    The incidence of coliform bacteria (CB), thermotolerant coliforms (TC), Escherichia coli, and Salmonella was determined for zucchini squash fruit. In addition, the behavior of four serotypes of Salmonella and a cocktail of three E. coli strains on whole and sliced zucchini squash at 25+/-2 degrees C and 3 to 5 degrees C was tested. Squash fruit was collected in the markets of Pachuca city, Hidalgo State, Mexico. CB, TC, E. coli, and Salmonella were detected in 100, 70, 62, and 10% of the produce, respectively. The concentration ranged from 3.8 to 7.4 log CFU per sample for CB, and >3 to 1,100 most probable number per sample for TC and E. coli. On whole fruit stored at 25+/-2 degrees C or 3 to 5 degrees C, no growth was observed for any of the tested microorganisms or cocktails thereof. After 15 days at 25+/-2 degrees C, the tested Salmonella serotypes had decreased from an initial inoculum level of 7 log CFU to <1 log, and at 3 to 5 degrees C they decreased to approximately 2 log. Survival of E. coli was significantly greater than for the Salmonella strains at the same times and temperatures; after 15 days, at 25+/-2 degrees C E. coli cocktail strains had decreased to 3.4 log CFU per fruit and at 3 to 5 degrees C they decreased to 3.6 log CFU per fruit. Both the Salmonella serotypes and E. coli strains grew when inoculated onto sliced squash: after 24 h at 25+/-2 degrees C, both bacteria had grown to approximately 6.5 log CFU per slice. At 3 to 5 degrees C, the bacterial growth was inhibited. The squash may be an important factor contributing to the endemicity of Salmonella in Mexico.

  19. Distribution and characterization of in-channel large wood in relation to geomorphic patterns on a low-gradient river

    USGS Publications Warehouse

    Moulin, Bertrand; Schenk, Edward R.; Hupp, Cliff R.

    2011-01-01

    A 177 river km georeferenced aerial survey of in-channel large wood (LW) on the lower Roanoke River, NC was conducted to determine LW dynamics and distributions on an eastern USA low-gradient large river. Results indicate a system with approximately 75% of the LW available for transport either as detached individual LW or as LW in log jams. There were approximately 55 individual LW per river km and another 59 pieces in log jams per river km. Individual LW is a product of bank erosion (73% is produced through erosion) and is isolated on the mid and upper banks at low flow. This LW does not appear to be important for either aquatic habitat or as a human risk. Log jams rest near or at water level, making them a factor in bank complexity in an otherwise homogenous fine-grained channel. A segmentation test was performed using LW frequency by river km to detect breaks in longitudinal distribution and to define homogeneous reaches of LW frequency. Homogeneous reaches were then analyzed to determine their relationship to bank height, channel width/depth, sinuosity, and gradient. Results show that log jams are a product of LW transport and occur more frequently in areas with high snag concentrations, low to intermediate bank heights, high sinuosity, high local LW recruitment rates, and narrow channel widths. The largest concentration of log jams (21.5 log jams/km) occurs in an actively eroding reach. Log jam concentrations downstream of this reach are lower due to a loss of river competency as the channel reaches sea level and the concurrent development of unvegetated mudflats separating the active channel from the floodplain forest. Substantial LW transport occurs on this low-gradient, dam-regulated large river; this study, paired with future research on transport mechanisms, should provide resource managers and policymakers with options to better manage aquatic habitat while mitigating possible negative impacts to human interests.

  20. Log analysis of six boreholes in conjunction with geologic characterization above and on top of the Weeks Island salt dome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattler, A.R.

    1996-04-01

    Six boreholes were drilled during the geologic characterization and diagnostics of the Weeks Island sinkhole that is over the two-tiered salt mine which was converted for oil storage by the US Strategic Petroleum Reserve. These holes were drilled to provide for geologic characterization of the Weeks Island Salt Dome and its overburden in the immediate vicinity of the sinkhole (mainly through logs and core); to establish a crosswell configuration for seismic tomography; to establish locations for hydrocarbon detection and tracer injection; and to provide direct observations of sinkhole geometry and material properties. Specific objectives of the logging program were to: (1) identify the top of and the physical state of the salt dome; (2) identify the water table; (3) obtain a relative salinity profile in the aquifer within the alluvium, which ranges from the water table directly to the top of the Weeks Island salt dome; and (4) identify a reflecting horizon seen on seismic profiles over this salt dome. Natural gamma, neutron, density, sonic, resistivity and caliper logs were run. Neutron and density logs were run from inside the well casing because of the extremely unstable condition of the deltaic alluvium overburden above the salt dome. The logging program provided important information about the salt dome and the overburden in that (1) the top of the salt dome was identified at approximately 189 ft bgl (103 ft msl), and the top of the dome contains relatively few fractures; (2) the water table is approximately 1 ft msl; (3) this aquifer appears to become steadily more saline with depth; and (4) the water saturation of much of the alluvium over the salt dome is shown to be influenced by the prevalent heavy rainfall. This logging program, a part of the sinkhole diagnostics, provides unique information about this salt dome and the overburden.

  1. Study of Magnitudes, Seismicity and Earthquake Detectability Using a Global Network

    DTIC Science & Technology

    1984-06-01

    The stations are then further classified into three groups: (A) stations reporting a P-detection with an associated log(A/T) value ... detections, nondetections and reported log(A/T) values for the j'th event, given that its true magnitude is mu_j ... the subset for which log(A/T) was reported. As a first order approximation ...

  2. Enhanced Removal of a Human Norovirus Surrogate from Fresh Vegetables and Fruits by a Combination of Surfactants and Sanitizers

    PubMed Central

    Predmore, Ashley; Li, Jianrong

    2011-01-01

    Fruits and vegetables are major vehicles for transmission of food-borne enteric viruses since they are easily contaminated at pre- and postharvest stages and they undergo little or no processing. However, commonly used sanitizers are relatively ineffective for removing human norovirus surrogates from fresh produce. In this study, we systematically evaluated the effectiveness of surfactants on removal of a human norovirus surrogate, murine norovirus 1 (MNV-1), from fresh produce. We showed that a panel of surfactants, including sodium dodecyl sulfate (SDS), Nonidet P-40 (NP-40), Triton X-100, and polysorbates, significantly enhanced the removal of viruses from fresh fruits and vegetables. While tap water alone and chlorine solution (200 ppm) gave only <1.2-log reductions in virus titer in all fresh produce, a solution containing 50 ppm of surfactant was able to achieve a 3-log reduction in virus titer in strawberries and an approximately 2-log reduction in virus titer in lettuce, cabbage, and raspberries. Moreover, a reduction of approximately 3 logs was observed in all the tested fresh produce after sanitization with a solution containing a combination of 50 ppm of each surfactant and 200 ppm of chlorine. Taken together, our results demonstrate that the combination of a surfactant with a commonly used sanitizer enhanced the efficiency in removing viruses from fresh produce by approximately 100 times. Since SDS is an FDA-approved food additive and polysorbates are recognized by the FDA as GRAS (generally recognized as safe) products, implementation of this novel sanitization strategy would be a feasible approach for efficient reduction of the virus load in fresh produce. PMID:21622782
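
    The equivalence drawn above between a roughly 2-log extra reduction and an approximately 100-fold improvement is just the definition of the log10 reduction, as this small sketch makes explicit (illustrative counts only):

        import math

        def log_reduction(n0, n):
            # Log10 reduction from initial titer n0 to final titer n.
            return math.log10(n0 / n)

        print(log_reduction(1e6, 1e3))  # a 3-log reduction, i.e. a 1000-fold drop
        print(10 ** 2)                  # a 2-log gain corresponds to a 100-fold improvement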

  3. Logging damage to residual trees following partial cutting in a green ash-sugarberry stand in the Mississippi Delta

    Treesearch

    James S. Meadows

    1993-01-01

    Partial cutting in bottomland hardwoods to control stand density and species composition sometimes results in logging damage to the lower bole and/or roots of residual trees. If severe, logging damage may lead to a decline in tree vigor, which may subsequently stimulate the production of epicormic branches, causing a decrease in bole quality and an eventual loss in...

  5. Investigating the Metallicity–Mixing-length Relation

    NASA Astrophysics Data System (ADS)

    Viani, Lucas S.; Basu, Sarbani; Joel Ong J., M.; Bonaca, Ana; Chaplin, William J.

    2018-05-01

    Stellar models typically use the mixing-length approximation as a way to implement convection in a simplified manner. While conventionally the value of the mixing-length parameter, α, used is the solar-calibrated value, many studies have shown that other values of α are needed to properly model stars. This uncertainty in the value of the mixing-length parameter is a major source of error in stellar models and isochrones. Using asteroseismic data, we determine the value of the mixing-length parameter required to properly model a set of about 450 stars ranging in log g, T_eff, and [Fe/H]. The relationship between the value of α required and the properties of the star is then investigated. For Eddington atmosphere, non-diffusion models, we find that the value of α can be approximated by a linear model of the form α/α_⊙ = 5.426 - 0.101 log(g) - 1.071 log(T_eff) + 0.437 [Fe/H]. This process is repeated using a variety of model physics, as well as compared with previous studies and results from 3D convective simulations.
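
    The fitted relation quoted above is straightforward to evaluate; a sketch (Python), with near-solar parameters used only as a rough plausibility check of the fit:

        from math import log10

        def alpha_ratio(log_g, teff_K, fe_h):
            # alpha/alpha_sun = 5.426 - 0.101 log(g) - 1.071 log(Teff) + 0.437 [Fe/H]
            # (Eddington atmosphere, non-diffusion models, per the abstract).
            return 5.426 - 0.101 * log_g - 1.071 * log10(teff_K) + 0.437 * fe_h

        # Near solar parameters (log g ~ 4.44, Teff ~ 5772 K, [Fe/H] = 0) the
        # ratio should come out close to unity:
        print(alpha_ratio(4.44, 5772.0, 0.0))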

  6. Removal of indigenous coliphages and enteric viruses during riverbank filtration from highly polluted river water in Delhi (India).

    PubMed

    Sprenger, C; Lorenzen, G; Grunert, A; Ronghang, M; Dizer, H; Selinka, H-C; Girones, R; Lopez-Pila, J M; Mittal, A K; Szewzyk, R

    2014-06-01

    Emerging countries frequently afflicted by waterborne diseases require safe and cost-efficient production of drinking water, a task that is becoming more challenging as many rivers carry a high degree of pollution. A study was conducted on the banks of the Yamuna River, Delhi, India, to ascertain if riverbank filtration (RBF) can significantly improve the quality of the highly polluted surface water in terms of virus removal (coliphages, enteric viruses). Human adenoviruses and noroviruses, both present in the Yamuna River in the range of 10^5 genomes/100 mL, were undetectable after 50 m infiltration and approximately 119 days of underground passage. Indigenous somatic coliphages, used as surrogates of human pathogenic viruses, underwent approximately 5 log10 removal after only 3.8 m of RBF. The initial removal after 1 m was 3.3 log10, and the removal between 1 and 2.4 m and between 2.4 and 3.8 m was 0.7 log10 each. RBF is therefore an excellent candidate to improve the water situation in emerging countries with respect to virus removal.

  7. On the origins of logarithmic number-to-position mapping.

    PubMed

    Dotan, Dror; Dehaene, Stanislas

    2016-11-01

    The number-to-position task, in which children and adults are asked to place numbers on a spatial number line, has become a classic measure of number comprehension. We present a detailed experimental and theoretical dissection of the processing stages that underlie this task. We used a continuous finger-tracking technique, which provides detailed information about the time course of processing stages. When adults map the position of 2-digit numbers onto a line, their final mapping is essentially linear, but intermediate finger locations show a transient logarithmic mapping. We identify the origins of this log effect: small numbers are processed faster than large numbers, so the finger deviates toward the target position earlier for small numbers than for large numbers. When the trajectories are aligned on the finger deviation onset, the log effect disappears. The small-number advantage and the log effect are enhanced in a dual-task setting and are further enhanced when the delay between the 2 tasks is shortened, suggesting that these effects originate from a central stage of quantification and decision making. We also report cases of logarithmic mapping, by children and by a brain-injured individual, which cannot be explained by faster responding to small numbers. We show that these findings are captured by an ideal-observer model of the number-to-position mapping task, comprising 3 distinct stages: a quantification stage, whose duration is influenced by both exact and approximate representations of numerical quantity; a Bayesian accumulation-of-evidence stage, leading to a decision about the target location; and a pointing stage. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. Impact of logging on aboveground biomass stocks in lowland rain forest, Papua New Guinea.

    PubMed

    Bryan, Jane; Shearman, Phil; Ash, Julian; Kirkpatrick, J B

    2010-12-01

    Greenhouse-gas emissions resulting from logging are poorly quantified across the tropics. There is a need for robust measurement of rain forest biomass and the impacts of logging from which carbon losses can be reliably estimated at regional and global scales. We used a modified Bitterlich plotless technique to measure aboveground live biomass at six unlogged and six logged rain forest areas (coupes) across two approximately 3000-ha regions at the Makapa concession in lowland Papua New Guinea. "Reduced-impact logging" is practiced at Makapa. We found the mean unlogged aboveground biomass in the two regions to be 192.96 +/- 4.44 Mg/ha and 252.92 +/- 7.00 Mg/ha (mean +/- SE), which was reduced by logging to 146.92 +/- 4.58 Mg/ha and 158.84 +/- 4.16, respectively. Killed biomass was not a fixed proportion, but varied with unlogged biomass, with 24% killed in the lower-biomass region, and 37% in the higher-biomass region. Across the two regions logging resulted in a mean aboveground carbon loss of 35 +/- 2.8 Mg/ha. The plotless technique proved efficient at estimating mean aboveground biomass and logging damage. We conclude that substantial bias is likely to occur within biomass estimates derived from single unreplicated plots.

  9. Control of Listeria monocytogenes growth in soft cheeses by bacteriophage P100.

    PubMed

    Silva, Elaine Nóbrega Gibson; Figueiredo, Ana Cláudia Leite; Miranda, Fernanda Araújo; de Castro Almeida, Rogeria Comastri

    2014-01-01

    The purpose of this study was to determine the effect of bacteriophage P100 on strains of Listeria monocytogenes in artificially inoculated soft cheeses. A mix of L. monocytogenes 1/2a and Scott A was inoculated in Minas Frescal and Coalho cheeses (approximately 10^5 cfu/g) with the bacteriophage added thereafter (8.3 × 10^7 PFU/g). Samples were analyzed immediately, and then stored at 10 °C for seven days. At time zero, 30 min post-infection, the bacteriophage P100 reduced L. monocytogenes counts by 2.3 log units in Minas Frescal cheese and by 2.1 log units in Coalho cheese, compared to controls without bacteriophage. However, in samples stored under refrigeration for seven days, the bacteriophage P100 was only weakly antilisterial, with the lowest decimal reduction (DR) for the cheeses: 1.0 log unit for Minas Frescal and 0.8 log units for Coalho cheese. The treatment produced a statistically significant decrease in the counts of viable cells (p < 0.05) and in all assays performed, we observed an increase of approximately one log cycle in the number of viable cells of L. monocytogenes in the samples under refrigeration for seven days. Moreover, a smaller effect of phages was observed. These results, along with other published data, indicate that the effectiveness of the phage treatment depends on the initial concentration of L. monocytogenes, and that a high concentration of phages per unit area is required to ensure sustained inactivation of target pathogens on food surfaces.

  10. Sanitizing in Dry-Processing Environments Using Isopropyl Alcohol Quaternary Ammonium Formula.

    PubMed

    Kane, Deborah M; Getty, Kelly J K; Mayer, Brian; Mazzotta, Alejandro

    2016-01-01

    Dry-processing environments are particularly challenging to clean and sanitize because introduced water can favor growth and establishment of pathogenic microorganisms such as Salmonella. Our objective was to determine the efficacy of an isopropyl alcohol quaternary ammonium (IPAQuat) formula for eliminating potential Salmonella contamination on food contact surfaces. Clean stainless steel coupons and conveyor belt materials used in dry-processing environments were spot inoculated in the center of coupons (5 by 5 cm) with a six-serotype composite of Salmonella (approximately 10 log CFU/ml), subjected to IPAQuat sanitizer treatments with exposure times of 30 s, 1 min, or 5 min, and then swabbed for enumeration of posttreatment survivors. A subset of inoculated surfaces was soiled with a breadcrumb-flour blend and allowed to sit on the laboratory bench for a minimum of 16 h before sanitation. Pretreatment Salmonella populations (inoculated controls, 0 s treatment) were approximately 7.0 log CFU/25 cm^2, and posttreatment survivors were 1.31, 0.72, and <0.7 (detection limit) log CFU/25 cm^2 after sanitizer exposure for 30 s, 1 min, or 5 min, respectively, for both clean (no added soil) and soiled surfaces. Treatment with the IPAQuat formula using 30-s sanitizer exposures resulted in 5.68-log reductions, whereas >6.0-log reductions were observed for sanitizer exposures of 1 and 5 min. Because water is not introduced into the processing environment with this approach, the IPAQuat formula could have sanitation applications in dry-processing environments to eliminate potential contamination from Salmonella on food contact surfaces.

  11. The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space

    PubMed Central

    Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri

    2015-01-01

    When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes—phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass. PMID:26465336

  12. Multiphase gas in quasar absorption-line systems

    NASA Technical Reports Server (NTRS)

    Giroux, Mark L.; Sutherland, Ralph S.; Shull, J. Michael

    1994-01-01

    In the standard model for H I Lyman-limit (LL) quasar absorption-line systems, the absorbing matter is galactic disk and halo gas, heated and photoionized by the metagalactic radiation field produced by active galaxies. In recent Hubble Space Telescope (HST) observations (Reimers et al. 1992; Vogel & Reimers 1993; Reimers & Vogel 1993) of LL systems along the line of sight to the quasar HS 1700+6416, surprisingly high He I/H I ratios and a wide distribution of column densities of C, N, and O ions are deduced from extreme ultraviolet absorption lines. We show that these observations are incompatible with photoionization equilibrium by a single metagalactic ionizing background. We argue that these quasar absorption systems possess a multiphase interstellar medium similar to that of our Galaxy, in which extended hot, collisionally ionized gas is responsible for some or all of the high ionization stages of heavy elements. From the He/H ratios we obtain -4.0 ≤ log U ≤ -3.0, while the CNO ions are consistent with hot gas in collisional ionization equilibrium at log T = 5.3 and (O/H) = -1.6. The supernova rate necessary to produce these heavy elements and maintain the hot-gas energy budget of approximately 10^41.5 ergs/s is approximately 10^-2/yr, similar to that which maintains the 'three-phase' interstellar medium in our own Galaxy. As a consequence of the change in interpretation from photoionized gas to a multiphase medium, the derived heavy-element abundances (e.g., O/C) of these systems are open to question owing to substantial ionization corrections for unseen C V in the hot phase. The metal-line ratios may also lead to erroneous diagnostics of the shape of the metagalactic ionizing spectrum and the ionization parameter of the absorbers.

  13. An insight into the photodynamic approach versus copper formulations in the control of Pseudomonas syringae pv. actinidiae in kiwi plants.

    PubMed

    Jesus, Vânia; Martins, Diana; Branco, Tatiana; Valério, Nádia; Neves, Maria G P M S; Faustino, Maria A F; Reis, Luís; Barreal, Esther; Gallego, Pedro P; Almeida, Adelaide

    2018-02-14

    In the last decade, the worldwide production of kiwi fruit has been heavily affected by Pseudomonas syringae pv. actinidiae (Psa), a phytopathogenic bacterium; this has led to severe economic losses that are seriously affecting the kiwi fruit trade. The available treatments for this disease are still scarce, with the most common involving frequently spraying the orchards with copper derivatives, in particular cuprous oxide (Cu2O). However, these copper formulations should be avoided due to their high toxicity; therefore, it is essential to search for new approaches for controlling Psa. Antimicrobial photodynamic therapy (aPDT) may be an alternative approach to inactivate Psa. aPDT uses a photosensitizer molecule (PS) that absorbs light and, by transferring the excess energy or electrons to molecular oxygen, forms highly reactive oxygen species (ROS) that can affect different molecular targets, making the development of microbial resistance very unlikely. The aim of the present study was to evaluate the effectiveness of aPDT for photoinactivating Psa, using the porphyrin Tetra-Py(+)-Me and different light intensities. The degree of inactivation of Psa was assessed using the PS at 5.0 μM under low irradiance (4.0 mW cm^-2). Afterward, ex vivo experiments using artificially contaminated kiwi leaves were conducted with the PS at 50 μM under 150 mW cm^-2 and under sunlight irradiation. A reduction of 6 log in the in vitro assays was observed after 90 min of irradiation. In the ex vivo tests, the decrease was lower: approximately a 1.8 log reduction at an irradiance of 150 mW cm^-2, 1.2 log at 4.0 mW cm^-2, and 1.5 log under solar radiation. However, after three successive cycles of treatment under 150 mW cm^-2, a 4 log inactivation was achieved. No negative effects were observed on the leaves after treatment. Assays using Cu2O were also performed, at the concentration recommended by law (50 g hL^-1) and at a concentration 10 times lower; at both concentrations Psa was efficiently inactivated (5 log inactivation) after a few minutes of treatment, but negative effects were observed on the leaves after treatment.
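
    One quantity implicit in these irradiation conditions is the delivered light dose (fluence), the product of irradiance and exposure time; a small sketch using the exposure settings reported above:

        def fluence_J_per_cm2(irradiance_mW_cm2, minutes):
            # Delivered light dose: irradiance (mW/cm^2) x exposure time (s) -> J/cm^2.
            return irradiance_mW_cm2 * 1e-3 * minutes * 60.0

        print(fluence_J_per_cm2(4.0, 90))    # in vitro: ~21.6 J/cm^2 over 90 min
        print(fluence_J_per_cm2(150.0, 90))  # ex vivo: ~810 J/cm^2 over 90 min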

  14. Karhunen Loève approximation of random fields by generalized fast multipole methods

    NASA Astrophysics Data System (ADS)

    Schwab, Christoph; Todor, Radu Alexandru

    2006-09-01

    KL approximation of a possibly instationary random field a(ω, x) ∈ L^2(Ω, dP; L^∞(D)) subject to prescribed mean field E_a(x) = ∫ a(ω, x) dP(ω) and covariance V_a(x, x') = ∫ (a(ω, x) - E_a(x))(a(ω, x') - E_a(x')) dP(ω) in a polyhedral domain D ⊂ R^d is analyzed. We show how, for stationary covariances V_a(x, x') = g_a(|x - x'|) with g_a(z) analytic outside of z = 0, an M-term approximate KL-expansion a_M(ω, x) of a(ω, x) can be computed in log-linear complexity. The approach applies in arbitrary domains D and for nonseparable covariances C_a. It involves Galerkin approximation of the KL eigenvalue problem by discontinuous finite elements of degree p ≥ 0 on a quasiuniform, possibly unstructured mesh of width h in D, plus a generalized fast multipole accelerated Krylov eigensolver. The approximate KL-expansion a_M(x, ω) of a(x, ω) has accuracy O(exp(-b M^(1/d))) if g_a is analytic at z = 0 and accuracy O(M^(-k/d)) if g_a is C^k at zero. It is obtained in O(M N (log N)^b) operations, where N = O(h^(-d)).
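
    The structure of the underlying KL eigenvalue problem can be illustrated with a dense one-dimensional sketch (Python/NumPy); the exponential covariance, grid, and truncation order below are arbitrary choices, and the O(N^3) eigendecomposition is precisely what the paper's generalized-fast-multipole-accelerated Krylov eigensolver avoids:

        import numpy as np

        x = np.linspace(0.0, 1.0, 400)                      # 1-D grid on D = (0, 1)
        C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.1)  # stationary covariance g(|x - x'|)
        w = x[1] - x[0]                                     # quadrature weight (uniform grid)

        # Nystrom/Galerkin-type discretization of the covariance eigenproblem:
        evals, evecs = np.linalg.eigh(w * C)
        order = np.argsort(evals)[::-1]
        M = 20
        lam = evals[order[:M]]
        phi = evecs[:, order[:M]] / np.sqrt(w)              # L2-normalized eigenfunctions

        # One zero-mean M-term KL realization: a_M(x) = sum_m sqrt(lam_m) xi_m phi_m(x)
        xi = np.random.default_rng(0).standard_normal(M)
        a_M = phi @ (np.sqrt(lam) * xi)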

  15. Fracture identification based on remote detection acoustic reflection logging

    NASA Astrophysics Data System (ADS)

    Zhang, Gong; Li, Ning; Guo, Hong-Wei; Wu, Hong-Liang; Luo, Chao

    2015-12-01

    Fracture identification is important for the evaluation of carbonate reservoirs. However, conventional logging equipment has a small depth of investigation and cannot detect rock fractures more than three meters away from the borehole. Remote acoustic logging uses phase-controlled array transmitting and long sound probes that increase the depth of investigation. The interpretation of logging data with respect to fractures is typically guided by practical experience rather than theory and is often ambiguous. We use remote acoustic reflection logging data, high-order finite-difference approximations in the forward modeling, and prestack reverse-time migration to image fractures. First, we perform forward modeling of the fracture responses as a function of the fracture-borehole wall distance, aperture, and dip angle. Second, we extract the energy intensity within the imaging area to determine whether the fracture can be identified as the formation velocity is varied. Finally, we evaluate the effect of the fracture-borehole distance, fracture aperture, and dip angle on fracture identification.
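
    A toy version of the forward-modeling step is a one-dimensional acoustic finite-difference simulation with a single impedance contrast standing in for a reflector; the paper's high-order, multi-dimensional scheme and prestack reverse-time migration go well beyond this sketch:

        import numpy as np

        nx, nt, dx, dt = 600, 1500, 2.0, 4e-4   # grid and time stepping (CFL ~ 0.9)
        v = np.full(nx, 3000.0)
        v[400:] = 4500.0                        # velocity contrast standing in for a fracture zone
        src = 50
        c = (v * dt / dx) ** 2

        p_prev = np.zeros(nx)
        p = np.zeros(nx)
        trace = np.zeros(nt)                    # record at the source: reflections return here
        for it in range(nt):
            lap = np.zeros(nx)
            lap[1:-1] = p[2:] - 2.0 * p[1:-1] + p[:-2]   # 2nd-order spatial stencil
            p_next = 2.0 * p - p_prev + c * lap          # 2nd-order time update
            p_next[src] += np.exp(-(((it * dt) - 0.05) / 0.01) ** 2)  # Gaussian source wavelet
            p_prev, p = p, p_next
            trace[it] = p[src]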

  16. Log-amplitude statistics for Beck-Cohen superstatistics

    NASA Astrophysics Data System (ADS)

    Kiyono, Ken; Konno, Hidetoshi

    2013-05-01

    As a possible generalization of Beck-Cohen superstatistical processes, we study non-Gaussian processes with temporal heterogeneity of local variance. To characterize the variance heterogeneity, we define log-amplitude cumulants and log-amplitude autocovariance and derive closed-form expressions of the log-amplitude cumulants for χ2, inverse χ2, and log-normal superstatistical distributions. Furthermore, we show that χ2 and inverse χ2 superstatistics with degree 2 are closely related to an extreme value distribution, called the Gumbel distribution. In these cases, the corresponding superstatistical distributions result in the q-Gaussian distribution with q=5/3 and the bilateral exponential distribution, respectively. Thus, our finding provides a hypothesis that the asymptotic appearance of these two special distributions may be explained by a link with the asymptotic limit distributions involving extreme values. In addition, as an application of our approach, we demonstrated that non-Gaussian fluctuations observed in a stock index futures market can be well approximated by the χ2 superstatistical distribution with degree 2.
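
    Empirically, such log-amplitude statistics can be estimated from a zero-mean series x as sample cumulants of ln|x|; a minimal sketch (Python, using SciPy's unbiased k-statistics) of that estimator, offered here only as one practical proxy for the cumulants defined in the paper:

        import numpy as np
        from scipy import stats

        def log_amplitude_cumulants(x, n_max=4):
            # Sample cumulants of ln|x|; for Gaussian x these take fixed values,
            # and departures reflect the heterogeneity of the local variance.
            ln_abs = np.log(np.abs(x[x != 0]))
            return [stats.kstat(ln_abs, k) for k in range(1, n_max + 1)]

        # Illustrative check on plain Gaussian noise:
        x = np.random.default_rng(0).standard_normal(100_000)
        print(log_amplitude_cumulants(x))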

  17. Early addition of topical corticosteroids in the treatment of bacterial keratitis.

    PubMed

    Ray, Kathryn J; Srinivasan, Muthiah; Mascarenhas, Jeena; Rajaraman, Revathi; Ravindran, Meenakshi; Glidden, David V; Oldenburg, Catherine E; Sun, Catherine Q; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M

    2014-06-01

    Scarring from bacterial keratitis remains a leading cause of visual loss. To determine whether topical corticosteroids are beneficial as an adjunctive therapy for bacterial keratitis if given early in the course of infection. The Steroids for Corneal Ulcers Trial (SCUT) was a randomized, double-masked, placebo-controlled trial that overall found no effect of adding topical corticosteroids to topical moxifloxacin hydrochloride in bacterial keratitis. Here, we assess the timing of administration of corticosteroids in a subgroup analysis of the SCUT. We define earlier administration of corticosteroids (vs placebo) as addition after 2 to 3 days of topical antibiotics and later as addition after 4 or more days of topical antibiotics. We assess the effect of topical corticosteroids (vs placebo) on 3-month best spectacle-corrected visual acuity in patients who received corticosteroids or placebo earlier vs later. Further analyses were performed for subgroups of patients with non-Nocardia keratitis and those with no topical antibiotic use before enrollment. Patients treated with topical corticosteroids as adjunctive therapy within 2 to 3 days of antibiotic therapy had approximately 1-line better visual acuity at 3 months than did those given placebo (-0.11 logMAR; 95% CI, -0.20 to -0.02 logMAR; P = .01). In patients who had 4 or more days of antibiotic therapy before corticosteroid treatment, the effect was not significant; patients given corticosteroids had 1-line worse visual acuity at 3 months compared with those in the placebo group (0.10 logMAR; 95% CI, -0.02 to 0.23 logMAR; P = .14). Patients with non-Nocardia keratitis and those having no topical antibiotic use before the SCUT enrollment showed significant improvement in best spectacle-corrected visual acuity at 3 months if corticosteroids were administered earlier rather than later. There may be a benefit with adjunctive topical corticosteroids if application occurs earlier in the course of bacterial corneal ulcers.

  18. Short-term ECG recording for the identification of cardiac autonomic neuropathy in people with diabetes mellitus

    NASA Astrophysics Data System (ADS)

    Jelinek, Herbert F.; Pham, Phuong; Struzik, Zbigniew R.; Spence, Ian

    2007-07-01

    Diabetes mellitus (DM) is a serious and increasing health problem worldwide. Compared to non-diabetics, patients experience an increased risk of all cardiovascular diseases, including dysfunctional neural control of the heart. Poor diagnoses of cardiac autonomic neuropathy (CAN) may result in increased incidence of silent myocardial infarction and ischaemia, which can lead to sudden death. Traditionally the Ewing battery of tests is used to identify CAN. The purpose of this study is to examine the usefulness of heart rate variability (HRV) analyses of short-term ECG recordings as a method for detecting CAN. HRV may be able to identify asymptomatic individuals, which the Ewing battery is not able to do. Several HRV parameters are assessed, including time and frequency domain, as well as nonlinear parameters. Eighteen out of thirty-eight individuals with diabetes were positive for two or more of the Ewing battery of tests indicating CAN. Approximate Entropy (ApEn), log normalized total power (LnTP) and log normalized high frequency (LnHF) power demonstrate a significant difference at p < 0.05 between CAN+ and CAN-. This indicates that nonlinear scaling parameters are able to identify people with cardiac autonomic neuropathy in short ECG recordings. Our study paves the way to assess the utility of nonlinear parameters in identifying asymptomatic CAN.

  19. Heterogeneity of fluvial-deltaic reservoirs in the Appalachian basin: A case study from a Lower Mississippian oil field in central West Virginia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohn, M.E.; McDowell, R.R.; Matchen, D.L.

    1997-06-01

    Since discovery in 1924, Granny Creek field in central West Virginia has experienced several periods of renewed drilling for oil in a fluvial-deltaic sandstone in the Lower Mississippian Price Formation. Depositional and diagenetic features leading to reservoir heterogeneity include highly variable grain size, thin shale and siltstone beds, and zones containing large quantities of calcite, siderite, or quartz cement. Electrofacies defined through cluster analysis of wireline log responses corresponded approximately to facies observed in core. Three-dimensional models of porosity computed from density logs showed that zones of relatively high porosity were discontinuous across the field. The regression of core permeability on core porosity is statistically significant, and differs for each electrofacies. Zones of high permeability estimated from porosity and electrofacies tend to be discontinuous and aligned roughly north-south. Cumulative oil production varies considerably between adjacent wells, and corresponds very poorly with trends in porosity and permeability. Original oil in place, estimated for each well from reservoir thickness, porosity, water saturation, and an assumed value for drainage radius, is highly variable in the southern part of the field, which is characterized by relatively complex interfingering of electrofacies and similar variability in porosity and permeability.

  20. Peats from West Kalimantan Province, Indonesia: The Distribution, Formation, Disturbances, Utilization and Conservation

    NASA Astrophysics Data System (ADS)

    Anshari, G. Z.

    2011-12-01

    A major portion of tropical peats, approximately between 180,000 and 210,000 km2, occurs in Indonesia. Peat is a water body that preserves and stores enormous organic Carbon of dead biomass vegetation. In a natural state, peat helps to maintain the Carbon balance, the hydrological cycle, and the supply of dissolved and particulate organic matter into adjacent waters. Peat disturbances drive the change from a Carbon sink into a Carbon source. This paper aims to discuss the variability of tropical peats and peat degradation in West Kalimantan Province. The discussion covers extent and formation, biodiversity, Carbon and water storage, major properties, utilization, peat disturbances (i.e. logging, forest conversion, drainage effects, and recurrent peat fires), and peat conservation. Management options for reducing peat fires and developing sustainable peat utilization are also explored. Data were collected from both coastal and inland peats in West Kalimantan Province. This paper concludes that the degradation of tropical peats in Indonesia is strongly associated with anthropogenic fires, peat forest conversion, and logging. To reduce the speed of peat degradation, the current utilization of peats needs to be more intensive than extensive, and water table drop must be prevented by managing the excessive drainage that leads to a substantial decline of moisture in the upper peat layer, leaving it dry and flammable.

  1. Disinfection of a probe used in ultrasound-guided prostate biopsy.

    PubMed

    Rutala, William A; Gergen, Maria F; Weber, David J

    2007-08-01

    Transrectal ultrasound (TRUS)-guided prostate biopsies are among the most common outpatient diagnostic procedures in urology clinics and carry the risk of introducing pathogens that may lead to infection. To investigate the effectiveness of procedures for disinfecting a probe used in ultrasound-guided prostate biopsy. The effectiveness of disinfection was determined by inoculating 10^7 colony forming units (cfu) of Pseudomonas aeruginosa at the following 3 sites on the probe: the interior lumen of the biopsy needle guide, the outside surface of the biopsy needle guide, and the interior lumen of the ultrasound probe where the needle guide passes through the transducer. Each site was investigated separately. After inoculation, the probe was immersed in 2% glutaraldehyde for 20 minutes and then assessed for the level of microbial contamination. The results demonstrated that disinfection (ie, a reduction in bacterial load of greater than 7 log10 cfu) could be achieved if the needle guide was removed from the probe. However, if the needle guide was left in the probe channel during immersion in 2% glutaraldehyde, disinfection was not achieved (ie, the reduction was approximately 1 log10 cfu). Recommendations for probe disinfection are provided and include disassembling the device and immersing the probe and the needle guide separately in a high-level disinfectant.
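
    For readers unfamiliar with the log10-reduction metric used throughout these records, a minimal illustration with the numbers reported above (the helper function is ours):

    ```python
    import math

    def log10_reduction(before_cfu: float, after_cfu: float) -> float:
        """Log10 reduction in bacterial load."""
        return math.log10(before_cfu) - math.log10(after_cfu)

    # needle guide removed: >7-log10 reduction (10^7 cfu down to ~1 cfu)
    print(log10_reduction(1e7, 1))    # 7.0
    # needle guide left in the channel: only ~1-log10 reduction
    print(log10_reduction(1e7, 1e6))  # 1.0
    ```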

  2. Mutagenesis of a bacteriophage lytic enzyme PlyGBS significantly increases its antibacterial activity against group B streptococci.

    PubMed

    Cheng, Qi; Fischetti, Vincent A

    2007-04-01

    Group B streptococci (GBS) are the leading cause of neonatal meningitis and sepsis worldwide. Intrapartum antibiotic prophylaxis (IAP) is the current prevention strategy given to pregnant women with confirmed vaginal GBS colonization. Due to antibiotic resistance identified in GBS, we previously developed another strategy using a bacteriophage lytic enzyme, PlyGBS, to reduce vaginal GBS colonization. In this study, various DNA mutagenesis methods were explored to produce PlyGBS mutants with increased lytic activity against GBS. Several hyperactive mutants were identified that contain only the endopeptidase domain found in the N-terminal region of PlyGBS and represent only about one-third of the wild-type PlyGBS in length. Significantly, these mutants not only have 18-28-fold increases in specific activity compared to PlyGBS, but they also have a similar activity spectrum against several streptococcal species. One of the hyperactive mutants, PlyGBS90-1, reduced GBS colonization from >5 logs of growth per mouse to <50 colony-forming units (cfu) 4 h post treatment (approximately a 4-log reduction) using a single dose in a mouse vaginal model. A reduction in GBS colonization before delivery should significantly reduce neonatal GBS infection, providing a safe alternative to IAP.

  3. Chronic Myeloid Leukemia

    MedlinePlus

    ... medication. Protein Synthesis Inhibitor Therapy. Omacetaxine (Synribo®), a non-TKI chemotherapy drug, is approved for chronic and ... as BCR-ABL 10%. This reduction is approximately equivalent to a major cytogenetic response. A 2-log ...

  4. R/S analysis of reaction time in Neuron Type Test for human activity in civil aviation

    NASA Astrophysics Data System (ADS)

    Zhang, Hong-Yan; Kang, Ming-Cui; Li, Jing-Qiang; Liu, Hai-Tao

    2017-03-01

    Human factors have become the most serious problem leading to accidents in civil aviation, which motivates the design and analysis of the Neuron Type Test (NTT) system to explore the intrinsic properties and patterns behind the behaviors of professionals and students in civil aviation. In the experiment, normal practitioners' reaction time sequences, collected from the NTT, approximately exhibit a log-normal distribution. We apply the χ2 test to compute the goodness-of-fit by transforming the time sequences with the Box-Cox transformation to cluster practitioners. The long-term correlation of each individual practitioner's time sequence is represented by the Hurst exponent via Rescaled Range Analysis, also known as Range/Standard deviation (R/S) Analysis. Differences in the Hurst exponent suggest the existence of distinct collective behaviors and intrinsic patterns of human factors in civil aviation.
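
    The record does not include an implementation, but a minimal rescaled-range estimator of the Hurst exponent along the lines described might look like the following sketch (the window sizes and the synthetic test series are our assumptions):

    ```python
    import numpy as np

    def hurst_rs(x, min_window=8):
        """Estimate the Hurst exponent of a 1-D series by R/S analysis."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sizes, rs_means = [], []
        size = min_window
        while size <= n // 2:
            rs = []
            for start in range(0, n - size + 1, size):
                w = x[start:start + size]
                z = np.cumsum(w - w.mean())   # cumulative deviation from the window mean
                r = z.max() - z.min()         # range of the cumulative deviations
                s = w.std(ddof=1)             # window standard deviation
                if s > 0:
                    rs.append(r / s)
            sizes.append(size)
            rs_means.append(np.mean(rs))
            size *= 2
        # Hurst exponent = slope of log(R/S) against log(window size)
        slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
        return slope

    # uncorrelated noise should give H close to 0.5
    print(hurst_rs(np.random.default_rng(1).standard_normal(4096)))
    ```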

  5. Testing the Dose–Response Specification in Epidemiology: Public Health and Policy Consequences for Lead

    PubMed Central

    Rothenberg, Stephen J.; Rothenberg, Jesse C.

    2005-01-01

    Statistical evaluation of the dose–response function in lead epidemiology is rarely attempted. Economic evaluation of health benefits of lead reduction usually assumes a linear dose–response function, regardless of the outcome measure used. We reanalyzed a previously published study, an international pooled data set combining data from seven prospective lead studies examining contemporaneous blood lead effect on IQ (intelligence quotient) of 7-year-old children (n = 1,333). We constructed alternative linear multiple regression models with linear blood lead terms (linear–linear dose response) and natural-log–transformed blood lead terms (log-linear dose response). We tested the two lead specifications for nonlinearity in the models, compared the two lead specifications for significantly better fit to the data, and examined the effects of possible residual confounding on the functional form of the dose–response relationship. We found that a log-linear lead–IQ relationship was a significantly better fit than was a linear–linear relationship for IQ (p = 0.009), with little evidence of residual confounding of included model variables. We substituted the log-linear lead–IQ effect in a previously published health benefits model and found that the economic savings due to U.S. population lead decrease between 1976 and 1999 (from 17.1 μg/dL to 2.0 μg/dL) was 2.2 times ($319 billion) that calculated using a linear–linear dose–response function ($149 billion). The Centers for Disease Control and Prevention action limit of 10 μg/dL for children fails to protect against most damage and economic cost attributable to lead exposure. PMID:16140626
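
    The linear-linear versus log-linear comparison can be sketched on synthetic data; the coefficients below are illustrative stand-ins, not the pooled-study estimates, and ordinary least squares replaces the full multiple-regression models:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    pb = rng.uniform(1.0, 30.0, 500)                      # blood lead, ug/dL (synthetic)
    iq = 100 - 2.7 * np.log(pb) + rng.normal(0, 5, 500)   # assumed log-linear truth

    def r_squared(y, X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid.var() / y.var()

    X_lin = np.column_stack([np.ones_like(pb), pb])          # linear-linear dose response
    X_log = np.column_stack([np.ones_like(pb), np.log(pb)])  # log-linear dose response
    print(f"linear-linear R^2: {r_squared(iq, X_lin):.3f}")
    print(f"log-linear    R^2: {r_squared(iq, X_log):.3f}")  # should fit better here
    ```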

  6. Exponential series approaches for nonparametric graphical models

    NASA Astrophysics Data System (ADS)

    Janofsky, Eric

    Markov Random Fields (MRFs) or undirected graphical models are parsimonious representations of joint probability distributions. This thesis studies high-dimensional, continuous-valued pairwise Markov Random Fields. We are particularly interested in approximating pairwise densities whose logarithm belongs to a Sobolev space. For this problem we propose the method of exponential series which approximates the log density by a finite-dimensional exponential family with the number of sufficient statistics increasing with the sample size. We consider two approaches to estimating these models. The first is regularized maximum likelihood. This involves optimizing the sum of the log-likelihood of the data and a sparsity-inducing regularizer. We then propose a variational approximation to the likelihood based on tree-reweighted, nonparametric message passing. This approximation allows for upper bounds on risk estimates, leverages parallelization and is scalable to densities on hundreds of nodes. We show how the regularized variational MLE may be estimated using a proximal gradient algorithm. We then consider estimation using regularized score matching. This approach uses an alternative scoring rule to the log-likelihood, which obviates the need to compute the normalizing constant of the distribution. For general continuous-valued exponential families, we provide parameter and edge consistency results. As a special case we detail a new approach to sparse precision matrix estimation which has statistical performance competitive with the graphical lasso and computational performance competitive with the state-of-the-art glasso algorithm. We then describe results for model selection in the nonparametric pairwise model using exponential series. The regularized score matching problem is shown to be a convex program; we provide scalable algorithms based on consensus alternating direction method of multipliers (ADMM) and coordinate-wise descent. We use simulations to compare our method to others in the literature as well as the aforementioned TRW estimator.

  7. Back in the saddle: large-deviation statistics of the cosmic log-density field

    NASA Astrophysics Data System (ADS)

    Uhlemann, C.; Codis, S.; Pichon, C.; Bernardeau, F.; Reimberg, P.

    2016-08-01

    We present a first principle approach to obtain analytical predictions for spherically averaged cosmic densities in the mildly non-linear regime that go well beyond what is usually achieved by standard perturbation theory. A large deviation principle allows us to compute the leading order cumulants of average densities in concentric cells. In this symmetry, the spherical collapse model leads to cumulant generating functions that are robust for finite variances and free of critical points when logarithmic density transformations are implemented. They yield in turn accurate density probability distribution functions (PDFs) from a straightforward saddle-point approximation valid for all density values. Based on this easy-to-implement modification, explicit analytic formulas for the evaluation of the one- and two-cell PDF are provided. The theoretical predictions obtained for the PDFs are accurate to a few per cent compared to the numerical integration, regardless of the density under consideration and in excellent agreement with N-body simulations for a wide range of densities. This formalism should prove valuable for accurately probing the quasi-linear scales of low-redshift surveys for arbitrary primordial power spectra.

  8. “Blogging” About Course Concepts: Using Technology for Reflective Journaling in a Communications Class

    PubMed Central

    Bouldin, Alicia S.; Holmes, Erin R.; Fortenberry, Michael L.

    2006-01-01

    Objective Web log technology was applied to a reflective journaling exercise in a communication course during the second-professional year at the University of Mississippi School of Pharmacy, to encourage students to reflect on course concepts and apply them to the environment outside the classroom, and to assess their communication performance. Design Two Web log entries per week were required for full credit. Web logs were evaluated at three points during the term. At the end of the course, students evaluated the assignment using a 2-page survey instrument. Assessment The assignment contributed to student learning and increased awareness level for approximately 40% of the class. Students had few complaints about the logistics of the assignment. Conclusion The Web log technology was a useful tool for reflective journaling in this communications course. Future versions of the assignment will benefit from student feedback from this initial experience. PMID:17136203

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neu, Mary Patricia

    The coordination chemistry and solution behavior of the toxic ions lead(II) and plutonium(IV, V, VI) have been investigated. The ligand pKa values and ligand-lead(II) stability constants of one hydroxamic acid and four thiohydroxamic acids were determined. Solution thermodynamic results indicate that thiohydroxamic acids are more acidic and slightly better lead chelators than hydroxamates, e.g., N-methylthioacetohydroxamic acid, pKa = 5.94, log β120 = 10.92; acetohydroxamic acid, pKa = 9.34, log β120 = 9.52. The syntheses of lead complexes of two bulky hydroxamate ligands are presented. The X-ray crystal structures show the lead hydroxamates are di-bridged dimers with irregular five-coordinate geometry about the metal atom and a stereochemically active lone pair of electrons. Molecular orbital calculations of a lead hydroxamate and a highly symmetric pseudo-octahedral lead complex were performed. The thermodynamic stability of plutonium(IV) complexes of the siderophore desferrioxamine B (DFO) and two octadentate derivatives of DFO was investigated using competition spectrophotometric titrations. The stability constant measured for the plutonium(IV) complex of DFO-methylterephthalamide is log β120 = 41.7. The solubility-limited speciation of 242Pu as a function of time in near-neutral carbonate solution was measured. Individual solutions of plutonium in a single oxidation state were added to individual solutions at pH = 6.0, T = 30.0, and 1.93 mM dissolved carbonate, and sampled over intervals up to 150 days. Plutonium solubility was measured, and speciation was investigated using laser photoacoustic spectroscopy and chemical methods.
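
    As a small worked example with the pKa values quoted above (the choice of pH 7.4 and the Henderson-Hasselbalch form are our additions), the deprotonated fraction of each chelator can be computed:

    ```python
    def frac_deprotonated(pH: float, pKa: float) -> float:
        """Henderson-Hasselbalch: fraction of a monoprotic acid present as its anion."""
        return 1.0 / (1.0 + 10.0 ** (pKa - pH))

    print(frac_deprotonated(7.4, 5.94))  # N-methylthioacetohydroxamic acid, ~0.97
    print(frac_deprotonated(7.4, 9.34))  # acetohydroxamic acid, ~0.01
    ```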

  10. The word frequency effect during sentence reading: A linear or nonlinear effect of log frequency?

    PubMed

    White, Sarah J; Drieghe, Denis; Liversedge, Simon P; Staub, Adrian

    2016-10-20

    The effect of word frequency on eye movement behaviour during reading has been reported in many experimental studies. However, the vast majority of these studies compared only two levels of word frequency (high and low). Here we assess whether the effect of log word frequency on eye movement measures is linear, in an experiment in which a critical target word in each sentence was at one of three approximately equally spaced log frequency levels. Separate analyses treated log frequency as a categorical or a continuous predictor. Both analyses showed only a linear effect of log frequency on the likelihood of skipping a word, and on first fixation duration. Ex-Gaussian analyses of first fixation duration showed similar effects on distributional parameters in comparing high- and medium-frequency words, and medium- and low-frequency words. Analyses of gaze duration and the probability of a refixation suggested a nonlinear pattern, with a larger effect at the lower end of the log frequency scale. However, the nonlinear effects were small, and Bayes Factor analyses favoured the simpler linear models for all measures. The possible roles of lexical and post-lexical factors in producing nonlinear effects of log word frequency during sentence reading are discussed.
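
    The design — three equally spaced log-frequency levels with linear and nonlinear fits compared by an information criterion — can be mimicked on synthetic data (all numbers below are assumptions; BIC stands in for the study's Bayes Factor analysis):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    logf = rng.choice([1.0, 2.5, 4.0], size=300)            # three log-frequency levels
    dur = 250 - 12 * logf + rng.normal(0, 20, 300)          # assumed linear effect (ms)

    def bic(y, X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(((y - X @ beta) ** 2).sum())
        n, k = X.shape
        return n * np.log(rss / n) + k * np.log(n)

    X_linear = np.column_stack([np.ones_like(logf), logf])           # linear in log f
    X_quad = np.column_stack([np.ones_like(logf), logf, logf ** 2])  # adds curvature
    print(bic(dur, X_linear), bic(dur, X_quad))  # lower BIC favours the linear model
    ```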

  11. Log-Log Convexity of Type-Token Growth in Zipf's Systems

    NASA Astrophysics Data System (ADS)

    Font-Clos, Francesc; Corral, Álvaro

    2015-06-01

    It is traditionally assumed that Zipf's law implies the power-law growth of the number of different elements with the total number of elements in a system—the so-called Heaps' law. We show that a careful definition of Zipf's law leads to the violation of Heaps' law in random systems, with growth curves that have a convex shape in log-log scale. These curves fulfill universal data collapse that only depends on the value of Zipf's exponent. We observe that real books behave very much in the same way as random systems, despite the presence of burstiness in word occurrence. We advance an explanation for this unexpected correspondence.

  13. APPROXIMATION AND ESTIMATION OF s-CONCAVE DENSITIES VIA RÉNYI DIVERGENCES

    PubMed Central

    Han, Qiyang; Wellner, Jon A.

    2017-01-01

    In this paper, we study the approximation and estimation of s-concave densities via Rényi divergence. We first show that the approximation of a probability measure Q by an s-concave density exists and is unique via the procedure of minimizing a divergence functional proposed by [Ann. Statist. 38 (2010) 2998–3027] if and only if Q admits full-dimensional support and a first moment. We also show continuity of the divergence functional in Q: if Qn → Q in the Wasserstein metric, then the projected densities converge in weighted L1 metrics and uniformly on closed subsets of the continuity set of the limit. Moreover, directional derivatives of the projected densities also enjoy local uniform convergence. This contains both on-the-model and off-the-model situations, and entails strong consistency of the divergence estimator of an s-concave density under mild conditions. One interesting and important feature for the Rényi divergence estimator of an s-concave density is that the estimator is intrinsically related with the estimation of log-concave densities via maximum likelihood methods. In fact, we show that for d = 1 at least, the Rényi divergence estimators for s-concave densities converge to the maximum likelihood estimator of a log-concave density as s ↗ 0. The Rényi divergence estimator shares similar characterizations as the MLE for log-concave distributions, which allows us to develop pointwise asymptotic distribution theory assuming that the underlying density is s-concave. PMID:28966410

  14. Exploring the Chemical Link Between Local Ellipticals and Their High-Redshift Progenitors

    NASA Technical Reports Server (NTRS)

    Leja, Joel; Van Dokkum, Pieter G.; Momcheva, Ivelina; Brammer, Gabriel; Skelton, Rosalind E.; Whitaker, Katherine E.; Andrews, Brett H.; Franx, Marijn; Kriek, Mariska; Van Der Wel, Arjen

    2013-01-01

    We present Keck/MOSFIRE K-band spectroscopy of the first mass-selected sample of galaxies at z approximately 2.3. Targets are selected from the 3D-Hubble Space Telescope Treasury survey. The six detected galaxies have a mean [N II]λ6584/Hα ratio of 0.27 +/- 0.01, with a small standard deviation of 0.05. This mean value is similar to that of UV-selected galaxies of the same mass. The mean gas-phase oxygen abundance inferred from the [N II]/Hα ratios depends on the calibration method, and ranges from 12+log(O/H)gas = 8.57 for the Pettini & Pagel calibration to 12+log(O/H)gas = 8.87 for the Maiolino et al. calibration. Measurements of the stellar oxygen abundance in nearby quiescent galaxies with the same number density indicate 12+log(O/H)stars = 8.95, similar to the gas-phase abundances of the z approximately 2.3 galaxies if the Maiolino et al. calibration is used. This suggests that these high-redshift star-forming galaxies may be progenitors of today's massive early-type galaxies. The main uncertainties are the absolute calibration of the gas-phase oxygen abundance and the incompleteness of the z approximately 2.3 sample: the galaxies with detected Hα tend to be larger and have higher star formation rates than the galaxies without detected Hα, and we may still be missing the most dust-obscured progenitors.

  15. Drilling, construction, geophysical log data, and lithologic log for boreholes USGS 142 and USGS 142A, Idaho National Laboratory, Idaho

    USGS Publications Warehouse

    Twining, Brian V.; Hodges, Mary K.V.; Schusler, Kyle; Mudge, Christopher

    2017-07-27

    Starting in 2014, the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, drilled and constructed boreholes USGS 142 and USGS 142A for stratigraphic framework analyses and long-term groundwater monitoring of the eastern Snake River Plain aquifer at the Idaho National Laboratory in southeast Idaho. Borehole USGS 142 initially was cored to collect rock and sediment core, then re-drilled to complete construction as a screened water-level monitoring well. Borehole USGS 142A was drilled and constructed as a monitoring well after construction problems with borehole USGS 142 prevented access to the upper 100 feet (ft) of the aquifer. Boreholes USGS 142 and USGS 142A are separated by about 30 ft and have similar geology and hydrologic characteristics. Groundwater was first measured near 530 feet below land surface (ft BLS) at both borehole locations. Water levels measured through piezometers, separated by almost 1,200 ft, in borehole USGS 142 indicate upward hydraulic gradients at this location. Following construction and data collection, screened water-level access lines were placed in boreholes USGS 142 and USGS 142A to allow for recurring water-level measurements. Borehole USGS 142 was cored continuously, starting at the first basalt contact (about 4.9 ft BLS) to a depth of 1,880 ft BLS. Excluding surface sediment, recovery of basalt, rhyolite, and sediment core at borehole USGS 142 was approximately 89 percent, or 1,666 ft of total core recovered. Based on visual inspection of core and geophysical data, material examined from 4.9 to 1,880 ft BLS in borehole USGS 142 consists of approximately 45 basalt flows, 16 significant sediment and (or) sedimentary rock layers, and rhyolite welded tuff. Rhyolite was encountered at approximately 1,396 ft BLS. Sediment layers comprise a large percentage of the borehole between 739 and 1,396 ft BLS, with grain sizes ranging from clay and silt to cobble size. Sedimentary rock layers had calcite cement. Basalt flows ranged in thickness from about 2 to 100 ft, varied from highly fractured to dense, and ranged in texture from massive to diktytaxitic to scoriaceous. Geophysical logs were collected on completion of drilling at boreholes USGS 142 and USGS 142A. Geophysical logs were examined with available core material to describe basalt, sediment and sedimentary rock layers, and rhyolite. Natural gamma logs were used to confirm sediment layer thickness and location; neutron logs were used to examine basalt flow units and changes in hydrogen content; gamma-gamma density logs were used to describe general changes in rock properties; and temperature logs were used to understand hydraulic gradients for deeper sections of borehole USGS 142. Gyroscopic deviation was measured to record deviation from true vertical at all depths in boreholes USGS 142 and USGS 142A.

  16. A clinical trial to evaluate the effects of flumethrin or ivermectin treatment on hemoparasites, gastrointestinal parasites, conception and daily weight gain in a dairy farm in Japan.

    PubMed

    Yamane, I; Arai, S; Nakamura, Y; Hisashi, M; Fukazawa, Y; Onuki, T

    2000-02-01

    A clinical trial was performed to compare the effects of flumethrin and ivermectin treatments of grazing heifers at one farm in central Japan. Sixty-four heifers were randomly allocated into two groups. Flumethrin (1 mg/kg, pour-on) was applied approximately once every 3 weeks to heifers in one group, and heifers in the second group were injected approximately once every month with ivermectin (200 microg/kg; id). Between groups, no significant differences were detected in the proportions of animals that showed parasitemia of Theileria sergenti or in conception risks. Significantly lower average log-transformed nematode-egg counts and higher average daily weight gain were observed in the ivermectin-treated group. Animals with higher body weight at the start of grazing and lower log-transformed total nematode-egg and coccidia-oocyst counts had higher odds of conceiving. Animals with ivermectin treatment, lower body weight at the start of grazing, and lower log-transformed coccidia-oocyst counts had higher daily weight gain. Ivermectin may be more useful on this farm because of the higher productivity of the cattle and the lower cost of its usage.

  17. Geophysical logs for selected wells in the Picher Field, northeast Oklahoma and southeast Kansas

    USGS Publications Warehouse

    Christenson, Scott C.; Thomas, Tom B.; Overton, Myles D.; Goemaat, Robert L.; Havens, John S.

    1991-01-01

    The Roubidoux aquifer in northeastern Oklahoma is used extensively as a source of water for public supplies, commerce, industry, and rural water districts. The Roubidoux aquifer may be subject to contamination from abandoned lead and zinc mines of the Picher field. Water in flooded underground mines contains large concentrations of iron, zinc, cadmium, and lead. The contaminated water may migrate from the mines to the Roubidoux aquifer through abandoned water wells in the Picher field. In late 1984, the Oklahoma Water Resources Board began to locate abandoned wells that might be serving as conduits for the migration of contaminants from the abandoned mines. These wells were cleared of debris and plugged. A total of 66 wells had been located, cleared, and plugged by July 1985. In cooperation with the Oklahoma Water Resources Board, the U.S. Geological Survey took advantage of the opportunity to obtain geophysical data in the study area and provide the Oklahoma Water Resources Board with data that might be useful during the well-plugging operation. Geophysical logs obtained by the U.S. Geological Survey are presented in this report. The geophysical logs include hole diameter, normal, single-point resistance, fluid resistivity, natural-gamma, gamma-gamma, and neutron logs. Depths logged range from 145 to 1,344 feet.

  18. Novel denture-cleaning system based on hydroxyl radical disinfection.

    PubMed

    Kanno, Taro; Nakamura, Keisuke; Ikai, Hiroyo; Hayashi, Eisei; Shirato, Midori; Mokudai, Takayuki; Iwasawa, Atsuo; Niwano, Yoshimi; Kohno, Masahiro; Sasaki, Keiichi

    2012-01-01

    The purpose of this study was to evaluate a new denture-cleaning device using hydroxyl radicals generated from photolysis of hydrogen peroxide (H2O2). Electron spin resonance analysis demonstrated that the yield of hydroxyl radicals increased with the concentration of H2O2 and the light irradiation time. Staphylococcus aureus, Pseudomonas aeruginosa, and methicillin-resistant S. aureus were killed within 10 minutes with a > 5-log reduction when treated with photolysis of 500 mM H2O2; Candida albicans was killed within 30 minutes with a > 4-log reduction with photolysis of 1,000 mM H2O2. The clinical test demonstrated that the device could effectively reduce microorganisms in denture plaque by approximately 7-log order within 20 minutes.

  19. Logging-related increases in stream density in a northern California watershed

    Treesearch

    Matthew S. Buffleben

    2012-01-01

    Although many sediment budgets estimate the effects of logging, few have considered the potential impact of timber harvesting on stream density. Failure to consider changes in stream density could lead to large errors in the sediment budget, particularly in the allocation between natural and anthropogenic sources of sediment. This study...

  1. Pasteurization of shell eggs using radio frequency heating

    DOE PAGES

    Geveke, David J.; Bigley, Andrew B. W.; Brunkhorst, Christopher D.

    2016-08-21

    The USDA-FSIS estimates that pasteurization of all shell eggs in the U.S. would reduce the annual number of illnesses by more than 110,000. However, less than 3% of shell eggs are commercially pasteurized. One of the main reasons for this is that the commercial hot water process requires as much as 60 min to complete. In the present study, a radio frequency (RF) apparatus was constructed, and a two-step process was developed that uses RF energy and hot water to pasteurize eggs in less than half the time. In order to select an appropriate RF generator, the impedance of shell eggs was measured in the frequency range of 10–70 MHz. The power density within the egg was modeled to prevent potential hotspots. Escherichia coli (ATCC 35218) was inoculated in the yolk to approximately 7.5 log CFU/ml. The combination process first heated the egg in 35.0 °C water for 3.5 min using 60 MHz RF energy. This resulted in the yolk being preferentially heated to 61 °C. Then, the egg was heated for an additional 20 min with 56.7 °C water. This two-step process reduced the population of E. coli by 6.5 log. The total time for the process was 23.5 min. By contrast, processing for 60 min was required to reduce the E. coli by 6.6 log using just hot water. The novel RF pasteurization process presented in this study was considerably faster than the existing commercial process. As a result, this should lead to an increase in the percentage of eggs being pasteurized, as well as a reduction of foodborne illnesses.
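
    Using only the figures reported above, the implied average inactivation rates of the two processes can be compared; this back-of-envelope calculation assumes an approximately log-linear kill curve:

    ```python
    # reported: RF + hot water gave 6.5-log in 23.5 min; hot water alone 6.6-log in 60 min
    rf_rate = 6.5 / 23.5   # log CFU/ml per minute
    hw_rate = 6.6 / 60.0
    print(f"RF two-step: {rf_rate:.2f} log/min; hot water only: {hw_rate:.2f} log/min")
    # time the hot-water process would need for the same 6.5-log reduction
    print(f"hot-water equivalent time: {6.5 / hw_rate:.0f} min vs 23.5 min")
    ```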

  2. Effect of sodium-alginate and laminaran on Salmonella Typhimurium infection in human enterocyte-like HT-29-Luc cells and BALB/c mice.

    PubMed

    Kuda, Takashi; Kosaka, Misa; Hirano, Shino; Kawahara, Miho; Sato, Masahiro; Kaneshima, Tai; Nishizawa, Makoto; Takahashi, Hajime; Kimura, Bon

    2015-07-10

    Brown algal polysaccharides such as alginate, polymers of uronic acids, and laminaran, beta-1,3 and 1,6-glucan, can be fermented by human intestinal microbiota. To evaluate the effects of these polysaccharides on infections caused by food poisoning pathogens, we investigated the adhesion and invasion of pathogens (Salmonella Typhimurium, Listeria monocytogenes and Vibrio parahaemolyticus) in human enterocyte-like HT-29-Luc cells and in infections caused in BALB/c mice. Both sodium alginate (Na-alginate) and laminaran (0.1% each) inhibited the adhesion of the pathogens to HT-29-Luc cells by approximately 70-90%. The invasion of S. Typhimurium was also inhibited, by approximately 70 and 80%, by Na-alginate and laminaran, respectively. We observed that incubation with Na-alginate for 18 h increased the transepithelial electrical resistance of HT-29-Luc monolayer cells. Four days after inoculation with 7 log CFU/mouse of S. Typhimurium, the faecal pathogen count in mice that were not fed polysaccharides (control mice) was about 6.5 log CFU/g, while the count in mice that were fed Na-alginate had decreased to 5.0 log CFU/g. The liver pathogen count, which was 4.1 log CFU/g in the control mice, was also decreased in mice that were fed Na-alginate. In contrast, the mice that were fed laminaran exhibited a more severe infection than that exhibited by control mice.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiedemeier, Heribert, E-mail: wiedeh@rpi.ed

    Correlations of computed Schottky constants (K_S = [V''_Zn][V_S^••]) with structural and thermodynamic properties showed linear dependences of log K_S on the lattice energies for the Zn-, Cd-, Hg-, Mg-, and Sr-chalcogenides and for the Na- and K-halides. These findings suggest a basic relation between the Schottky constants and the lattice energies for these families of compounds from different parts of the Periodic Table, namely ΔH°(T,L) = −(2.303 n R T log K_S) + 2.303 n R m_b + 2.303 n R T i_b, where ΔH°(T,L) is the experimental (Born-Haber) lattice energy (enthalpy), n is a constant approximately equal to the formal valence (charge) of the material, and m_b and i_b are the slope and intercept, respectively, of the intercept b (of the log K_S versus ΔH°(L) linear relation) plotted against the reciprocal temperature. The results of this work also provide an empirical correlation between the Gibbs free energy of vacancy formation and the lattice energy. Graphical abstract: For the Zn-chalcogenides, the quantities n and I_e are 2.007 and 650.3 kcal (2722 kJ), respectively. For the other groups of compounds, they are approximately equal to the formal valences and ionization energies of the metals: log K_S ≈ −(2.303 n R T)^(−1) (0.99 ΔH°(T,L) − I_e).
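
    The empirical correlation from the graphical abstract translates directly into code; the Zn-chalcogenide constants n = 2.007 and I_e = 650.3 kcal come from the record, while the example lattice enthalpy and temperature are hypothetical inputs:

    ```python
    R = 1.987e-3  # gas constant, kcal mol^-1 K^-1

    def log_Ks(dH_lattice: float, T: float, n: float = 2.007, Ie: float = 650.3) -> float:
        """log K_S ~= -(2.303 n R T)^-1 * (0.99 * dH_lattice - Ie)."""
        return -(0.99 * dH_lattice - Ie) / (2.303 * n * R * T)

    # hypothetical Born-Haber lattice enthalpy (kcal/mol) at T = 1000 K
    print(log_Ks(dH_lattice=850.0, T=1000.0))  # ~ -21: vacancies are very dilute
    ```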

  4. PHAGE FORMATION IN STAPHYLOCOCCUS MUSCAE CULTURES

    PubMed Central

    Price, Winston H.

    1949-01-01

    1. The total nucleic acid synthesized by normal and by infected S. muscae suspensions is approximately the same. This is true for either lag phase cells or log phase cells. 2. The amount of nucleic acid synthesized per cell in normal cultures increases during the lag period and remains fairly constant during log growth. 3. The amount of nucleic acid synthesized per cell by infected cells increases during the whole course of the infection. 4. Infected cells synthesize less RNA and more DNA than normal cells. The ratio of RNA/DNA is larger in lag phase cells than in log phase cells. 5. Normal cells release neither ribonucleic acid nor desoxyribonucleic acid into the medium. 6. Infected cells release both ribonucleic acid and desoxyribonucleic acid into the medium. The time and extent of release depend upon the physiological state of the cells. 7. Infected lag phase cells may or may not show an increased RNA content. They release RNA, but not DNA, into the medium well before observable cellular lysis and before any virus is liberated. At virus liberation, the cell RNA content falls to a value below that initially present, while DNA, which increased during infection falls to approximately the original value. 8. Infected log cells show a continuous loss of cell RNA and a loss of DNA a short time after infection. At the time of virus liberation the cell RNA value is well below that initially present and the cells begin to lyse. PMID:18139006

  5. WE-H-207A-03: The Universality of the Lognormal Behavior of [F-18]FLT PET SUV Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scarpelli, M; Eickhoff, J; Perlman, S

    Purpose: Log transforming [F-18]FDG PET standardized uptake values (SUVs) has been shown to lead to normal SUV distributions, which allows utilization of powerful parametric statistical models. This study identified the optimal transformation leading to normally distributed [F-18]FLT PET SUVs from solid tumors and offers an example of how normal distributions permit analysis of non-independent/correlated measurements. Methods: Forty patients with various metastatic diseases underwent up to six FLT PET/CT scans during treatment. Tumors were identified by a nuclear medicine physician and manually segmented. Average uptake was extracted for each patient, giving a global SUVmean (gSUVmean) for each scan. The Shapiro-Wilk test was used to test distribution normality. One-parameter Box-Cox transformations were applied to each of the six gSUVmean distributions, and the optimal transformation was found by selecting the parameter that maximized the Shapiro-Wilk test statistic. The relationship between gSUVmean and a serum biomarker (VEGF) collected at imaging timepoints was determined using a linear mixed effects model (LMEM), which accounted for correlated/non-independent measurements from the same individual. Results: Untransformed gSUVmean distributions were found to be significantly non-normal (p<0.05). The optimal transformation parameter had a value of 0.3 (95%CI: −0.4 to 1.6). Given that the optimal parameter was close to zero (which corresponds to a log transformation), the data were subsequently log transformed. All log transformed gSUVmean distributions were normally distributed (p>0.10 for all timepoints). Log transformed data were incorporated into the LMEM. VEGF serum levels significantly correlated with gSUVmean (p<0.001), revealing a log-linear relationship between SUVs and underlying biology. Conclusion: Failure to account for correlated/non-independent measurements can lead to invalid conclusions and motivated transformation to normally distributed SUVs. The log transformation was found to be close to optimal and sufficient for obtaining normally distributed FLT PET SUVs. These transformations allow utilization of powerful LMEMs when analyzing quantitative imaging metrics.
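
    A minimal re-creation of the transformation search described in the Methods, on synthetic SUV data (the lognormal generator, sample size and scan grid are assumptions):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    suv = rng.lognormal(mean=0.8, sigma=0.4, size=40)   # synthetic gSUVmean values

    # scan one-parameter Box-Cox transforms; keep the lambda that maximizes
    # the Shapiro-Wilk W statistic (lambda = 0 corresponds to the log transform)
    lambdas = np.linspace(-1.0, 2.0, 61)
    best = max(lambdas,
               key=lambda lam: stats.shapiro(stats.boxcox(suv, lmbda=lam)).statistic)
    print(f"optimal lambda ~ {best:.2f}")
    print(stats.shapiro(np.log(suv)))  # log-transformed data should pass normality
    ```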

  6. First Test of Stochastic Growth Theory for Langmuir Waves in Earth's Foreshock

    NASA Technical Reports Server (NTRS)

    Cairns, Iver H.; Robinson, P. A.

    1997-01-01

    This paper presents the first test of whether stochastic growth theory (SGT) can explain the detailed characteristics of Langmuir-like waves in Earth's foreshock. A period with unusually constant solar wind magnetic field is analyzed. The observed distributions P(logE) of wave fields E for two intervals with relatively constant spacecraft location (DIFF) are shown to agree well with the fundamental prediction of SGT, that P(logE) is Gaussian in log E. This stochastic growth can be accounted for semi-quantitatively in terms of standard foreshock beam parameters and a model developed for interplanetary type III bursts. Averaged over the entire period with large variations in DIFF, the P(logE) distribution is a power-law with index approximately -1; this is interpreted in terms of convolution of intrinsic, spatially varying P(logE) distributions with a probability function describing ISEE's residence time at a given DIFF. Wave data from this interval thus provide good observational evidence that SGT can sometimes explain the clumping, burstiness, persistence, and highly variable fields of the foreshock Langmuir-like waves.
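
    The core SGT prediction — that P(log E) is Gaussian in log E — is straightforward to test on a sample of field amplitudes; in this sketch, synthetic lognormal fields stand in for the ISEE wave data:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    E = rng.lognormal(mean=-2.0, sigma=1.0, size=5000)   # synthetic wave fields

    logE = np.log10(E)
    mu, sd = logE.mean(), logE.std(ddof=1)
    # SGT: P(log E) should be Gaussian; compare against the fitted normal
    print(stats.kstest(logE, "norm", args=(mu, sd)))
    ```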

  7. Behavior of 11 Foodborne Bacteria on Whole and Cut Mangoes var. Ataulfo and Kent and Antibacterial Activities of Hibiscus sabdariffa Extracts and Chemical Sanitizers Directly onto Mangoes Contaminated with Foodborne Bacteria.

    PubMed

    Rangel-Vargas, Esmeralda; Luna-Rojo, Anais M; Cadena-Ramírez, Arturo; Torres-Vitela, Refugio; Gómez-Aldapa, Carlos A; Villarruel-López, Angélica; Téllez-Jurado, Alejandro; Villagómez-Ibarra, José R; Reynoso-Camacho, Rosalía; Castro-Rosas, Javier

    2018-05-01

    The behavior of foodborne bacteria on whole and cut mangoes and the antibacterial effect of Hibiscus sabdariffa calyx extracts and chemical sanitizers against foodborne bacteria on contaminated mangoes were investigated. Mangoes var. Ataulfo and Kent were used in the study. Mangoes were inoculated with Listeria monocytogenes, Shigella flexneri, Salmonella Typhimurium, Salmonella Typhi, Salmonella Montevideo, Escherichia coli strains (O157:H7, non-O157:H7 Shiga toxin-producing, enteropathogenic, enterotoxigenic, enteroinvasive, and enteroaggregative). The antibacterial effect of five roselle calyx extracts (water, ethanol, methanol, acetone, and ethyl acetate), sodium hypochlorite, colloidal silver, and acetic acid against foodborne bacteria were evaluated on contaminated mangoes. The dry extracts obtained with ethanol, methanol, acetone, and ethyl acetate were analyzed by nuclear magnetic resonance spectroscopy to determine solvent residues. Separately, contaminated whole mangoes were immersed in five hibiscus extracts and in sanitizers for 5 min. All foodborne bacteria attached to mangoes. After 20 days at 25 ± 2°C, all foodborne bacterial strains on whole Ataulfo mangoes had decreased by approximately 2.5 log, and on Kent mangoes by approximately 2 log; at 3 ± 2°C, they had decreased to approximately 1.9 and 1.5 log, respectively, on Ataulfo and Kent. All foodborne bacterial strains grew on cut mangoes at 25 ± 2°C; however, at 3 ± 2°C, bacterial growth was inhibited. Residual solvents were not detected in any of the dry extracts by nuclear magnetic resonance. Acetonic, ethanolic, and methanolic roselle calyx extracts caused a greater reduction in concentration (2 to 2.6 log CFU/g) of all foodborne bacteria on contaminated whole mangoes than the sodium hypochlorite, colloidal silver, and acetic acid. Dry roselle calyx extracts may be a potentially useful addition to disinfection procedures of mangoes.

  8. Wealth and price distribution by diffusive approximation in a repeated prediction market

    NASA Astrophysics Data System (ADS)

    Bottazzi, Giulio; Giachini, Daniele

    2017-04-01

    The approximate agents' wealth and price invariant densities of a repeated prediction market model are derived using the Fokker-Planck equation of the associated continuous-time jump process. We show that the approximation obtained from the evolution of the log-wealth difference can be reliably exploited to compute all the quantities of interest in all the acceptable parameter space. When the risk aversion of the trader is high enough, we are able to derive an explicit closed-form solution for the price distribution which is asymptotically correct.

  9. The effect of pH and chloride concentration on the stability and antimicrobial activity of chlorine-based sanitizers.

    PubMed

    Waters, Brian W; Hung, Yen-Con

    2014-04-01

    Chlorinated water and electrolyzed oxidizing (EO) water solutions were made to compare the free chlorine stability and microbicidal efficacy of chlorine-containing solutions with different properties. Reduction of Escherichia coli O157:H7 was greatest in fresh samples (approximately 9.0 log CFU/mL reduction). Chlorine loss in "aged" samples (samples left in open bottles) was greatest (approximately 40 mg/L free chlorine loss in 24 h) at low pH (approximately 2.5) and high chloride (Cl−) concentrations (greater than 150 mg/L). Reduction of E. coli O157:H7 was also negatively impacted (<1.0 log CFU/mL reduction) in aged samples with a low pH and high Cl−. Higher pH values (approximately 6.0) did not appear to have a significant effect on free chlorine loss or on the numbers of surviving microbial cells when fresh and aged samples were compared. This study found that chloride levels in the chlorinated and EO water solutions had a reduced effect on both free chlorine stability and microbicidal efficacy in the low pH solutions. Greater concentrations of chloride in pH 2.5 samples resulted in decreased free chlorine stability and lower microbicidal efficacy.

  10. Radiotracer properties determined by high performance liquid chromatography: a potential tool for brain radiotracer discovery.

    PubMed

    Tavares, Adriana Alexandre S; Lewsey, James; Dewar, Deborah; Pimlott, Sally L

    2012-01-01

    Previously, development of novel brain radiotracers has largely relied on simple screening tools. Improved selection methods at the early stages of radiotracer discovery and an increased understanding of the relationships between in vitro physicochemical and in vivo radiotracer properties are needed. We investigated whether high performance liquid chromatography (HPLC) methodologies could provide criteria for lead candidate selection by comparing HPLC measurements with radiotracer properties in humans. Ten molecules, previously used as radiotracers in humans, were analysed to obtain the following measures: partition coefficient (Log P); permeability (Pm); percentage of plasma protein binding (%PPB); and membrane partition coefficient (Km). Relationships between brain entry measurements (Log P, Pm and %PPB) and in vivo brain percentage injected dose (%ID), and between Km and specific binding in vivo (BPND), were investigated. Log P values obtained using in silico packages and flask methods were compared with Log P values obtained using HPLC. The modelled associations with %ID were stronger for %PPB (r2 = 0.65) and Pm (r2 = 0.77) than for Log P (r2 = 0.47), while 86% of BPND variance was explained by Km. Log P values were variable depending on the methodology used. Log P should not be relied upon as a predictor of blood-brain barrier penetration during brain radiotracer discovery. HPLC measurements of permeability, %PPB and membrane interactions may be potentially useful in predicting in vivo performance and hence allow evaluation and ranking of compound libraries for the selection of lead radiotracer candidates at early stages of radiotracer discovery.

  11. Post-fire land management: Comparative effects of different strategies on hillslope sediment yield

    NASA Astrophysics Data System (ADS)

    Cole, R.; Bladon, K. D.; Wagenbrenner, J.; Coe, D. B. R.

    2017-12-01

    High-severity wildfire can increase erosion on burned, forested hillslopes. Salvage logging is a post-fire land management practice used to extract economic value from burned landscapes, reduce fuel loads, and improve forest safety. Few studies assess the impact of post-fire salvage logging or alternative land management approaches on erosion in forested landscapes, especially in California. In September 2015, the Valley Fire burned approximately 31,366 ha of forested land and wildland-urban interface in California's Northern Coast Range, including most of Boggs Mountain Demonstration State Forest. The primary objective of our study is to quantify erosion rates at the plot scale (~75 m2) for different post-fire land management practices, including mechanical logging and subsoiling (or ripping) after logging. We measured sediment yields using sediment fences in four sets of replicated plots. We also estimated ground cover in each plot using three randomly positioned 1-meter quadrats. We are also measuring rainfall near each plot to understand hydrologic factors that influence erosion. Preliminary results indicate that burned, unlogged reference plots yielded the most sediment over the winter rainy season (3.3 kg m-2). Sediment yields of burned and logged plots (0.9 kg m-2), and of burned, logged, and ripped plots (0.7 kg m-2), were substantially lower. Burned and unlogged reference plots had the least ground cover (49%), while ground cover was higher and more similar between logged (65%) and logged and ripped (72%) plots. These initial results contrast with previous studies, in which the effect of post-fire salvage logging ranged from no measured impact to increased sediment yield.

  12. Species characterization and responses of subcortical insects to trap-logs and ethanol in a hardwood biomass plantation: Subcortical insects in hardwood plantations

    DOE PAGES

    Coyle, David R.; Brissey, Courtney L.; Gandhi, Kamal J. K.

    2015-01-02

    1. We characterized subcortical insect assemblages in economically important eastern cottonwood (Populus deltoides Bartr.), sycamore (Platanus occidentalis L.) and sweetgum (Liquidambar styraciflua L.) plantations in the southeastern U.S.A. Furthermore, we compared insect responses between freshly-cut plant material by placing traps directly over cut hardwood logs (trap-logs), traps baited with ethanol lures and unbaited (control) traps. 2. We captured a total of 15 506 insects representing 127 species in four families in 2011 and 2013. Approximately 9% and 62% of total species and individuals, respectively, and 23% and 79% of total Scolytinae species and individuals, respectively, were non-native to North America. 3. We captured more Scolytinae using cottonwood trap-logs compared with control traps in both years, although this was the case with sycamore and sweetgum only in 2013. More woodborers were captured using cottonwood and sweetgum trap-logs compared with control traps in both years, although only with sycamore in 2013. 4. Ethanol was an effective lure for capturing non-native Scolytinae; however, not all non-native species were captured using ethanol lures. Ambrosiophilus atratus (Eichhoff) and Hypothenemus crudiae (Panzer) were captured with both trap-logs and control traps, whereas Coccotrypes distinctus (Motschulsky) and Xyleborus glabratus Eichhoff were only captured on trap-logs. 5. Indicator species analysis revealed that certain scolytines [e.g. Cnestus mutilatus (Blandford) and Xylosandrus crassiusculus (Motschulsky)] showed significant associations with trap-logs or ethanol baits in poplar or sweetgum trap-logs. In general, the species composition of subcortical insects, especially woodboring insects, was distinct among the three tree species and between those associated with trap-logs and control traps.

  14. Sampling-free Bayesian inversion with adaptive hierarchical tensor representations

    NASA Astrophysics Data System (ADS)

    Eigel, Martin; Marschall, Manuel; Schneider, Reinhold

    2018-03-01

    A sampling-free approach to Bayesian inversion with an explicit polynomial representation of the parameter densities is developed, based on an affine-parametric representation of a linear forward model. This becomes feasible due to the complete treatment in function spaces, which requires an efficient model reduction technique for numerical computations. The advocated perspective yields the crucial benefit that error bounds can be derived for all occurring approximations, leading to provable convergence with respect to the discretization parameters. Moreover, it enables a fully adaptive a posteriori control with automatic problem-dependent adjustments of the employed discretizations. The method is discussed in the context of modern hierarchical tensor representations, which are used for the evaluation of a random PDE (the forward model) and the subsequent high-dimensional quadrature of the log-likelihood, alleviating the ‘curse of dimensionality’. Numerical experiments demonstrate the performance and confirm the theoretical results.
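
    The quadrature idea at the core of sampling-free inversion can be illustrated on a deliberately tiny example. The sketch below evaluates a posterior for a one-parameter linear model on a Gauss-Legendre grid, with no sampling; the forward model, observation, and noise level are hypothetical placeholders, and nothing here reproduces the paper's adaptive hierarchical tensor machinery.

```python
import numpy as np

# Toy sampling-free Bayesian inversion via quadrature (illustrative only).
# Hypothetical linear forward model: y = G(theta) = 2.0 * theta
G = lambda theta: 2.0 * theta

y_obs = 1.3        # hypothetical observation
sigma = 0.1        # assumed noise standard deviation

# Gauss-Legendre quadrature nodes/weights on the prior support [-1, 1]
nodes, weights = np.polynomial.legendre.leggauss(64)

prior = 0.5 * np.ones_like(nodes)                  # uniform prior on [-1, 1]
loglik = -0.5 * ((y_obs - G(nodes)) / sigma) ** 2  # Gaussian log-likelihood
unnorm = prior * np.exp(loglik)

evidence = np.sum(weights * unnorm)                # normalizing constant
posterior = unnorm / evidence                      # density at the nodes

post_mean = np.sum(weights * nodes * posterior)
print(f"evidence ~ {evidence:.4e}, posterior mean ~ {post_mean:.4f}")
```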

  15. Effectiveness of irradiation treatments in inactivating Listeria monocytogenes on fresh vegetables at refrigeration temperature.

    PubMed

    Bari, M L; Nakauma, M; Todoriki, S; Juneja, Vijay K; Isshiki, K; Kawamoto, S

    2005-02-01

    Ionizing radiation can be effective in controlling the growth of food spoilage and foodborne pathogenic bacteria. This study reports on an investigation of the effectiveness of irradiation treatment to eliminate Listeria monocytogenes on laboratory-inoculated broccoli, cabbage, tomatoes, and mung bean sprouts. Irradiation of broccoli and mung bean sprouts at 1.0 kGy resulted in reductions of approximately 4.88 and 4.57 log CFU/g, respectively, of a five-strain cocktail of L. monocytogenes. Reductions of approximately 5.25 and 4.14 log CFU/g were found with cabbage and tomato, respectively, at a similar dose. The appearance, color, texture, taste, and overall acceptability did not undergo significant changes after 7 days of postirradiation storage at 4 degrees C, in comparison with control samples. Therefore, low-dose ionizing radiation treatment could be an effective method for eliminating L. monocytogenes on fresh and fresh-cut produce.
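
    The reported reductions at a fixed 1.0 kGy dose imply a dose per decimal reduction (a D10 value) for each commodity; the following back-of-envelope sketch makes that arithmetic explicit, using only the numbers quoted above.

```python
# D10 values (dose per 1-log reduction) implied by the reductions reported
# above at a 1.0 kGy dose. Back-of-envelope arithmetic only.
dose_kgy = 1.0
reductions = {"broccoli": 4.88, "mung bean sprouts": 4.57,
              "cabbage": 5.25, "tomato": 4.14}  # log CFU/g, from the abstract

for produce, log_red in reductions.items():
    d10 = dose_kgy / log_red  # kGy needed per decimal (10-fold) reduction
    print(f"{produce}: D10 ~ {d10:.2f} kGy")
```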

  16. Disinfection of low quality wastewaters by ultraviolet irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zukovs, G.; Kollar, J.; Monteith, H.D.

    1986-03-01

    Pilot-scale disinfection of simulated combined sewer overflow (CSO) by ultraviolet (UV) light was compared with high-rate chlorination. Disinfection efficiency was evaluated over a range of dosages and contact times for fecal coliforms, enterococci, P. aeruginosa, and Salmonella spp. Fecal coliforms were reduced 3.0 to 3.2 logs at a UV dose of approximately 350,000 μW·s/cm². High-rate chlorination, at a contact time of 2.0 minutes and a total residual chlorine concentration of approximately 25 mg/L (as Cl₂), reduced fecal coliforms by 4.0 logs. Pathogens were reduced to detection limits by both processes. Neither photoreactivation nor regrowth occurred in the disinfected effluents. The estimated capital costs of CSO disinfection by UV irradiation were consistently higher than for chlorination/dechlorination; operation and maintenance costs were similar.

  17. Changes in Florida's industrial roundwood products output, 1977-1987

    Treesearch

    Edgar L. Davenport; John B. Tansey

    1990-01-01

    Nearly 480 million cubic feet of industrial roundwood products were harvested from Florida's forests during 1987, 48 percent more than in 1977. Saw logs and pulpwood were the leading roundwood products, with pulpwood accounting for 60 percent and saw logs for 30 percent of the 1987 total output. Output of all major industrial roundwood products increased between...

  18. Industrial concessions, fires and air pollution in Equatorial Asia

    NASA Astrophysics Data System (ADS)

    Spracklen, D. V.; Reddington, C. L.; Gaveau, D. L. A.

    2015-09-01

    Forest and peatland fires in Indonesia emit large quantities of smoke leading to poor air quality across Equatorial Asia. Marlier et al (2015 Environ. Res. Lett. 10 085005) explore the contribution of fires occurring on oil palm, timber (wood pulp and paper) and natural forest logging concessions to smoke emissions and exposure of human populations to the resulting air pollution. They find that one third of the population exposure to smoke across Equatorial Asia is caused by fires in oil palm and timber concessions in Sumatra and Kalimantan. Logging concessions have substantially lower fire emissions, and contribute less to air quality degradation. This represents a compelling justification to prevent reclassification of logging concessions into oil palm or timber concessions after logging. This can be achieved by including logged forests in the Indonesian moratorium on new plantations in forested areas.

  19. BRIEF REPORT: A simple interpolation formula for the spectra of power-law and log potentials

    NASA Astrophysics Data System (ADS)

    Hall, Richard L.

    2000-06-01

    Non-relativistic potential models are considered of the pure power V(r) = sgn(q) r^q and logarithmic V(r) = ln(r) types. It is shown that, from the spectral viewpoint, these potentials are actually in a single family. The log spectra can be obtained from the power spectra by the limit q→0 taken in a smooth representation P_nl(q) for the eigenvalues E_nl(q). A simple approximation formula is developed which yields the first 30 eigenvalues with an error less than 0.04%.
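
    The elementary limit behind treating the log potential as the q → 0 member of the power family can be checked numerically: (r^q − 1)/q → ln r as q → 0. The paper's smooth representation P_nl(q) is a spectral analogue of this construction; the sketch below only illustrates the underlying limit.

```python
import numpy as np

# Numerical check of the identity lim_{q->0} (r**q - 1)/q = ln(r),
# the elementary limit behind viewing ln(r) as the q -> 0 member of the
# power-law family sgn(q) r**q. (The paper's spectral representation
# P_nl(q) is an analogous smooth-in-q construction, not reproduced here.)
r = np.array([0.5, 1.0, 2.0, 5.0])
for q in [0.5, 0.1, 0.01, 0.001]:
    approx = (r**q - 1.0) / q
    print(f"q={q:6}: {approx.round(4)}  vs  ln(r) = {np.log(r).round(4)}")
```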

  20. Calibration Tests of a German Log Rodmeter

    NASA Technical Reports Server (NTRS)

    Mottard, Elmo J.; Stillman, Everette R.

    1949-01-01

    A German log rodmeter of the pitot static type was calibrated in Langley tank no. 1 at speeds up to 34 knots and angles of yaw from 0 deg to plus or minus 10 3/4 degrees. The dynamic head approximated the theoretical head at 0 degrees yaw but decreased as the yaw was increased. The static head was negative and in general became more negative with increasing speed and yaw. Cavitation occurred at speeds above 31 knots at 0 deg yaw and 21 knots at 10 3/4 deg yaw.

  1. Emission Measure Distribution and Heating of Two Active Region Cores

    NASA Technical Reports Server (NTRS)

    Tripathi, Durgesh; Klimchuk, James A.; Mason, Helen E.

    2011-01-01

    Using data from the Extreme-ultraviolet Imaging Spectrometer aboard Hinode, we have studied the coronal plasma in the core of two active regions. Concentrating on the area between opposite polarity moss, we found emission measure distributions having an approximate power-law form EM ∝ T^2.4 from log T = 5.55 up to a peak at log T = 6.57. The observations are explained extremely well by a simple nanoflare model. However, in the absence of additional constraints, the observations could possibly also be explained by steady heating.

  2. Meta-analysis of Odds Ratios: Current Good Practices

    PubMed Central

    Chang, Bei-Hung; Hoaglin, David C.

    2016-01-01

    Background: Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective: We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more reliable results. Methods: We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results: In the example, point estimates and confidence intervals for the overall log odds ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions: The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued. PMID:28169977
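
    For concreteness, the conventional approach the authors critique, an inverse-variance weighted average of per-trial log odds ratios with the Woolf variance approximation, can be written in a few lines; the 2x2 counts below are hypothetical, not the sclerotherapy data.

```python
import numpy as np

# Conventional inverse-variance meta-analysis of log odds ratios -- the
# approximate approach this abstract argues against -- shown for
# concreteness. The 2x2 counts below are hypothetical.
# Each row: (events_treatment, n_treatment, events_control, n_control)
trials = np.array([
    [12, 100, 20, 100],
    [ 5,  60,  9,  58],
    [30, 150, 41, 152],
], dtype=float)

a, n1, c, n2 = trials.T
b, d = n1 - a, n2 - c

log_or = np.log((a * d) / (b * c))       # per-trial log odds ratio
var = 1/a + 1/b + 1/c + 1/d              # Woolf variance approximation
w = 1.0 / var                            # inverse-variance weights

pooled = np.sum(w * log_or) / np.sum(w)  # fixed-effect pooled log OR
se = np.sqrt(1.0 / np.sum(w))
ci = pooled + np.array([-1.96, 1.96]) * se

print(f"pooled OR = {np.exp(pooled):.3f}, 95% CI {np.exp(ci).round(3)}")
```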

  3. Growth and survival of Salmonella in ground black pepper (Piper nigrum).

    PubMed

    Keller, Susanne E; VanDoren, Jane M; Grasso, Elizabeth M; Halik, Lindsay A

    2013-05-01

    A four serovar cocktail of Salmonella was inoculated into ground black pepper (Piper nigrum) at different water activity (aw) levels at a starting level of 4-5 log cfu/g and incubated at 25 and at 35 °C. At 35 °C and aw of 0.9886 ± 0.0006, the generation time in ground black pepper was 31 ± 3 min with a lag time of 4 ± 1 h. Growth at 25 °C had a longer lag, but generation time was not statistically different from growth at 35 °C. The aw threshold for growth was determined to be 0.9793 ± 0.0027 at 35 °C. To determine survival during storage conditions, ground black pepper was inoculated at approximately 8 log cfu/g and stored at 25 and 35 °C at high (97% RH) and ambient (≤40% RH) humidity. At high relative humidity, aw increased to approximately 0.8-0.9 after approximately 20 days at both temperatures and no Salmonella was detected after 100 and 45 days at 25 and 35 °C, respectively. Under ambient humidity, populations showed an initial decrease of 3-4 log cfu/g, then remained stable for over 8 months at 25 and 35 °C. Results of this study indicate Salmonella can readily grow at permissive aw in ground black pepper and may persist for an extended period of time under typical storage conditions. Published by Elsevier Ltd.
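
    With the reported lag (~4 h) and generation time (~31 min), log-phase growth follows N(t) = N0 · 2^((t − lag)/g); the short projection below applies exactly that arithmetic and ignores the stationary phase.

```python
import numpy as np

# Projected Salmonella growth in pepper at 35 C and high water activity,
# using the generation time (~31 min) and lag (~4 h) reported above.
# Simple exponential-growth arithmetic; ignores the stationary phase.
n0_log = 4.5          # starting level, log CFU/g (mid-range of 4-5)
lag_h = 4.0           # lag time, hours
gen_h = 31.0 / 60.0   # generation time, hours

def log_cfu(t_h):
    growth = max(t_h - lag_h, 0.0) / gen_h   # doublings after the lag
    return n0_log + growth * np.log10(2.0)

for t in [4, 8, 12]:
    print(f"t = {t:>2} h: ~{log_cfu(t):.1f} log CFU/g")
```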

  4. Dynamic predictive model for the growth of Salmonella spp. in liquid whole egg.

    PubMed

    Singh, Aikansh; Korasapati, Nageswara R; Juneja, Vijay K; Subbiah, Jeyamkondan; Froning, Glenn; Thippareddi, Harshavardhan

    2011-04-01

    A dynamic model for the growth of Salmonella spp. in liquid whole egg (LWE) (approximately pH 7.8) under continuously varying temperature was developed. The model was validated using two sinusoidal, continuously varying temperature profiles (5 to 15 °C over 600 h and 10 to 40 °C over 52 h). LWE adjusted to pH 7.8 was inoculated with approximately 2.5-3.0 log CFU/mL of Salmonella spp., and growth data were collected at several isothermal conditions (5, 7, 10, 15, 20, 25, 30, 35, 37, 39, 41, 43, 45, and 47 °C). A primary model (Baranyi model) was fitted to the growth data at each temperature and the corresponding maximum growth rates were estimated. Pseudo-R2 values were greater than 0.97 for the primary models. A modified Ratkowsky model was used to fit the secondary model. The pseudo-R2 and root mean square error were 0.99 and 0.06 log CFU/mL, respectively, for the secondary model. A dynamic model for the prediction of Salmonella spp. growth under varying temperature conditions was developed using the 4th-order Runge-Kutta method. The developed dynamic model was validated for the two sinusoidal temperature profiles, 5 to 15 °C (for 600 h) and 10 to 40 °C (for 52 h), with corresponding root mean squared error values of 0.28 and 0.23 log CFU/mL, respectively, between predicted and observed Salmonella spp. populations. The developed dynamic model can be used to predict the growth of Salmonella spp. in LWE under varying temperature conditions. Liquid egg and egg products are widely used in food processing and in restaurant operations. These products can be contaminated with Salmonella spp. during breaking and other unit operations during processing. The raw, liquid egg products are stored under refrigeration prior to pasteurization. However, process deviations can occur, such as refrigeration failure, leading to temperature fluctuations above the required temperatures as specified in the critical limits within hazard analysis and critical control point plans for the operations. The processors are required to evaluate the potential growth of Salmonella spp. in such products before the product can be used or further processed. Dynamic predictive models are excellent tools for regulators as well as processing plant personnel to evaluate the microbiological safety of the product under such conditions.
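
    The general recipe, a square-root (Ratkowsky-type) secondary model feeding a growth ODE integrated with classical 4th-order Runge-Kutta under a time-varying temperature, can be sketched compactly. The parameters below are hypothetical rather than the paper's fitted values, and a logistic primary model stands in for the Baranyi model.

```python
import numpy as np

# Minimal sketch of a dynamic growth prediction: a Ratkowsky-type
# square-root secondary model drives a logistic growth ODE, integrated
# with classical 4th-order Runge-Kutta under a sinusoidal temperature
# profile. Parameters are hypothetical, not the paper's fitted values,
# and the primary model here is logistic rather than the Baranyi model.
b, Tmin = 0.035, 4.0          # hypothetical Ratkowsky parameters
Nmax = 9.0                    # hypothetical maximum density, log CFU/mL

def mu_max(T):                # specific growth rate, ln units per hour
    return (b * max(T - Tmin, 0.0)) ** 2

def temperature(t):           # sinusoidal profile, 10-40 C over 52 h
    return 25.0 + 15.0 * np.sin(2.0 * np.pi * t / 52.0)

def dNdt(t, N):               # logistic growth in log10 CFU/mL
    return mu_max(temperature(t)) / np.log(10) * (1.0 - N / Nmax)

t, N, dt = 0.0, 2.75, 0.1     # start at ~2.5-3.0 log CFU/mL
while t < 52.0:
    k1 = dNdt(t, N)
    k2 = dNdt(t + dt/2, N + dt*k1/2)
    k3 = dNdt(t + dt/2, N + dt*k2/2)
    k4 = dNdt(t + dt, N + dt*k3)
    N += dt * (k1 + 2*k2 + 2*k3 + k4) / 6.0
    t += dt
print(f"predicted level after 52 h: ~{N:.1f} log CFU/mL")
```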

  5. Inactivation of Salmonella Enteritidis on lettuces used by minimally processed vegetable industries.

    PubMed

    Silveira, Josete Bailardi; Hessel, Claudia Titze; Tondo, Eduardo Cesar

    2017-01-30

    Washing and disinfection methods used by minimally processed vegetable industries of Southern Brazil were reproduced in the laboratory in order to verify their effectiveness in reducing Salmonella Enteritidis SE86 (SE86) on lettuce. Among the five industries investigated, four carried out washing with potable water followed by disinfection with 200 ppm sodium hypochlorite with different immersion times. The washing procedure alone decreased the SE86 population by approximately 1 log CFU/g, and immersion times of 1, 2, 5, and 15 minutes in disinfectant solution demonstrated reduction rates ranging from 2.06±0.10 log CFU/g to 3.01±0.21 log CFU/g. Rinsing alone was able to reduce counts from 0.12±0.63 log CFU/g to 1.90±1.07 log CFU/g. The most effective method was washing followed by disinfection with 200 ppm sodium hypochlorite for 15 minutes and a final rinse with potable water, reaching 5.83 log CFU/g of reduction. However, no statistical differences were observed in the reduction rates after different immersion times. A time interval of 1 to 2 minutes may be an advantage to the minimally processed vegetable industries in order to optimize the process without putting food safety at risk.

  6. Use of polynomial expressions to describe the bioconcentration of hydrophobic chemicals by fish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connell, D.W.; Hawker, D.W.

    1988-12-01

    For the bioconcentration of hydrophobic chemicals by fish, relationships have been previously established between uptake rate constants (k1) and the octanol/water partition coefficient (Kow), and also between the clearance rate constant (k2) and Kow. These have been refined and extended on the basis of data for chlorinated hydrocarbons, and closely related compounds including polychlorinated dibenzodioxins, that covered a wider range of hydrophobicity (2.5 < log Kow < 9.5). This has allowed the development of new relationships between log Kow and various factors, including the bioconcentration factor (as log KB), equilibrium time (as log teq), and maximum biotic concentration (as log CB), which include extremely hydrophobic compounds previously not taken into account. The shapes of the curves generated by these equations are in qualitative agreement with theoretical prediction and are described by polynomial expressions which are generally approximately linear over the more limited range of log Kow values used to develop previous relationships. The influences of factors such as hydrophobicity, aqueous solubility, molecular weight, lipid solubility, and also exposure time were considered. Decreasing lipid solubilities of extremely hydrophobic chemicals were found to result in increasing clearance rate constants, as well as decreasing equilibrium times and bioconcentration factors.
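
    The polynomial-expression approach can be illustrated with a least-squares fit in log Kow; the (log Kow, log KB) pairs below are hypothetical placeholders chosen only to reproduce the qualitative downturn at extreme hydrophobicity described above.

```python
import numpy as np

# Fitting a polynomial in log Kow to bioconcentration data, as the
# abstract describes. The (log Kow, log KB) pairs below are hypothetical
# placeholders, not the study's measurements.
log_kow = np.array([2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5])
log_kb  = np.array([1.8, 2.9, 3.9, 4.6, 5.0, 4.8, 4.1, 3.0])

coeffs = np.polyfit(log_kow, log_kb, deg=2)  # quadratic captures the downturn
fit = np.poly1d(coeffs)

print("fitted polynomial:", fit)
print("predicted log KB at log Kow = 6.0:", round(fit(6.0), 2))
```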

  7. Speed, spatial, and temporal tuning of rod and cone vision in mouse.

    PubMed

    Umino, Yumiko; Solessio, Eduardo; Barlow, Robert B

    2008-01-02

    Rods and cones subserve mouse vision over a 100 million-fold range of light intensity (-6 to 2 log cd m^-2). Rod pathways tune vision to the temporal frequency of stimuli (peak, 0.75 Hz) and cone pathways to their speed (peak, approximately 12 degrees/s). Both pathways tune vision to the spatial components of stimuli (0.064-0.128 cycles/degree). The specific photoreceptor contributions were determined by two-alternative, forced-choice measures of contrast thresholds for optomotor responses of C57BL/6J mice with normal vision, Gnat2(cpfl3) mice without functional cones, and Gnat1-/- mice without functional rods. Gnat2(cpfl3) mice (threshold, -6.0 log cd m^-2) cannot see rotating gratings above -2.0 log cd m^-2 (photopic vision), and Gnat1-/- mice (threshold, -4.0 log cd m^-2) are blind below -4.0 log cd m^-2 (scotopic vision). Both genotypes can see in the transitional mesopic range (-4.0 to -2.0 log cd m^-2). Mouse rod and cone sensitivities are similar to those of human. This parametric study characterizes the functional properties of the mouse visual system, revealing the rod and cone contributions to contrast sensitivity and to the temporal processing of visual stimuli.

  8. Photometric analysis of the open cluster NGC 6611

    NASA Astrophysics Data System (ADS)

    Suarez Nunez, Johanna

    2007-08-01

    Matlab programs were designed to apply differential aperture photometry. Two images were taken with a charge-coupled device (CCD) in the visible (V) and blue (B) filters to calculate physical parameters of each studied star belonging to the open cluster NGC 6611: the flux f, the apparent magnitude m_V and its reddening-corrected value V_0, the color indices (B-V) and (B-V)_0, the log of the effective temperature (log T_eff), the absolute magnitude M_V, the bolometric magnitude M_B, and log(L*/L_sun). Upon obtaining the parameters, the color-magnitude diagram was graphed and, by fitting to the main sequence, the distance modulus and thus the distance to the cluster were found. The stars were assumed to be at the same distance and born at approximately the same time.
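
    The final step rests on the distance-modulus relation m − M = 5 log10(d / 10 pc); a short check with an assumed modulus:

```python
# Distance from a fitted distance modulus via mu = m - M = 5*log10(d/10pc).
# The modulus value below is an assumed placeholder, not the study's result.
def distance_pc(mu):
    return 10.0 ** (mu / 5.0 + 1.0)

mu = 11.5   # hypothetical distance modulus, mag
print(f"mu = {mu} mag  ->  d ~ {distance_pc(mu):.0f} pc")
```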

  9. Micromechanical and Electrical Properties of Monolithic Aluminum Nitride at High Temperatures

    NASA Technical Reports Server (NTRS)

    Goldsby, Jon C.

    2000-01-01

    Micromechanical spectroscopy of aluminum nitride reveals it to possess extremely low background internal friction at less than 1 x 10(exp -4) logarithmic decrement (log dec) from 20 to 1200 C. Two mechanical loss peaks were observed, the first at 350 C approximating a single Debye peak with a peak height of 60 x 10(exp -4) log dec. The second peak was seen at 950 C with a peak height of 20 x 10(exp -4) log dec and extended from 200 to over 1200 C. These micromechanical observations manifested themselves in the electrical behavior of these materials. Electrical conduction processes were predominately intrinsic. Both mechanical and electrical relaxations appear to be thermally activated processes, with activation energies of 0.78 and 1.32 eV, respectively.

  10. Micromechanical and Electrical Properties of Monolithic Aluminum Nitride at High Temperatures

    NASA Technical Reports Server (NTRS)

    Goldsby, Jon C.

    2001-01-01

    Micromechanical spectroscopy of aluminum nitride reveals it to possess extremely low background internal friction at less than 1 x 10 (exp -4) logarithmic decrement (log dec.) from 20 to 1200 C. Two mechanical loss peaks were observed, the first at 350 C approximating a single Debye peak with a peak height of 60 x 10 (exp -4) log dec. The second peak was seen at 950 C with a peak height of 20 x 10 (exp -4) log dec. and extended from 200 to over 1200 C. These micromechanical observations manifested themselves in the electrical behavior of these materials. Electrical conduction processes were predominately intrinsic. Both mechanical and electrical relaxations appear to be thermally activated processes, with activation energies of 0.78 and 1.32 eV respectively.
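
    Loss peaks of this kind are conventionally modeled as a Debye peak with a thermally activated (Arrhenius) relaxation time. The sketch below uses the 0.78 eV activation energy quoted above, while the attempt time, measurement frequency, and relaxation strength are hypothetical placeholders.

```python
import numpy as np

# A single Debye internal-friction peak with a thermally activated
# (Arrhenius) relaxation time, the standard model for anelastic loss
# peaks like those described above. The 0.78 eV activation energy is
# taken from the abstract; tau0, frequency, and relaxation strength
# are hypothetical placeholders.
kB = 8.617e-5            # Boltzmann constant, eV/K
E = 0.78                 # activation energy, eV (from the abstract)
tau0 = 1e-13             # hypothetical attempt time, s
f = 1.0                  # hypothetical measurement frequency, Hz
delta = 1.2e-2           # hypothetical relaxation strength

T = np.linspace(300, 900, 7)           # temperature, K
tau = tau0 * np.exp(E / (kB * T))      # Arrhenius relaxation time
wt = 2 * np.pi * f * tau
q_inv = delta * wt / (1.0 + wt**2)     # Debye peak in internal friction

for Ti, qi in zip(T, q_inv):
    print(f"T = {Ti:5.0f} K  ->  Q^-1 ~ {qi:.2e}")
```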

  11. Abundance and Morphological Effects of Large Woody Debris in Forested Basins of Southern Andes

    NASA Astrophysics Data System (ADS)

    Andreoli, A.; Comiti, F.; Lenzi, M. A.

    2006-12-01

    The Southern Andes mountain range represents an ideal location for studying large woody debris (LWD) in streams draining forested basins, thanks to the presence of both pristine and managed woodland and to the generally low level of human alteration of stream corridors. However, no published investigations have been performed so far in such a large region. The investigated sites of this research are three basins (9-13 km2 drainage area, third-order channels) covered by Nothofagus forests: two of them are located in the Southern Chilean Andes (the Tres Arroyos in the Malalcahuello National Reserve and the Rio Toro within the Malleco Natural Reserve) and one basin lies in the Argentinean Tierra del Fuego (the Buena Esperanza basin, near the city of Ushuaia). Measured LWD were all wood pieces larger than 10 cm in diameter and 1 m in length, both in the active channel and in the adjacent active floodplain. Pieces forming log jams were all measured and the geometrical dimensions of jams were taken. Jam type was defined based on the Abbe and Montgomery (2003) classification. Sediment stored behind log-steps and valley jams was estimated by approximating the accumulated sediment as a solid wedge whose geometrical dimensions were measured. Additional information relative to each LWD piece was recorded during the field survey: type (log, rootwad, log with rootwad attached), orientation to flow, origin (floated, bank erosion, landslide, natural mortality, harvest residuals) and position (log-step, in-channel, channel-bridging, channel margins, bankfull edge). In the Tres Arroyos, the average LWD volume stored within the bankfull channel is 710 m3 ha-1. The average number of pieces is 1,004 per hectare of bankfull channel area. Log-steps represent about 22% of all steps, and the elevation loss due to LWD (log-steps and valley jams) amounts to 27% of the total stream potential energy. About 1,600 m3 of sediment (assuming a porosity of 20%) is stored in the main channel behind LWD structures, i.e. approximately 1,000 m3 per km of channel length, corresponding to approximately 150% of the annual sediment yield. In the Rio Toro, the average LWD volume and number of elements stored are much lower, 117 m3 ha-1 and 215 pieces ha-1, respectively. Neither log-steps nor valley jams were observed, the longitudinal profile appears unaffected by LWD, and no sediment storage can be attributed to woody debris. The low LWD storage and impact in this channel is likely due to the general stability of its hillslopes, in contrast to the Tres Arroyos, where extensive landslides and debris flows convey a great deal of wood into the stream. Finally, in the Buena Esperanza, the average LWD volume stored in the active channel is quite low (120 m3 ha-1), but the average number of pieces is the highest, with 1,397 pieces ha-1. This is due to the smaller dimensions of LWD elements delivered by trees growing in a colder climate such as that of Tierra del Fuego. The morphological influence of wood in this channel is nevertheless very important, with the presence of large valley jams and high log-steps imparting a macro-scale stepped profile to the channel, with a total energy dissipation due to LWD (log-steps and valley jams) of about 24% of the stream potential energy. The sediment stored behind log-steps and valley jams amounts to about 1,290 m3, i.e. 700 m3 km-1, but unfortunately no values of sediment yield are available for this basin.

  12. Interactions among Amazon land use, forests and climate: prospects for a near-term forest tipping point.

    PubMed

    Nepstad, Daniel C; Stickler, Claudia M; Soares-Filho, Britaldo; Merry, Frank

    2008-05-27

    Some model experiments predict a large-scale substitution of Amazon forest by savannah-like vegetation by the end of the twenty-first century. Expanding global demands for biofuels and grains, positive feedbacks in the Amazon forest fire regime and drought may drive a faster process of forest degradation that could lead to a near-term forest dieback. Rising worldwide demands for biofuel and meat are creating powerful new incentives for agro-industrial expansion into Amazon forest regions. Forest fires, drought and logging increase susceptibility to further burning while deforestation and smoke can inhibit rainfall, exacerbating fire risk. If sea surface temperature anomalies (such as El Niño episodes) and associated Amazon droughts of the last decade continue into the future, approximately 55% of the forests of the Amazon will be cleared, logged, damaged by drought or burned over the next 20 years, emitting 15-26 Pg of carbon to the atmosphere. Several important trends could prevent a near-term dieback. As fire-sensitive investments accumulate in the landscape, property holders use less fire and invest more in fire control. Commodity markets are demanding higher environmental performance from farmers and cattle ranchers. Protected areas have been established in the pathway of expanding agricultural frontiers. Finally, emerging carbon market incentives for reductions in deforestation could support these trends.

  13. A stable computation of log-derivatives from noisy drawdown data

    NASA Astrophysics Data System (ADS)

    Ramos, Gustavo; Carrera, Jesus; Gómez, Susana; Minutti, Carlos; Camacho, Rodolfo

    2017-09-01

    Pumping test interpretation is an art that involves dealing with noise coming from multiple sources and conceptual model uncertainty. Interpretation is greatly helped by diagnostic plots, which include drawdown data and their derivative with respect to log-time, called the log-derivative. Log-derivatives are especially useful to complement geological understanding in helping to identify the underlying model of fluid flow because they are sensitive to subtle variations in the response to pumping of aquifers and oil reservoirs. The main problem with their use lies in the calculation of the log-derivatives themselves, which may display fluctuations when data are noisy. To overcome this difficulty, we propose a variational regularization approach based on the minimization of a functional consisting of two terms: one ensuring that the computed log-derivatives honor measurements and one that penalizes fluctuations. The minimization leads to a diffusion-like differential equation in the log-derivatives, and boundary conditions that are appropriate for well hydraulics (i.e., radial flow, wellbore storage, fractal behavior, etc.). We have solved this equation by finite differences. We tested the methodology on two synthetic examples showing that a robust solution is obtained. We also report the resulting log-derivative for a real case.
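
    A stripped-down analogue of the variational idea, penalized least squares on a raw finite-difference log-derivative, reduces to a single linear solve; the paper's actual formulation (a diffusion-like equation with well-hydraulics boundary conditions) is richer than this sketch.

```python
import numpy as np

# Simplified analogue of the variational log-derivative computation:
# given noisy drawdown s(log t), minimize ||y - x||^2 + lam*||D2 x||^2
# over smoothed derivative values x, where y is a raw finite-difference
# log-derivative and D2 penalizes curvature. The paper solves a richer
# diffusion-like equation with well-hydraulics boundary conditions;
# this is only the core regularization idea.
rng = np.random.default_rng(0)
logt = np.linspace(0, 3, 200)                            # log10 time
s = 1.15 * logt + 0.3 + rng.normal(0, 0.05, logt.size)   # synthetic drawdown

y = np.gradient(s, logt)                 # noisy raw log-derivative

n, lam = y.size, 50.0
D2 = np.zeros((n - 2, n))                # second-difference operator
for i in range(n - 2):
    D2[i, i:i + 3] = [1.0, -2.0, 1.0]

# Normal equations: (I + lam * D2^T D2) x = y
A = np.eye(n) + lam * D2.T @ D2
x = np.linalg.solve(A, y)

print("raw derivative std :", y.std().round(3))
print("smoothed derivative:", x.mean().round(3), "+/-", x.std().round(3))
```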

  14. The Suzaku View of Highly Ionized Outflows in AGN. 1; Statistical Detection and Global Absorber Properties

    NASA Technical Reports Server (NTRS)

    Gofford, Jason; Reeves, James N.; Tombesi, Francesco; Braito, Valentina; Turner, T. Jane; Miller, Lance; Cappi, Massimo

    2013-01-01

    We present the results of a new spectroscopic study of Fe K-band absorption in active galactic nuclei (AGN). Using data obtained from the Suzaku public archive we have performed a statistically driven blind search for Fe XXV He-alpha and/or Fe XXVI Ly-alpha absorption lines in a large sample of 51 Type 1.0-1.9 AGN. Through extensive Monte Carlo simulations we find that statistically significant absorption is detected at E greater than or approximately equal to 6.7 keV in 20/51 sources at the P(sub MC) greater than or equal to 95 per cent level, which corresponds to approximately 40 per cent of the total sample. In all cases, individual absorption lines are detected independently and simultaneously amongst the two (or three) available X-ray imaging spectrometer detectors, which confirms the robustness of the line detections. The most frequently observed outflow phenomenology consists of two discrete absorption troughs corresponding to Fe XXV He-alpha and Fe XXVI Ly-alpha at a common velocity shift. From xstar fitting, the mean column density and ionization parameter for the Fe K absorption components are log (N(sub H)/cm^-2) approximately equal to 23 and log (xi/erg cm s^-1) approximately equal to 4.5, respectively. Measured outflow velocities span a continuous range from less than 1500 kilometers per second up to approximately 100,000 kilometers per second, with mean and median values of approximately 0.1c and approximately 0.056c, respectively. The results of this work are consistent with those recently obtained using XMM-Newton and independently provide strong evidence for the existence of very highly ionized circumnuclear material in a significant fraction of both radio-quiet and radio-loud AGN in the local universe.

  15. Nested taxa-area curves for eastern United States floras

    USGS Publications Warehouse

    Bennett, J.P.

    1997-01-01

    The slopes of log-log species-area curves have been studied extensively and found to be influenced by the range of areas under study. Two such studies of eastern United States floras have yielded species-area curve slopes which differ by more than 100%: 0.251 and 0.113. The first slope may be too steep because the flora of the world was included, and both may be too steep because noncontiguous areas were used. These two hypotheses were tested using a set of nested floras centered in Ohio and continuing up to the flora of the world. The results suggest that this set of eastern United States floras produces a log-log species-area curve with a slope of approximately 0.20 with the flora of the world excluded, and regardless of whether or not the floras are from nested areas. Genera- and family-area curves are less steep than species-area curves and show similar patterns. Taxa ratio curves also increase with area, with the species/family ratio showing the steepest slope.
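
    The slope in question is the exponent z of S = cA^z, estimated by linear regression in log-log space; the nested-flora values below are hypothetical placeholders.

```python
import numpy as np

# Estimating the species-area slope z in S = c * A**z by linear
# regression in log-log space. The (area, species) pairs below are
# hypothetical nested-flora values, not the study's data.
area_km2 = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7])
species  = np.array([450, 700, 1100, 1750, 2700, 4300])

z, log_c = np.polyfit(np.log10(area_km2), np.log10(species), 1)
print(f"slope z ~ {z:.3f}, intercept c ~ {10**log_c:.1f}")
```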

  16. Entropy of orthogonal polynomials with Freud weights and information entropies of the harmonic oscillator potential

    NASA Astrophysics Data System (ADS)

    Van Assche, W.; Yáñez, R. J.; Dehesa, J. S.

    1995-08-01

    The information entropy of the harmonic oscillator potential V(x) = (1/2)λx² in both position and momentum spaces can be expressed in terms of the so-called "entropy of Hermite polynomials," i.e., the quantity S_n(H) := -∫_{-∞}^{+∞} H_n²(x) log H_n²(x) e^{-x²} dx. These polynomials are instances of the polynomials orthogonal with respect to the Freud weights w(x) = exp(-|x|^m), m > 0. Here, a very precise and general result on the entropy of Freud polynomials recently established by Aptekarev et al. [J. Math. Phys. 35, 4423-4428 (1994)], specialized to the Hermite kernel (case m = 2), leads to an important refined asymptotic expression for the information entropies of very excited states (i.e., for large n) in both position and momentum spaces, to be denoted by S_ρ and S_γ, respectively. Briefly, it is shown that, for large values of n, S_ρ + (1/2) log λ ≈ log(π√(2n)/e) + o(1) and S_γ - (1/2) log λ ≈ log(π√(2n)/e) + o(1), so that S_ρ + S_γ ≈ log(2π²n/e²) + o(1), in agreement with the generalized indetermination relation of Bialynicki-Birula and Mycielski [Commun. Math. Phys. 44, 129-132 (1975)]. Finally, the rate of convergence of these two information entropies is numerically analyzed. In addition, using a Rakhmanov result, we describe a totally new proof of the leading term of the entropy of Freud polynomials which, naturally, is just a weak version of the aforementioned general result.
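
    The position-space asymptotics can be verified numerically for λ = 1 by computing the entropy of the n-th oscillator eigenstate density by quadrature; the sketch below is an illustrative check only, and the o(1) correction makes convergence in n slow.

```python
import numpy as np
from scipy.special import eval_hermite, gammaln
from scipy.integrate import trapezoid

# Quadrature check of the position-space entropy asymptotics for lambda = 1:
# the density of the n-th oscillator eigenstate is
# rho(x) = H_n(x)^2 exp(-x^2) / (2^n n! sqrt(pi)), and S_rho = -int rho ln rho.
def position_entropy(n):
    L = np.sqrt(2.0 * n + 1.0) + 8.0   # integrate a bit past the turning point
    x = np.linspace(-L, L, 200001)
    log_norm = n * np.log(2.0) + gammaln(n + 1) + 0.5 * np.log(np.pi)
    H = eval_hermite(n, x)
    with np.errstate(divide="ignore", invalid="ignore"):
        log_rho = 2.0 * np.log(np.abs(H)) - x * x - log_norm  # work in log space
        rho = np.exp(log_rho)
        integrand = np.where(rho > 0.0, -rho * log_rho, 0.0)  # 0*log(0) -> 0
    return trapezoid(integrand, x)

for n in [10, 20, 50]:
    asym = np.log(np.pi * np.sqrt(2.0 * n) / np.e)
    print(f"n = {n:2d}: numeric {position_entropy(n):.4f}, asymptotic {asym:.4f}")
```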

  17. A new method for correlation analysis of compositional (environmental) data - a worked example.

    PubMed

    Reimann, C; Filzmoser, P; Hron, K; Kynčlová, P; Garrett, R G

    2017-12-31

    Most data in environmental sciences and geochemistry are compositional. Already the unit used to report the data (e.g., μg/l, mg/kg, wt%) implies that the analytical results for each element are not free to vary independently of the other measured variables. This is often neglected in statistical analysis, where a simple log-transformation of the single variables is insufficient to put the data into an acceptable geometry. This is also important for bivariate data analysis and for correlation analysis, for which the data need to be appropriately log-ratio transformed. A new approach based on the isometric log-ratio (ilr) transformation, leading to so-called symmetric coordinates, is presented here. Summarizing the correlations in a heat-map gives a powerful tool for bivariate data analysis. Here an application of the new method using a data set from a regional geochemical mapping project based on soil O and C horizon samples is demonstrated. Differences to 'classical' correlation analysis based on log-transformed data are highlighted. The fact that some expected strong positive correlations appear and remain unchanged even following a log-ratio transformation has probably led to the misconception that the special nature of compositional data can be ignored when working with trace elements. The example dataset is employed to demonstrate that using 'classical' correlation analysis and plotting XY diagrams, scatterplots, based on the original or simply log-transformed data can easily lead to severe misinterpretations of the relationships between elements. Copyright © 2017 Elsevier B.V. All rights reserved.
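
    As a simpler stand-in for the paper's symmetric (ilr-based) coordinates, the sketch below contrasts correlations of raw compositional parts with correlations of centered log-ratio (clr) coordinates; the composition matrix is simulated, purely for illustration.

```python
import numpy as np

# Illustration of log-ratio preprocessing before correlation analysis.
# The paper uses symmetric (ilr-based) coordinates; as a simpler stand-in,
# this sketch applies the centered log-ratio (clr) transform. The
# composition matrix below (rows = samples, columns = parts) is simulated.
rng = np.random.default_rng(1)
raw = rng.lognormal(mean=0.0, sigma=0.5, size=(50, 4))
comp = raw / raw.sum(axis=1, keepdims=True)        # close to a composition

clr = np.log(comp) - np.log(comp).mean(axis=1, keepdims=True)

print("correlations of raw parts:\n", np.corrcoef(comp.T).round(2))
print("correlations of clr coordinates:\n", np.corrcoef(clr.T).round(2))
```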

  18. Effects of selective logging on bat communities in the southeastern Amazon.

    PubMed

    Peters, Sandra L; Malcolm, Jay R; Zimmerman, Barbara L

    2006-10-01

    Although extensive areas of tropical forest are selectively logged each year, the responses of bat communities to this form of disturbance have rarely been examined. Our objectives were to (1) compare bat abundance, species composition, and feeding guild structure between unlogged and low-intensity selectively logged (1-4 logged stems/ha) sampling grids in the southeastern Amazon and (2) examine correlations between logging-induced changes in bat communities and forest structure. We captured bats in understory and canopy mist nets set in five 1-ha study grids in both logged and unlogged forest. We captured 996 individuals, representing 5 families, 32 genera, and 49 species. Abundances of nectarivorous and frugivorous taxa (Glossophaginae, Lonchophyllinae, Stenodermatinae, and Carolliinae) were higher at logged sites, where canopy openness and understory foliage density were greatest. In contrast, insectivorous and omnivorous species (Emballonuridae, Mormoopidae, Phyllostominae, and Vespertilionidae) were more abundant in unlogged sites, where canopy foliage density and variability in the understory stratum were greatest. Multivariate analyses indicated that understory bat species composition differed strongly between logged and unlogged sites but provided little evidence of logging effects for the canopy fauna. Different responses among feeding guilds and taxonomic groups appeared to be related to foraging and echolocation strategies and to changes in canopy cover and understory foliage densities. Our results suggest that even low-intensity logging modifies habitat structure, leading to changes in bat species composition.

  19. Pulsed light treatment for the inactivation of selected pathogens and the shelf-life extension of beef and tuna carpaccio.

    PubMed

    Hierro, Eva; Ganan, Monica; Barroso, Elvira; Fernández, Manuela

    2012-08-01

    The efficacy of pulsed light to improve the safety of carpaccio has been investigated. Beef and tuna slices were superficially inoculated with approximately 3 log cfu/cm2 of Listeria monocytogenes, Escherichia coli, Salmonella Typhimurium and Vibrio parahaemolyticus. Fluences of 0.7, 2.1, 4.2, 8.4 and 11.9 J/cm2 were assayed. Colour, sensory and shelf-life studies were carried out. Treatments at 8.4 and 11.9 J/cm2 inactivated the selected pathogens by approximately 1 log cfu/cm2, although they modified the colour parameters and had a negative effect on the sensory quality of the product. The raw attributes were not affected by fluences of 2.1 and 4.2 J/cm2 immediately after the treatment, although changes were observed during storage. The inactivation obtained with these fluences was lower than 1 log cfu/cm2, which may not be negligible in case of cross-contamination at a food plant or at a food service facility. Pulsed light showed a greater impact on the sensory quality of tuna carpaccio compared with beef. None of the fluences assayed extended the shelf-life of either product. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Application of ozonated dry ice (ALIGAL™ Blue Ice) for packaging and transport in the food industry.

    PubMed

    Fratamico, Pina M; Juneja, Vijay; Annous, Bassam A; Rasanayagam, Vasuhi; Sundar, M; Braithwaite, David; Fisher, Steven

    2012-05-01

    Dry ice is used by meat and poultry processors for temperature reduction during processing and for temperature maintenance during transportation. ALIGAL™ Blue Ice (ABI), which combines the antimicrobial effect of ozone (O(3)) along with the high cooling capacity of dry ice, was investigated for its effect on bacterial reduction in air, in liquid, and on food and glass surfaces. Through proprietary means, O(3) was introduced to produce dry ice pellets to a concentration of 20 parts per million (ppm) by total weight. The ABI sublimation rate was similar to that of dry ice pellets under identical conditions, and ABI was able to hold the O(3) concentration throughout the normal shelf life of the product. Challenge studies were performed using different microorganisms, including E. coli, Campylobacter jejuni, Salmonella, and Listeria, that are critical to food safety. ABI showed significant (P < 0.05) microbial reduction during bioaerosol contamination (up to 5-log reduction of E. coli and Listeria), on chicken breast (approximately 1.3-log reduction of C. jejuni), on contact surfaces (approximately 3.9 log reduction of C. jejuni), and in liquid (2-log reduction of C. jejuni). Considering the stability of O(3), ease of use, and antimicrobial efficacy against foodborne pathogens, our results suggest that ABI is a better alternative, especially for meat and poultry processors, as compared to dry ice. Further, ABI can potentially serve as an additional processing hurdle to guard against pathogens during processing, transportation, distribution, and/or storage. © 2012 Institute of Food Technologists®

  1. "How Science Works" and Data Logging: Eleven Quick Experiments with a Kettle

    ERIC Educational Resources Information Center

    Walker, Justin

    2010-01-01

    The benefits of using data logging to teach "how science works" are presented. Pedagogical approaches that take advantage of other school ICT are briefly described. A series of simple, quick experiments are given together with their resulting charts. Examples of the questions that arise from the charts show how the rich data lead to the refinement…

  2. An analysis of production and costs in high-lead yarding.

    Treesearch

    Magnus E. Tennas; Robert H. Ruth; Carl M. Berntsen

    1955-01-01

    In recent years loggers and timber owners have needed better information for estimating logging costs in the Douglas-fir region. Brandstrom's comprehensive study, published in 1933 (1), has long been used as a guide in making cost estimates. But the use of new equipment and techniques and an overall increase in logging costs have made it increasingly difficult to...

  3. Arkansas' timber industry - an assessment of timber product output and use, 1996

    Treesearch

    Michael Howell; Robert Levins

    1998-01-01

    In 1996, roundwood output from Arkansas forests totaled 636 million cubic feet. Mill byproducts generated by primary manufacturers were 286 million cubic feet. Almost all plant residues were used, primarily for fuel and fiber products. Saw logs were the leading roundwood product at 315 million cubic feet; pulpwood ranked second at 242 million cubic feet; veneer logs...

  4. Bacterial Hand Contamination and Transfer after Use of Contaminated Bulk-Soap-Refillable Dispensers

    PubMed Central

    Zapka, Carrie A.; Campbell, Esther J.; Maxwell, Sheri L.; Gerba, Charles P.; Dolan, Michael J.; Arbogast, James W.; Macinga, David R.

    2011-01-01

    Bulk-soap-refillable dispensers are prone to extrinsic bacterial contamination, and recent studies demonstrated that approximately one in four dispensers in public restrooms are contaminated. The purpose of this study was to quantify bacterial hand contamination and transfer after use of contaminated soap under controlled laboratory and in-use conditions in a community setting. Under laboratory conditions using liquid soap experimentally contaminated with 7.51 log10 CFU/ml of Serratia marcescens, an average of 5.28 log10 CFU remained on each hand after washing, and 2.23 log10 CFU was transferred to an agar surface. In an elementary-school-based field study, Gram-negative bacteria on the hands of students and staff increased by 1.42 log10 CFU per hand (26-fold) after washing with soap from contaminated bulk-soap-refillable dispensers. In contrast, washing with soap from dispensers with sealed refills significantly reduced bacteria on hands by 0.30 log10 CFU per hand (2-fold). Additionally, the mean number of Gram-negative bacteria transferred to surfaces after washing with soap from dispensers with sealed-soap refills (0.06 log10 CFU) was significantly lower than the mean number after washing with contaminated bulk-soap-refillable dispensers (0.74 log10 CFU; P < 0.01). Finally, significantly higher levels of Gram-negative bacteria were recovered from students (2.82 log10 CFU per hand) than were recovered from staff (2.22 log10 CFU per hand) after washing with contaminated bulk soap (P < 0.01). These results demonstrate that washing with contaminated soap from bulk-soap-refillable dispensers can increase the number of opportunistic pathogens on the hands and may play a role in the transmission of bacteria in public settings. PMID:21421792

  5. Bacterial hand contamination and transfer after use of contaminated bulk-soap-refillable dispensers.

    PubMed

    Zapka, Carrie A; Campbell, Esther J; Maxwell, Sheri L; Gerba, Charles P; Dolan, Michael J; Arbogast, James W; Macinga, David R

    2011-05-01

    Bulk-soap-refillable dispensers are prone to extrinsic bacterial contamination, and recent studies demonstrated that approximately one in four dispensers in public restrooms are contaminated. The purpose of this study was to quantify bacterial hand contamination and transfer after use of contaminated soap under controlled laboratory and in-use conditions in a community setting. Under laboratory conditions using liquid soap experimentally contaminated with 7.51 log(10) CFU/ml of Serratia marcescens, an average of 5.28 log(10) CFU remained on each hand after washing, and 2.23 log(10) CFU was transferred to an agar surface. In an elementary-school-based field study, Gram-negative bacteria on the hands of students and staff increased by 1.42 log(10) CFU per hand (26-fold) after washing with soap from contaminated bulk-soap-refillable dispensers. In contrast, washing with soap from dispensers with sealed refills significantly reduced bacteria on hands by 0.30 log(10) CFU per hand (2-fold). Additionally, the mean number of Gram-negative bacteria transferred to surfaces after washing with soap from dispensers with sealed-soap refills (0.06 log(10) CFU) was significantly lower than the mean number after washing with contaminated bulk-soap-refillable dispensers (0.74 log(10) CFU; P < 0.01). Finally, significantly higher levels of Gram-negative bacteria were recovered from students (2.82 log(10) CFU per hand) than were recovered from staff (2.22 log(10) CFU per hand) after washing with contaminated bulk soap (P < 0.01). These results demonstrate that washing with contaminated soap from bulk-soap-refillable dispensers can increase the number of opportunistic pathogens on the hands and may play a role in the transmission of bacteria in public settings.
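
    The fold-change equivalents quoted above (e.g., 1.42 log10 ≈ 26-fold) follow directly from fold = 10^Δlog:

```python
# The fold-change equivalents quoted above follow from fold = 10**delta_log.
for delta in [1.42, 0.30]:
    print(f"{delta} log10 CFU  ->  {10**delta:.0f}-fold")
```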

  6. Effect of inoculum size, bacterial species, type of surfaces and contact time to the transfer of foodborne pathogens from inoculated to non-inoculated beef fillets via food processing surfaces.

    PubMed

    Gkana, E; Chorianopoulos, N; Grounta, A; Koutsoumanis, K; Nychas, G-J E

    2017-04-01

    The objective of the present study was to determine the factors affecting the transfer of foodborne pathogens from inoculated beef fillets to non-inoculated ones through food processing surfaces. Three different levels of inoculation of the beef fillet surface were prepared: a high one of approximately 10^7 CFU/cm^2, a medium one of 10^5 CFU/cm^2 and a low one of 10^3 CFU/cm^2, using mixed strains of Listeria monocytogenes, or Salmonella enterica Typhimurium, or Escherichia coli O157:H7. The inoculated fillets were then placed on 3 different types of surfaces (stainless steel-SS, polyethylene-PE and wood-WD), for 1 or 15 min. Subsequently, these fillets were removed from the cutting boards and six sequential non-inoculated fillets were placed on the same surfaces for the same period of time. All non-inoculated fillets were contaminated, with a progressive reduction of each pathogen's population level from the inoculated fillet to the sixth non-inoculated one that came in contact with the surfaces, and regardless of the initial inoculum, a reduction of approximately 2 log CFU/g between the inoculated and the 1st non-inoculated fillet was observed. S. Typhimurium was transferred at a lower mean population (2.39 log CFU/g) to contaminated fillets than E. coli O157:H7 (2.93 log CFU/g), followed by L. monocytogenes (3.12 log CFU/g; P < 0.05). Wooden surfaces (2.77 log CFU/g) enhanced the transfer of bacteria to subsequent fillets compared with the other materials (2.66 log CFU/g for SS and PE; P < 0.05). Cross-contamination between meat and surfaces is a multifactorial process strongly dependent on the species, initial contamination level, kind of surface, contact time and the number of the subsequent fillet, according to the analysis of variance. Thus, quantifying the cross-contamination risk associated with various steps of meat processing in food establishments or households can provide a scientific basis for risk management of such products. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Evaluation and validity of a LORETA normative EEG database.

    PubMed

    Thatcher, R W; North, D; Biver, C

    2005-04-01

    To evaluate the reliability and validity of a Z-score normative EEG database for Low Resolution Electromagnetic Tomography (LORETA), EEG digital samples (2-second intervals sampled at 128 Hz, 1 to 2 minutes, eyes closed) were acquired from 106 normal subjects, and the cross-spectrum was computed and multiplied by the Key Institute's LORETA 2,394 gray matter pixel T Matrix. After a log10 transform or a Box-Cox transform, the mean and standard deviation of the *.lor files were computed for each of the 2,394 gray matter pixels, from 1 to 30 Hz, for each of the subjects. Tests of Gaussianity were computed in order to best approximate a normal distribution for each frequency and gray matter pixel. The relative sensitivity of a Z-score database was computed by measuring the approximation to a Gaussian distribution. The validity of the LORETA normative database was evaluated by the degree to which confirmed brain pathologies were localized using the LORETA normative database. Log10 and Box-Cox transforms approximated a Gaussian distribution in the range of 95.64% to 99.75% accuracy. The percentage of normative Z-score values at 2 standard deviations ranged from 1.21% to 3.54%, and the percentage of Z-scores at 3 standard deviations ranged from 0% to 0.83%. Left temporal lobe epilepsy, right sensory motor hematoma and a right hemisphere stroke exhibited maximum Z-score deviations in the same locations as the pathologies. We conclude: (1) adequate approximation to a Gaussian distribution can be achieved using LORETA by using a log10 transform or a Box-Cox transform and parametric statistics, (2) a Z-score normative database is valid with adequate sensitivity when using LORETA, and (3) the Z-score LORETA normative database also consistently localized known pathologies to the expected Brodmann areas as an hypothesis test based on the surface EEG before computing LORETA.
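
    The core normative-database operation, transform, then Z-score against normative means and standard deviations, with a Gaussianity check on the transformed sample, can be sketched as follows; all numbers are simulated stand-ins, not values from the LORETA database.

```python
import numpy as np
from scipy import stats

# Core normative-database operation sketched: log10-transform a power
# value, then express it as a Z-score against normative means/SDs, and
# check Gaussianity of the transformed normative sample. All numbers
# here are simulated, not values from the LORETA database.
rng = np.random.default_rng(2)
norm_power = rng.lognormal(mean=1.0, sigma=0.4, size=106)  # 106 "subjects"

log_power = np.log10(norm_power)
mu, sd = log_power.mean(), log_power.std(ddof=1)

w_stat, p = stats.shapiro(log_power)          # Gaussianity check
subject_power = 35.0                          # hypothetical test subject
z = (np.log10(subject_power) - mu) / sd

print(f"Shapiro-Wilk p = {p:.3f}; subject Z = {z:.2f}")
```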

  8. Candida antarctica lipase B catalyzed polycaprolactone synthesis: effects of organic media and temperature.

    PubMed

    Kumar, A; Gross, R A

    2000-01-01

    Engineering of the reaction medium and study of an expanded range of reaction temperatures were carried out in an effort to positively influence the outcome of Novozyme-435 (immobilized Lipase B from Candida antarctica) catalyzed epsilon-CL polymerizations. A series of solvents including acetonitrile, dioxane, tetrahydrofuran, chloroform, butyl ether, isopropyl ether, isooctane, and toluene (log P from -1.1 to 4.5) were evaluated at 70 degrees C. Statistically (ANOVA), two significant regions were observed. Solvents having log P values from -1.1 to 0.49 showed low propagation rates (< or = 30% epsilon-CL conversion in 4 h) and gave products of short chain length (Mn < or = 5200 g/mol). In contrast, solvents with log P values from 1.9 to 4.5 showed enhanced propagation rates and afforded polymers of higher molecular weight (Mn = 11,500-17,000 g/mol). Toluene, a preferred solvent for this work, was studied at epsilon-CL to toluene (wt/vol) ratios from 1:1 to 10:1. The ratio 1:2 was selected since, for polymerizations at 70 degrees C, 0.3 mL of epsilon-CL and 4 h, gave high monomer conversions and Mn values (approximately 85% and approximately 17,000 g/mol, respectively). Increasing the scale of the reaction from 0.3 to 10 mL of CL resulted in a similar isolated product yield, but the Mn increased from 17,200 to 44,800 g/mol. Toluene appeared to help stabilize Novozyme-435 so that lipase-catalyzed polymerizations could be conducted effectively at 90 degrees C. For example, within only 2 h at 90 degrees C (toluene-d8 to epsilon-CL, 5:1, approximately 1% protein), the % monomer conversion reached approximately 90%. Also, the controlled character of these polymerizations as a function of reaction temperature was evaluated.

  9. Statistical methods of fracture characterization using acoustic borehole televiewer log interpretation

    NASA Astrophysics Data System (ADS)

    Massiot, Cécile; Townend, John; Nicol, Andrew; McNamara, David D.

    2017-08-01

    Acoustic borehole televiewer (BHTV) logs provide measurements of fracture attributes (orientations, thickness, and spacing) at depth. Orientation, censoring, and truncation sampling biases similar to those described for one-dimensional outcrop scanlines, and other logging or drilling artifacts specific to BHTV logs, can affect the interpretation of fracture attributes from BHTV logs. K-means, fuzzy K-means, and agglomerative clustering methods provide transparent means of separating fracture groups on the basis of their orientation. Fracture spacing is calculated for each of these fracture sets. Maximum likelihood estimation using truncated distributions permits the fitting of several probability distributions to the fracture attribute data sets within truncation limits, which can then be extrapolated over the entire range where they naturally occur. Akaike Information Criterion (AIC) and Schwartz Bayesian Criterion (SBC) statistical information criteria rank the distributions by how well they fit the data. We demonstrate these attribute analysis methods with a data set derived from three BHTV logs acquired from the high-temperature Rotokawa geothermal field, New Zealand. Varying BHTV log quality reduces the number of input data points, but careful selection of the quality levels where fractures are deemed fully sampled increases the reliability of the analysis. Spacing data analysis comprising up to 300 data points and spanning three orders of magnitude can be approximated similarly well (similar AIC rankings) with several distributions. Several clustering configurations and probability distributions can often characterize the data at similar levels of statistical criteria. Thus, several scenarios should be considered when using BHTV log data to constrain numerical fracture models.
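
    The truncated maximum-likelihood step can be sketched with SciPy: the likelihood is renormalized over the observable range before optimization, and AIC is computed from the minimized negative log-likelihood. The spacing sample and truncation limit below are synthetic.

```python
import numpy as np
from scipy import stats, optimize

# Maximum-likelihood fit of a distribution to truncated fracture-spacing
# data, followed by AIC ranking, in the spirit of the workflow above.
# Synthetic spacings and the truncation limit are hypothetical.
rng = np.random.default_rng(3)
spacing = rng.lognormal(mean=-1.0, sigma=1.0, size=300)
lo = 0.05                                  # lower truncation limit (m)
obs = spacing[spacing >= lo]               # spacings below lo are unobserved

def nll_lognorm(params):
    s, scale = params
    dist = stats.lognorm(s, scale=scale)
    # density renormalized over the observable range [lo, inf)
    logpdf = dist.logpdf(obs) - np.log(dist.sf(lo))
    return -np.sum(logpdf)

res = optimize.minimize(nll_lognorm, x0=[1.0, 0.5],
                        bounds=[(1e-3, None), (1e-3, None)])
aic = 2 * len(res.x) + 2 * res.fun         # AIC = 2k - 2 ln L
print(f"fitted s={res.x[0]:.2f}, scale={res.x[1]:.2f}, AIC={aic:.1f}")
```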

  10. Bird species and traits associated with logged and unlogged forest in Borneo.

    PubMed

    Cleary, Daniel F R; Boyle, Timothy J B; Setyawati, Titiek; Anggraeni, Celina D; Van Loon, E Emiel; Menken, Steph B J

    2007-06-01

    The ecological consequences of logging have been and remain a focus of considerable debate. In this study, we assessed bird species composition within a logging concession in Central Kalimantan, Indonesian Borneo. Within the study area (approximately 196 km2) a total of 9747 individuals of 177 bird species were recorded. Our goal was to identify associations between species traits and environmental variables. This can help us to understand the causes of disturbance and predict whether species with given traits will persist under changing environmental conditions. Logging, slope position, and a number of habitat structure variables including canopy cover and liana abundance were significantly related to variation in bird composition. In addition to environmental variables, spatial variables also explained a significant amount of variation. However, environmental variables, particularly in relation to logging, were of greater importance in structuring variation in composition. Environmental change following logging appeared to have a pronounced effect on the feeding guild and size class structure but there was little evidence of an effect on restricted range or threatened species although certain threatened species were adversely affected. For example, species such as the terrestrial insectivore Argusianus argus and the hornbill Buceros rhinoceros, both of which are threatened, were rare or absent in recently logged forest. In contrast, undergrowth insectivores such as Orthotomus atrogularis and Trichastoma rostratum were abundant in recently logged forest and rare in unlogged forest. Logging appeared to have the strongest negative effect on hornbills, terrestrial insectivores, and canopy bark-gleaning insectivores while moderately affecting canopy foliage-gleaning insectivores and frugivores, raptors, and large species in general. In contrast, undergrowth insectivores responded positively to logging while most understory guilds showed little pronounced effect. Despite the high species richness of logged forest, logging may still have a negative impact on extant diversity by adversely affecting key ecological guilds. The sensitivity of hornbills in particular to logging disturbance may be expected to alter rainforest dynamics by seriously reducing the effective seed dispersal of associated tree species. However, logged forest represents an increasingly important habitat for most bird species and needs to be protected from further degradation. Biodiversity management within logging concessions should focus on maintaining large areas of unlogged forest and mitigating the adverse effects of logging on sensitive groups of species.

  11. Finite-difference modeling of the electroseismic logging in a fluid-saturated porous formation

    NASA Astrophysics Data System (ADS)

    Guan, Wei; Hu, Hengshan

    2008-05-01

    In a fluid-saturated porous medium, an electromagnetic (EM) wavefield induces an acoustic wavefield due to the electrokinetic effect. A potential geophysical application of this effect is electroseismic (ES) logging, in which the converted acoustic wavefield is received in a fluid-filled borehole to evaluate the parameters of the porous formation around the borehole. In this paper, a finite-difference scheme is proposed to model the ES logging responses to a vertical low-frequency electric dipole along the borehole axis. The EM field excited by the electric dipole is first calculated separately by finite differences, and is then considered as a distributed exciting source term in a set of extended Biot's equations for the converted acoustic wavefield in the formation. This set of equations is solved by a modified finite-difference time-domain (FDTD) algorithm that allows for the calculation of dynamic permeability so that it is not restricted to low-frequency poroelastic wave problems. The perfectly matched layer (PML) technique without splitting the fields is applied to truncate the computational region. The simulated ES logging waveforms approximately agree with those obtained by the analytical method. The FDTD algorithm also applies to acoustic logging simulation in porous formations.
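
    For orientation, the leapfrog structure that staggered-grid FDTD schemes share can be shown in one dimension; this is emphatically not the paper's poroelastic (extended Biot) solver, and it uses rigid boundaries instead of a PML.

```python
import numpy as np

# Minimal 1-D acoustic staggered-grid FDTD loop, showing the leapfrog
# pressure/velocity updates that solvers like the one above build on.
# Not the paper's poroelastic scheme; rigid boundaries stand in for a
# PML, and all parameters are generic.
nx, nt = 400, 800
dx = 1.0                        # grid spacing, m
c, rho = 1500.0, 1000.0         # sound speed (m/s) and density (kg/m^3)
dt = 0.5 * dx / c               # time step satisfying the CFL condition
K = rho * c * c                 # bulk modulus

p = np.zeros(nx)                # pressure at integer grid points
v = np.zeros(nx - 1)            # particle velocity at half grid points

for it in range(nt):
    v -= dt / (rho * dx) * np.diff(p)        # v_t = -(1/rho) p_x
    p[1:-1] -= dt * K / dx * np.diff(v)      # p_t = -K v_x
    t = it * dt
    p[nx // 2] += np.exp(-(((t - 0.05) / 0.01) ** 2))  # Gaussian source pulse

print("peak |p| after propagation:", round(float(np.abs(p).max()), 3))
```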

  12. Fractured-aquifer hydrogeology from geophysical logs; the passaic formation, New Jersey

    USGS Publications Warehouse

    Morin, R.H.; Carleton, G.B.; Poirier, S.

    1997-01-01

    The Passaic Formation consists of gradational sequences of mudstone, siltstone, and sandstone, and is a principal aquifer in central New Jersey. Ground-water flow is primarily controlled by fractures interspersed throughout these sedimentary rocks, and characterizing these fractures in terms of type, orientation, spatial distribution, frequency, and transmissivity is fundamental to understanding local fluid-transport processes. To obtain this information, a comprehensive suite of geophysical logs was collected in 10 wells roughly 46 m in depth and located within a 0.05 km2 area in Hopewell Township, New Jersey. A seemingly complex, heterogeneous network of fractures identified with an acoustic televiewer was statistically reduced to two principal subsets corresponding to two distinct fracture types: (1) bedding-plane partings and (2) high-angle fractures. Bedding-plane partings are the most numerous and have an average strike of N84°W and dip of 20°N. The high-angle fractures are oriented subparallel to these features, with an average strike of N79°E and dip of 71°S, making the two fracture types roughly orthogonal. Their intersections form linear features that also retain this approximately east-west strike. Inspection of fluid temperature and conductance logs in conjunction with flowmeter measurements obtained during pumping allows the transmissive fractures to be distinguished from the general fracture population. These results show that, within the resolution capabilities of the logging tools, approximately 51 (or 18 percent) of the 280 total fractures are water producing. The bedding-plane partings exhibit transmissivities that average roughly 5 m2/day and that generally diminish in magnitude and frequency with depth. The high-angle fractures have average transmissivities that are about half those of the bedding-plane partings and show no apparent dependence upon depth. The geophysical logging results allow us to infer a distinct hydrogeologic structure within this aquifer that is defined by fracture type and orientation. Fluid flow near the surface is controlled primarily by the highly transmissive, subhorizontal bedding-plane partings. As depth increases, the high-angle fractures apparently become more dominant hydrologically.

  13. Early Addition of Topical Corticosteroids in the Treatment of Bacterial Keratitis

    PubMed Central

    Ray, Kathryn J.; Srinivasan, Muthiah; Mascarenhas, Jeena; Rajaraman, Revathi; Ravindran, Meenakshi; Glidden, David V.; Oldenburg, Catherine E.; Sun, Catherine Q.; Zegans, Michael E.; McLeod, Stephen D.; Acharya, Nisha R.; Lietman, Thomas M.

    2014-01-01

    IMPORTANCE Scarring from bacterial keratitis remains a leading cause of visual loss. OBJECTIVE To determine whether topical corticosteroids are beneficial as an adjunctive therapy for bacterial keratitis if given early in the course of infection. DESIGN, SETTING, AND PARTICIPANTS The Steroids for Corneal Ulcers Trial (SCUT) was a randomized, double-masked, placebo-controlled trial that overall found no effect of adding topical corticosteroids to topical moxifloxacin hydrochloride in bacterial keratitis. Here, we assess the timing of administration of corticosteroids in a subgroup analysis of the SCUT. We define earlier administration of corticosteroids (vs placebo) as addition after 2 to 3 days of topical antibiotics and later as addition after 4 or more days of topical antibiotics. MAIN OUTCOMES AND MEASURES We assess the effect of topical corticosteroids (vs placebo) on 3-month best spectacle-corrected visual acuity in patients who received corticosteroids or placebo earlier vs later. Further analyses were performed for subgroups of patients with non-Nocardia keratitis and those with no topical antibiotic use before enrollment. RESULTS Patients treated with topical corticosteroids as adjunctive therapy within 2 to 3 days of antibiotic therapy had approximately 1-line better visual acuity at 3 months than did those given placebo (−0.11 logMAR; 95% CI, −0.20 to −0.02 logMAR; P = .01). In patients who had 4 or more days of antibiotic therapy before corticosteroid treatment, the effect was not significant; patients given corticosteroids had 1-line worse visual acuity at 3 months compared with those in the placebo group (0.10 logMAR; 95% CI, −0.02 to 0.23 logMAR; P = .14). Patients with non-Nocardia keratitis and those having no topical antibiotic use before the SCUT enrollment showed significant improvement in best spectacle-corrected visual acuity at 3 months if corticosteroids were administered earlier rather than later. CONCLUSIONS AND RELEVANCE There may be a benefit with adjunctive topical corticosteroids if application occurs earlier in the course of bacterial corneal ulcers. PMID:24763755

  14. Characterization of complexes of nucleoside-5'-phosphorothioate analogues with zinc ions.

    PubMed

    Sayer, Alon Haim; Itzhakov, Yehudit; Stern, Noa; Nadel, Yael; Fischer, Bilha

    2013-10-07

    On the basis of the high affinity of Zn(2+) for sulfur and imidazole, we targeted nucleotides such as GDP-β-S, ADP-β-S, and AP3(β-S)A as potential biocompatible Zn(2+) chelators. The thiophosphate moiety enhanced the stability of the Zn(2+)-nucleotide complex by about 0.7 log units. ATP-α,β-CH2-γ-S formed the most stable Zn(2+) complex studied here, log K 6.50, being ~0.8 and ~1.1 log units more stable than the ATP-γ-S-Zn(2+) and ATP-Zn(2+) complexes, and was the major species, 84%, under physiological pH. Guanine nucleotide Zn(2+) complexes were more stable by 0.3-0.4 log units than the corresponding adenine nucleotide complexes. Likewise, the AP3(β-S)A-zinc complex was ~0.5 log units more stable than the AP3A complex. (1)H and (31)P NMR-monitored Zn(2+) titrations showed that Zn(2+) coordinates with the purine nucleotide N7-nitrogen atom, the terminal phosphate, and the adjacent phosphate. In conclusion, replacement of a terminal phosphate by a thiophosphate group resulted in a decrease in the acidity of the phosphate moiety by approximately one log unit and an increase in the stability of the Zn(2+) complexes of the latter analogues by up to 0.7 log units. A terminal phosphorothioate contributed more to the stability of nucleotide-Zn(2+) complexes than a bridging phosphorothioate.

  15. A high-speed scintillation-based electronic portal imaging device to quantitatively characterize IMRT delivery.

    PubMed

    Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F

    2006-01-01

    We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames-per-second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high speed EPID producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.

  16. Numerical Modeling of Electroacoustic Logging Including Joule Heating

    NASA Astrophysics Data System (ADS)

    Plyushchenkov, Boris D.; Nikitin, Anatoly A.; Turchaninov, Victor I.

    It is well known that an electromagnetic field excites acoustic waves in a porous elastic medium saturated with fluid electrolyte, due to the electrokinetic conversion effect. Pride's equations describing this process are written in the isothermal approximation. An update of these equations that takes the influence of Joule heating on acoustic wave propagation into account is proposed here. This update includes terms describing the initiation of additional acoustic waves excited by thermoelastic stresses, together with the heat conduction equation whose right-hand side is defined by Joule heating. Results of numerical modeling of several acoustic wave propagation problems, excited by an electric field source and stated both with and without the Joule heating effect, are presented. From these results, it follows that the influence of Joule heating should be taken into account in the numerical simulation of electroacoustic logging and in the interpretation of its log data.

  17. Analysis of DNS Cache Effects on Query Distribution

    PubMed Central

    2013-01-01

    This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter malformed DNS queries out of the log data, classifying this pollution into six categories. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain names is then inferred quantitatively. PMID:24396313
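
    As a toy illustration of the caching mechanism described here (not the paper's model), the Python sketch below assumes Zipf-distributed query popularity and a single shared resolver cache with a fixed TTL, and measures what fraction of queries miss the cache and therefore reach the authoritative server. All constants are made up.

        import random
        from itertools import accumulate

        # Toy model: queries with Zipf-like popularity hit a resolver cache
        # with a fixed TTL; only cache misses reach the TLD server.
        random.seed(1)
        n_names, ttl, n_queries = 1000, 300, 100_000
        cum = list(accumulate(1.0 / (r + 1) for r in range(n_names)))  # Zipf(1)
        cache = {}                                   # name -> expiry tick
        misses = 0
        for t in range(n_queries):                   # one query per tick
            name = random.choices(range(n_names), cum_weights=cum)[0]
            if cache.get(name, -1) < t:              # expired or absent: miss
                misses += 1
                cache[name] = t + ttl
        print(f"fraction of queries reaching the TLD: {misses / n_queries:.3f}")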

  18. Analysis of DNS cache effects on query distribution.

    PubMed

    Wang, Zheng

    2013-01-01

    This paper studies the DNS cache effects that occur on query distribution at the CN top-level domain (TLD) server. We first filter malformed DNS queries out of the log data, classifying this pollution into six categories. A model for DNS resolution, more specifically DNS caching, is presented. We demonstrate the presence and magnitude of DNS cache effects and cache sharing effects on the request distribution through an analytic model and simulation. CN TLD log data results are provided and analyzed based on the cache model. The approximate TTL distribution for domain names is then inferred quantitatively.

  19. Molecular and Genetic Investigation of Tau in Chronic Traumatic Encephalopathy (Log No. 13267017)

    DTIC Science & Technology

    2017-10-01

    AWARD NUMBER: W81XWH-14-1-0399 TITLE: Molecular & Genetic Investigation of Tau in Chronic Traumatic Encephalopathy (Log No. 13267017) PRINCIPAL...neuropathological findings we are currently characterizing in individuals with CTE reflect molecular and genetic differences that will enable the...INTRODUCTION: Repetitive mild traumatic brain injury leads to neurological symptoms and chronic traumatic encephalopathy (CTE). The molecular changes

  20. Changes in South Carolina's industrial timber products output, 1988

    Treesearch

    Edgar L. Davenport; John B. Tansey

    1990-01-01

    In 1988, roundwood output from South Carolina's forests amounted to 605.1 million cubic feet. Residue generated from plants in South Carolina increased to 205.9 million cubic feet and was mostly used for fiber and fuel. Pulpwood was the leading roundwood product at 283.2 million cubic feet; saw logs were next at 253.7 million cubic feet, followed by veneer logs...

  1. High Pressure Inactivation of HAV within Mussels

    USDA-ARS?s Scientific Manuscript database

    The potential of hepatitis A virus (HAV) to be inactivated within Mediterranean mussels (Mytilus galloprovincialis) and blue mussels (Mytilus edulis) by high pressure processing was evaluated. HAV was bioaccumulated within mussels to approximately 6-log10 PFU by exposure of mussels to HAV-contamina...

  2. TENSOR DECOMPOSITIONS AND SPARSE LOG-LINEAR MODELS

    PubMed Central

    Johndrow, James E.; Bhattacharya, Anirban; Dunson, David B.

    2017-01-01

    Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. We derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions. PMID:29332971

  3. Possible Statistics of Two Coupled Random Fields: Application to Passive Scalar

    NASA Technical Reports Server (NTRS)

    Dubrulle, B.; He, Guo-Wei; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    We use the relativity postulate of scale invariance to derive the similarity transformations between two coupled scale-invariant random fields at different scales. We find the equations leading to the scaling exponents. This formulation is applied to the case of passive scalars advected (i) by a random Gaussian velocity field and (ii) by a turbulent velocity field. In the Gaussian case, we show that the passive scalar increments follow a log-Levy distribution generalizing Kraichnan's solution and, in an appropriate limit, a log-normal distribution. In the turbulent case, we show that when the velocity increments follow log-Poisson statistics, the passive scalar increments follow statistics close to log-Poisson. This result explains the experimental observations of Ruiz et al. about the temperature increments.

  4. Incorporating Canopy Cover for Airborne-Derived Assessments of Forest Biomass in the Tropical Forests of Cambodia

    PubMed Central

    Singh, Minerva; Evans, Damian; Coomes, David A.; Friess, Daniel A.; Suy Tan, Boun; Samean Nin, Chan

    2016-01-01

    This research examines the role of canopy cover in influencing above ground biomass (AGB) dynamics of an open-canopied forest and evaluates the efficacy of individual-based and plot-scale height metrics in predicting AGB variation in the tropical forests of Angkor Thom, Cambodia. The AGB was modeled by including canopy cover from aerial imagery alongside two different canopy vertical height metrics derived from LiDAR: the plot average of the maximum tree height (Max_CH) of individual trees, and the top-of-canopy height (TCH). Two different statistical approaches, log-log ordinary least squares (OLS) and support vector regression (SVR), were used to model AGB variation in the study area. Ten different AGB models were developed using different combinations of airborne predictor variables. The inclusion of canopy cover estimates considerably improved the performance of the AGB models for our study area. The most robust model was a log-log OLS model comprising canopy cover only (r = 0.87; RMSE = 42.8 Mg/ha). Other models that approximated field AGB closely included both Max_CH and canopy cover (r = 0.86, RMSE = 44.2 Mg/ha for SVR; and r = 0.84, RMSE = 47.7 Mg/ha for log-log OLS). Hence, canopy cover should be included when modeling the AGB of open-canopied tropical forests. PMID:27176218

  5. Machine learning with neural networks - a case study of estimating thermal conductivity with ancient well-log data

    NASA Astrophysics Data System (ADS)

    Harrison, Benjamin; Sandiford, Mike; McLaren, Sandra

    2016-04-01

    Supervised machine learning algorithms attempt to build a predictive model using empirical data. Their aim is to take a known set of input data along with known responses, and adaptively train a model to generate predictions for new data inputs. A key attraction of their use is the ability to act as function approximators where the definition of an explicit relationship between variables is infeasible. We present a novel means of estimating thermal conductivity using a supervised self-organising map algorithm, trained on about 150 thermal conductivity measurements and a suite of five electric logs common to 14 boreholes. A key motivation of the study was to supplement the small number of direct measurements of thermal conductivity with the decades of borehole data acquired in the Gippsland Basin, to produce more confident calculations of surface heat flow. A previous attempt to generate estimates from well-log data in the Gippsland Basin using classic petrophysical log interpretation methods was able to produce reasonable synthetic thermal conductivity logs for only four boreholes. The current study has extended this to a further ten boreholes. Interesting outcomes from the study are: the method appears stable at very low sample sizes (< ~100); the SOM permits quantitative analysis of essentially qualitative, uncalibrated well-log data; and the method achieves moderate success at prediction with minimal effort spent tuning the algorithm's parameters.

  6. Incorporating Canopy Cover for Airborne-Derived Assessments of Forest Biomass in the Tropical Forests of Cambodia.

    PubMed

    Singh, Minerva; Evans, Damian; Coomes, David A; Friess, Daniel A; Suy Tan, Boun; Samean Nin, Chan

    2016-01-01

    This research examines the role of canopy cover in influencing above ground biomass (AGB) dynamics of an open-canopied forest and evaluates the efficacy of individual-based and plot-scale height metrics in predicting AGB variation in the tropical forests of Angkor Thom, Cambodia. The AGB was modeled by including canopy cover from aerial imagery alongside two different canopy vertical height metrics derived from LiDAR: the plot average of the maximum tree height (Max_CH) of individual trees, and the top-of-canopy height (TCH). Two different statistical approaches, log-log ordinary least squares (OLS) and support vector regression (SVR), were used to model AGB variation in the study area. Ten different AGB models were developed using different combinations of airborne predictor variables. The inclusion of canopy cover estimates considerably improved the performance of the AGB models for our study area. The most robust model was a log-log OLS model comprising canopy cover only (r = 0.87; RMSE = 42.8 Mg/ha). Other models that approximated field AGB closely included both Max_CH and canopy cover (r = 0.86, RMSE = 44.2 Mg/ha for SVR; and r = 0.84, RMSE = 47.7 Mg/ha for log-log OLS). Hence, canopy cover should be included when modeling the AGB of open-canopied tropical forests.

  7. Potential of the octanol-water partition coefficient (logP) to predict the dermal penetration behaviour of amphiphilic compounds in aqueous solutions.

    PubMed

    Korinth, Gintautas; Wellner, Tanja; Schaller, Karl Heinz; Drexler, Hans

    2012-11-23

    Aqueous amphiphilic compounds may exhibit enhanced skin penetration compared with neat compounds. Conventional models do not predict this percutaneous penetration behaviour. We investigated the potential of the octanol-water partition coefficient (logP) to predict dermal fluxes for eight compounds applied neat and as 50% aqueous solutions in diffusion cell experiments using human skin. Data for seven other compounds were accessed from literature. In total, seven glycol ethers, three alcohols, two glycols, and three other chemicals were considered. Of these 15 compounds, 10 penetrated faster through the skin as aqueous solutions than as neat compounds. The other five compounds exhibited larger fluxes as neat applications. For 13 of the 15 compounds, a consistent relationship was identified between the percutaneous penetration behaviour and the logP. Compared with the neat applications, positive logP were associated with larger fluxes for eight of the diluted compounds, and negative logP were associated with smaller fluxes for five of the diluted compounds. Our study demonstrates that decreases or enhancements in dermal penetration upon aqueous dilution can be predicted for many compounds from the sign of logP (i.e., positive or negative). This approach may be suitable as a first approximation in risk assessments of dermal exposure.
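
    The sign rule described here is simple enough to state as a one-line predictor; the Python sketch below encodes it, with the example logP value being a hypothetical illustration rather than one of the paper's compounds.

        # Sketch of the paper's first-approximation rule: the sign of logP
        # signals whether 50% aqueous dilution should raise or lower the
        # dermal flux relative to the neat compound.
        def predicted_flux_change(log_p: float) -> str:
            return ("larger flux when diluted" if log_p > 0
                    else "smaller flux when diluted")

        print(predicted_flux_change(0.83))   # hypothetical glycol ether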

  8. Power law behavior of RR-interval variability in healthy middle-aged persons, patients with recent acute myocardial infarction, and patients with heart transplants

    NASA Technical Reports Server (NTRS)

    Bigger, J. T. Jr; Steinman, R. C.; Rolnitzky, L. M.; Fleiss, J. L.; Albrecht, P.; Cohen, R. J.

    1996-01-01

    BACKGROUND. The purposes of the present study were (1) to establish normal values for the regression of log(power) on log(frequency) for RR-interval fluctuations in healthy middle-aged persons, (2) to determine the effects of myocardial infarction on the regression of log(power) on log(frequency), (3) to determine the effect of cardiac denervation on the regression of log(power) on log(frequency), and (4) to assess the ability of power law regression parameters to predict death after myocardial infarction. METHODS AND RESULTS. We studied three groups: (1) 715 patients with recent myocardial infarction; (2) 274 healthy persons age and sex matched to the infarct sample; and (3) 19 patients with heart transplants. Twenty-four-hour RR-interval power spectra were computed using fast Fourier transforms and log(power) was regressed on log(frequency) between 10(-4) and 10(-2) Hz. There was a power law relation between log(power) and log(frequency). That is, the function described a descending straight line that had a slope of approximately -1 in healthy subjects. For the myocardial infarction group, the regression line for log(power) on log(frequency) was shifted downward and had a steeper negative slope (-1.15). The transplant (denervated) group showed a larger downward shift in the regression line and a much steeper negative slope (-2.08). The correlation between traditional power spectral bands and slope was weak, and that with log(power) at 10(-4) Hz was only moderate. Slope and log(power) at 10(-4) Hz were used to predict mortality and were compared with the predictive value of traditional power spectral bands. Slope and log(power) at 10(-4) Hz were excellent predictors of all-cause mortality or arrhythmic death. To optimize the prediction of death, we calculated a log(power) intercept that was uncorrelated with the slope of the power law regression line. We found that the combination of slope and zero-correlation log(power) was an outstanding predictor, with a relative risk of > 10, and was better than any combination of the traditional power spectral bands. The combination of slope and log(power) at 10(-4) Hz also was an excellent predictor of death after myocardial infarction. CONCLUSIONS. Myocardial infarction or denervation of the heart causes a steeper slope and decreased height of the power law regression relation between log(power) and log(frequency) of RR-interval fluctuations. Individually and, especially, combined, the power law regression parameters are excellent predictors of death of any cause or arrhythmic death and predict these outcomes better than the traditional power spectral bands.
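
    The core computation here, regressing log(power) on log(frequency) over a fixed band, is easy to sketch. The Python fragment below does this for a synthetic Brownian series (expected slope near -2) rather than RR-interval data; the band edges mirror the 10(-4) to 10(-2) Hz band used in the study, and the sampling rate is an assumption.

        import numpy as np

        # Sketch: estimate the power-law slope of a (synthetic) series by
        # regressing log(power) on log(frequency) over a fixed band.
        rng = np.random.default_rng(0)
        fs = 0.5                                   # samples per second (assumed)
        x = np.cumsum(rng.standard_normal(2**16))  # Brownian input, slope ~ -2
        freqs = np.fft.rfftfreq(x.size, d=1 / fs)
        power = np.abs(np.fft.rfft(x - x.mean())) ** 2
        band = (freqs >= 1e-4) & (freqs <= 1e-2)
        slope, intercept = np.polyfit(np.log10(freqs[band]),
                                      np.log10(power[band]), 1)
        print(f"slope = {slope:.2f}")              # about -2 for Brownian input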

  9. LETTERS AND COMMENTS: Note on the 'log formulae' for pendulum motion valid for any amplitude

    NASA Astrophysics Data System (ADS)

    Qing-Xin, Yuan; Pei, Ding

    2010-01-01

    In this note, we present an improved approximation to the solution of Lima (2008 Eur. J. Phys. 29 1091), which decreases the maximum relative error from 0.6% to 0.084% in evaluating the exact pendulum period.

  10. Box-Cox transformation of firm size data in statistical analysis

    NASA Astrophysics Data System (ADS)

    Chen, Ting Ting; Takaishi, Tetsuya

    2014-03-01

    Firm size data usually do not show the normality that is often assumed in statistical analyses such as regression analysis. In this study we focus on two firm size measures: the number of employees and sales. Both deviate considerably from a normal distribution. To improve the normality of these data we transform them by the Box-Cox transformation with appropriate parameters. The Box-Cox transformation parameters are determined so that the transformed data best show the kurtosis of a normal distribution. It is found that the two firm size measures transformed by the Box-Cox transformation show strong linearity. This indicates that the number of employees and sales have similar properties as firm size indicators. The Box-Cox parameters obtained for the firm size data are found to be very close to zero. In this case the Box-Cox transformation is approximately a log-transformation. This suggests that the firm size data we used are approximately log-normally distributed.
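
    The lambda-near-zero behaviour is easy to reproduce. The Python sketch below fits a Box-Cox parameter to synthetic log-normal "sales" data; note that scipy chooses lambda by maximum likelihood rather than the kurtosis criterion the authors use, so this is an analogous illustration, not a reimplementation.

        import numpy as np
        from scipy import stats

        # Sketch: for log-normal-like data the fitted Box-Cox parameter
        # comes out near zero, where the transform reduces to a log-transform.
        rng = np.random.default_rng(0)
        sales = rng.lognormal(mean=3.0, sigma=1.0, size=5000)  # synthetic data
        transformed, lam = stats.boxcox(sales)   # lambda via maximum likelihood
        print(f"fitted lambda = {lam:.3f}")      # close to 0 for log-normal input
        # At lambda = 0 the Box-Cox transform is exactly log(x); nearby values
        # give (x**lam - 1) / lam, which approximates log(x).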

  11. Inactivation of Mycobacterium paratuberculosis by Pulsed Electric Fields

    PubMed Central

    Rowan, Neil J.; MacGregor, Scott J.; Anderson, John G.; Cameron, Douglas; Farish, Owen

    2001-01-01

    The influence of treatment temperature and pulsed electric fields (PEF) on the viability of Mycobacterium paratuberculosis cells suspended in 0.1% (wt/vol) peptone water and in sterilized cow's milk was assessed by direct viable counts and by transmission electron microscopy (TEM). PEF treatment at 50°C (2,500 pulses at 30 kV/cm) reduced the level of viable M. paratuberculosis cells by approximately 5.3 and 5.9 log10 CFU/ml in 0.1% peptone water and in cow's milk, respectively, while PEF treatment of M. paratuberculosis at lower temperatures resulted in less lethality. Heating alone at 50°C for 25 min or at 72°C for 25 s (extended high-temperature, short-time pasteurization) resulted in reductions of M. paratuberculosis of approximately 0.01 and 2.4 log10 CFU/ml, respectively. TEM studies revealed that exposure to PEF treatment resulted in substantial damage at the cellular level to M. paratuberculosis. PMID:11375202

  12. Towards a PTAS for the generalized TSP in grid clusters

    NASA Astrophysics Data System (ADS)

    Khachay, Michael; Neznakhina, Katherine

    2016-10-01

    The Generalized Traveling Salesman Problem (GTSP) is a combinatorial optimization problem that asks for a minimum cost cycle visiting exactly one point (city) from each cluster. We consider a geometric case of this problem, where n nodes are given inside the integer grid (in the Euclidean plane) and each grid cell is a unit square. Clusters are induced by cells `populated' by nodes of the given instance. Even in this special setting, the GTSP remains intractable, since it encloses the classic Euclidean TSP on the plane. Recently, it was shown that the problem has a (1.5+8√2+ɛ)-approximation algorithm with a complexity bound depending polynomially on n and k, where k is the number of clusters. In this paper, we propose two approximation algorithms for the Euclidean GTSP on grid clusters. For any fixed k, both algorithms are PTAS. The time complexity of the first one remains polynomial for k = O(log n), while the second one is a PTAS when k = n - O(log n).

  13. Effect of oxygen stress on growth and survival of Clostridium perfringens, Campylobacter jejuni, and Listeria monocytogenes under different storage conditions.

    PubMed

    Al-Qadiri, Hamzah; Sablani, Shyam S; Ovissipour, Mahmoudreza; Al-Alami, Nivin; Govindan, Byju; Rasco, Barbara

    2015-04-01

    This study investigated the growth and survival of three foodborne pathogens (Clostridium perfringens, Campylobacter jejuni, and Listeria monocytogenes) in beef (7% fat) and nutrient broth under different oxygen levels. Samples were tested under anoxic (<0.5%), microoxic (6 to 8%), and oxic (20%) conditions during storage at 7 °C for 14 days and at 22 °C for 5 days. Two initial inoculum concentrations were used (1 and 2 log CFU per g of beef or per ml of broth). The results show that C. perfringens could grow in beef at 22 °C, with an increase of approximately 5 log under anoxic conditions and a 1-log increase under microoxic conditions. However, C. perfringens could not survive in beef held at 7 °C under microoxic and oxic storage conditions after 14 days. In an anoxic environment, C. perfringens survived in beef samples held at 7 °C, with a 1-log reduction. A cell decline was observed at the 2-log inoculum level under these conditions, with no surviving cells at the 1-log level. The results show that C. jejuni under microoxic conditions survived with declining cell numbers. Significant increases in L. monocytogenes (5 to 7 log) were observed in beef held at 22 °C for 5 days, with the lowest levels recovered under anoxic conditions. L. monocytogenes in refrigerated storage increased by 2 to 4 log. It showed the greatest growth under oxic conditions, with significant growth under anoxic conditions. These findings can be used to enhance food safety in vacuum-packed and modified atmosphere-packaged food products.

  14. Nature and origin of upper crustal seismic velocity fluctuations and associated scaling properties: Combined stochastic analyses of KTB velocity and lithology logs

    USGS Publications Warehouse

    Goff, J.A.; Holliger, K.

    1999-01-01

    The main borehole of the German Continental Deep Drilling Program (KTB) extends over 9000 m into a crystalline upper crust consisting primarily of interlayered gneiss and metabasite. We present a joint analysis of the velocity and lithology logs in an effort to extract the lithology component of the velocity log. Covariance analysis of the lithology log, approximated as a binary series, indicates that it may originate from the superposition of two Brownian stochastic processes (fractal dimension 1.5) with characteristic scales of ~2800 m and ~150 m, respectively. Covariance analysis of the velocity fluctuations provides evidence for the superposition of four stochastic processes with distinct characteristic scales. The largest two scales are identical to those derived from the lithology, confirming that these scales of velocity heterogeneity are caused by lithology variations. The third characteristic scale, ~20 m, also a Brownian process, is probably related to fracturing, based on correlation with the resistivity log. The superposition of these three Brownian processes closely mimics the commonly observed 1/k decay (fractal dimension 2.0) of the velocity power spectrum. The smallest-scale process (characteristic scale ~1.7 m) requires a low fractal dimension, ~1.0, and accounts for ~60% of the total rms velocity variation. A comparison of successive logs from 6900-7140 m depth indicates that such variations are not repeatable and thus probably do not represent true velocity variations in the crust. The results of this study resolve the disparity between differing published estimates of seismic heterogeneity based on the KTB sonic logs, and bridge the gap between estimates of crustal heterogeneity from geologic maps and borehole logs.
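
    The superposition idea can be illustrated with a small spectral-synthesis sketch: sum several Lorentzian ("Brownian"-like) spectral components with distinct characteristic scales and synthesize a profile from the combined spectrum. The scales below echo the abstract's 2800 m, 150 m, and 20 m, but the amplitudes are invented, so this is a qualitative illustration only.

        import numpy as np

        # Sketch: superpose Lorentzian-spectrum components with distinct
        # characteristic scales; their sum mimics an overall 1/k-like decay.
        n, dz = 2**14, 1.0                         # samples and depth step (m)
        k = np.fft.rfftfreq(n, d=dz)               # spatial frequency (1/m)
        k[0] = k[1]                                # avoid division by zero
        psd = sum(a / (1.0 + (L * k) ** 2)         # Lorentzian per component
                  for a, L in [(1.0, 2800.0), (0.3, 150.0), (0.1, 20.0)])
        rng = np.random.default_rng(0)
        phases = np.exp(2j * np.pi * rng.random(k.size))
        profile = np.fft.irfft(np.sqrt(psd) * phases, n)   # synthetic "log"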

  15. Electrical Conductive Mechanism of Gas Hydrate-Bearing Reservoirs in the Permafrost Region of Qilian Mountain

    NASA Astrophysics Data System (ADS)

    Peng, C.; Zou, C.; Tang, Y.; Liu, A.; Hu, X.

    2017-12-01

    In the Qilian Mountain area, gas hydrates not only occur in the pore spaces of sandstones but also fill fractures in mudstones. This makes the identification and evaluation of gas hydrate reservoirs from resistivity and velocity logs difficult. Understanding the electrical conductive mechanism is the basis for log interpretation; however, research in this area has been insufficient. We collected well logs from 30 wells in the area. Well logs and rock samples from the DK-9, DK-11, and DK-12 wells were used in this study. Experiments including SEM, thin section, NMR, and XRD analyses, synthesis of gas hydrate in consolidated rock cores at low temperature, and measurement of core resistivity were performed to understand the effects of pore structure, rock composition, temperature, and gas hydrate on conductivity. The results show that the porosity of the pore-filling-type reservoir is less than 10% and its clay mineral content is high. As good conductive passages, fractures can reduce the resistivity of water-saturated rock; if fractures in the mudstone are filled by calcite, resistivity increases significantly. The resistivity of water-saturated rock at 2°C is twice that at 18°C. The gas hydrate formation process in the sandstone was studied via resistivity recorded in real time. In the early stage of gas hydrate formation, the increase of residual water salinity may lead to a decrease in resistivity; in the late stage, the loss of water connectivity leads to a continuous increase in resistivity. In summary, fractures, rock composition, temperature, and gas hydrate are important factors influencing formation resistivity. This study is helpful for more accurate evaluation of gas hydrate from resistivity logs. Acknowledgment: We acknowledge the financial support of the National Special Program for Gas Hydrate Exploration and Test-production (GZH201400302).

  16. Analysis of ILRS Site Ties

    NASA Astrophysics Data System (ADS)

    Husson, V. S.; Long, J. L.; Pearlman, M.

    2001-12-01

    By the end of 2000, 94% of ILRS stations had completed station and site information forms (i.e., site logs). These forms contain six categories of information: site identifiers, contact information, approximate coordinates, system configuration history, system ranging capabilities, and local survey ties. The ILRS Central Bureau, in conjunction with the ILRS Networks and Engineering Working Group, has developed procedures to quality-control site log contents. Part of this verification entails data integrity checks of local site ties, which are the primary focus of this paper. Local survey ties are critical to the combination of the space geodetic network coordinate solutions (i.e., GPS, SLR, VLBI, DORIS) of the International Terrestrial Reference Frame (ITRF). Approximately 90% of active SLR sites are collocated with at least one other space geodetic technique. The process used to verify these SLR ties at collocated sites is identical to the approach used in ITRF2000. Local vectors (X, Y, Z) from each ILRS site log are differenced from the corresponding ITRF2000 position vectors (i.e., no transformations). These X, Y, and Z deltas are converted into North, East, and Up. Any delta in any component larger than 5 millimeters is flagged for investigation. In the absence of ITRF2000 SLR positions, CSR positions were used. To further enhance this comparison and to fill gaps in information, local ties contained in site logs from the other space geodetic services (i.e., IGS, IVS, IDS) were used in addition to the ITRF2000 ties. Case studies of two collocated sites (McDonald/Ft. Davis and Hartebeesthoek) will be explored in depth. Recommendations on how local site surveys should be conducted and how this information should be managed will also be presented.

  17. Strategic Methodologies in Public Health Cost Analyses.

    PubMed

    Whittington, Melanie; Atherly, Adam; VanRaemdonck, Lisa; Lampe, Sarah

    The National Research Agenda for Public Health Services and Systems Research states the need for research to determine the cost of delivering public health services in order to assist the public health system in communicating financial needs to decision makers, partners, and health reform leaders. The objective of this analysis is to compare 2 cost estimation methodologies, public health manager estimates of employee time spent and activity logs completed by public health workers, to understand to what degree manager surveys could be used in lieu of more time-consuming and burdensome activity logs. Employees recorded their time spent on communicable disease surveillance for a 2-week period using an activity log. Managers then estimated time spent by each employee on a manager survey. Robust and ordinary least squares regression was used to measure the agreement between the time estimated by the manager and the time recorded by the employee. The 2 outcomes for this study included time recorded by the employee on the activity log and time estimated by the manager on the manager survey. This study was conducted in local health departments in Colorado. Forty-one Colorado local health departments (82%) agreed to participate. Seven of the 8 models showed that managers underestimate their employees' time, especially for activities on which an employee spent little time. Manager surveys can best estimate time for time-intensive activities, such as total time spent on a core service or broad public health activity, and yet are less precise when estimating discrete activities. When Public Health Services and Systems Research researchers and health departments are conducting studies to determine the cost of public health services, there are many situations in which managers can closely approximate the time required and produce a relatively precise approximation of cost without as much time investment by practitioners.

  18. Addition to thermized milk of Lactococcus lactis subsp. cremoris M104, a wild, novel nisin a-producing strain, replaces the natural antilisterial activity of the autochthonous raw milk microbiota reduced by thermization.

    PubMed

    Lianou, Alexandra; Samelis, John

    2014-08-01

    Recent research has shown that mild milk thermization treatments routinely used in traditional Greek cheese production are effective at inactivating Listeria monocytogenes and other pathogenic or undesirable bacteria, but they also inactivate a great part of the autochthonous antagonistic microbiota of raw milk. Therefore, in this study, the antilisterial activity of raw or thermized (63°C, 30 s) milk in the presence or absence of Lactococcus lactis subsp. cremoris M104, a wild, novel, nisin A-producing (Nis-A+) raw milk isolate, was assessed. Bulk milk samples were taken from a local cheese plant before or after thermization and were inoculated with a five-strain cocktail of L. monocytogenes (approximately 4 log CFU/ml) or with the cocktail, as above, plus the Nis-A+ strain (approximately 6 log CFU/ml) as a bioprotective culture. Heat-sterilized (121°C, 5 min) raw milk inoculated with L. monocytogenes was used as a control treatment. All milk samples were incubated at 37°C for 6 h and then at 18°C for an additional 66 h. L. monocytogenes grew abundantly (>8 log CFU/ml) in heat-sterilized milk, whereas its growth was completely inhibited in all raw milk samples. Conversely, in thermized milk, L. monocytogenes increased by 2 log CFU/ml in the absence of strain M104, whereas its growth was completely inhibited in the presence of strain M104. Furthermore, nisin activity was detected only in milk samples inoculated with strain M104. Thus, postthermal supplementation of thermized bulk milk with bioprotective L. lactis subsp. cremoris cultures replaces the natural antilisterial activity of raw milk reduced by thermization.

  19. A Fast Algorithm for the Convolution of Functions with Compact Support Using Fourier Extensions

    DOE PAGES

    Xu, Kuan; Austin, Anthony P.; Wei, Ke

    2017-12-21

    In this paper, we present a new algorithm for computing the convolution of two compactly supported functions. The algorithm approximates the functions to be convolved using Fourier extensions and then uses the fast Fourier transform to efficiently compute Fourier extension approximations to the pieces of the result. The complexity of the algorithm is O(N(log N)^2), where N is the number of degrees of freedom used in each of the Fourier extensions.
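
    For orientation, the discrete analogue of this idea is classic FFT-based linear convolution of two compactly supported sequences, sketched below in Python; the paper's contribution is the harder continuous case via Fourier extensions, which this sketch does not implement.

        import numpy as np

        # Sketch: FFT-based linear convolution of two finite sequences,
        # O(N log N); zero-pad to the support of the result.
        def fft_convolve(f, g):
            n = len(f) + len(g) - 1                  # support of the result
            nfft = 1 << (n - 1).bit_length()         # next power of two
            return np.fft.irfft(np.fft.rfft(f, nfft) * np.fft.rfft(g, nfft),
                                nfft)[:n]

        print(fft_convolve([1.0, 2.0, 3.0], [0.5, 0.5]))  # matches np.convolve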

  20. A Fast Algorithm for the Convolution of Functions with Compact Support Using Fourier Extensions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Kuan; Austin, Anthony P.; Wei, Ke

    In this paper, we present a new algorithm for computing the convolution of two compactly supported functions. The algorithm approximates the functions to be convolved using Fourier extensions and then uses the fast Fourier transform to efficiently compute Fourier extension approximations to the pieces of the result. The complexity of the algorithm is O(N(log N)^2), where N is the number of degrees of freedom used in each of the Fourier extensions.

  1. Optimised method to estimate octanol water distribution coefficient (logD) in a high throughput format.

    PubMed

    Low, Ying Wei Ivan; Blasco, Francesca; Vachaspati, Prakash

    2016-09-20

    Lipophilicity is one of the molecular properties assessed in early drug discovery. Direct measurement of the octanol-water distribution coefficient (logD) requires an analytical method with a large dynamic range or multistep dilutions, as the analyte's concentrations span several orders of magnitude. In addition, the water/buffer and octanol phases have very different polarities, which can lead to matrix effects that alter the LC-MS response and yield erroneous logD values. Most compound libraries use DMSO stocks, as this greatly reduces the sample requirement, but the presence of DMSO has been shown to underestimate the lipophilicity of the analyte. The present work describes the development of an optimised shake-flask logD method using a deep-well 96-well plate that addresses the issues related to matrix effects, DMSO concentration, and incubation conditions, and that is amenable to high throughput. Our results indicate that equilibrium can be achieved within 30 min by flipping the plate on its side, whereas even 0.5% DMSO is not tolerated in the assay. This study uses the matched-matrix concept to minimise errors in analysing the two phases, buffer and octanol, by LC-MS.
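
    The arithmetic behind the readout is a single ratio. The sketch below assumes matched matrices (so the LC-MS response factors cancel) and uses made-up peak areas, purely to show the computation; it is not the paper's data.

        import math

        # Sketch: shake-flask logD from matched-matrix LC-MS responses.
        # With matched matrices the response factors cancel, so logD reduces
        # to log10 of the octanol/buffer peak-area ratio.
        area_octanol, area_buffer = 8.4e6, 1.2e5     # hypothetical peak areas
        log_d = math.log10(area_octanol / area_buffer)
        print(f"logD = {log_d:.2f}")                 # ~1.85 for these numbers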

  2. Diversity of microbiota found in coffee processing wastewater treatment plant.

    PubMed

    Pires, Josiane Ferreira; Cardoso, Larissa de Souza; Schwan, Rosane Freitas; Silva, Cristina Ferreira

    2017-11-13

    The cultivable microbiota present in a coffee semi-dry processing wastewater treatment plant (WTP) was identified. Thirty-two operational taxonomic units (OTUs) were detected: 16 bacteria, 11 yeasts, and 4 filamentous fungi. Bacteria dominated the microbial population (11.61 log CFU mL(-1)) and presented the highest total diversity index in the WTP aerobic stage (Shannon = 1.94 and Simpson = 0.81). The most frequent bacterial species were Enterobacter asburiae, Sphingobacterium griseoflavum, Chryseobacterium bovis, Serratia marcescens, Corynebacterium flavescens, Acetobacter orientalis and Acetobacter indonesiensis; these showed the largest total bacterial populations in the WTP, with approximately 10 log CFU mL(-1). Yeasts were present at 7 log CFU mL(-1) of viable cells, with Hanseniaspora uvarum, Wickerhamomyces anomalus, Torulaspora delbrueckii, Saturnispora gosingensis, and Kazachstania gamospora being the prevalent species. Filamentous fungi were found at 6 log CFU mL(-1), with Fusarium oxysporum the most populous species. The identified species have the potential to act as a biological treatment in the WTP, and their application for this purpose merits further study.

  3. Inactivation of indigenous coliform bacteria in unfiltered surface water by ultraviolet light.

    PubMed

    Cantwell, Raymond E; Hofmann, Ron

    2008-05-01

    This study examined the potential for naturally occurring particles to protect indigenous coliform from ultraviolet (UV) disinfection in four surface waters. Tailing in the UV dose-response curve of the bacteria was observed in 3 of the 4 water samples after 1.3-2.6-log of log-linear inactivation, implying particle-related protection. The impact of particles was confirmed by comparing coliform UV inactivation data for parallel filtered (11 microm pore-size nylon filters) and unfiltered surface water. In samples from the Grand River (UVT: 65%/cm; 5.4 nephelometric turbidity units (NTU)) and the Rideau Canal (UVT: 60%/cm; 0.84 NTU), a limit of approximately 2.5 log inactivation was achieved in the unfiltered samples for a UV dose of 20 mJ/cm2 while both the filtered samples exhibited >3.4-log inactivation of indigenous coliform bacteria. The results suggest that particles as small as 11 microm, naturally found in surface water with low turbidity (<3NTU), are able to harbor indigenous coliform bacteria and offer protection from low-pressure UV light.

  4. Approximate matching of regular expressions.

    PubMed

    Myers, E W; Miller, W

    1989-01-01

    Given a sequence A and regular expression R, the approximate regular expression matching problem is to find a sequence matching R whose optimal alignment with A is the highest scoring of all such sequences. This paper develops an algorithm to solve the problem in time O(MN), where M and N are the lengths of A and R. Thus, the time requirement is asymptotically no worse than for the simpler problem of aligning two fixed sequences. Our method is superior to an earlier algorithm by Wagner and Seiferas in several ways. First, it treats real-valued costs, in addition to integer costs, with no loss of asymptotic efficiency. Second, it requires only O(N) space to deliver just the score of the best alignment. Finally, its structure permits implementation techniques that make it extremely fast in practice. We extend the method to accommodate gap penalties, as required for typical applications in molecular biology, and further refine it to search for substrings of A that strongly align with a sequence in R, as required for typical database searches. We also show how to deliver an optimal alignment between A and R in only O(N + log M) space using O(MN log M) time. Finally, an O(MN(M + N) + N^2 log N) time algorithm is presented for alignment scoring schemes where the cost of a gap is an arbitrary increasing function of its length.
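
    For readers unfamiliar with the baseline, here is the O(MN)-time, O(N)-space score-only alignment recurrence for two fixed sequences that the paper generalizes to regular expressions (the regex version threads the same recurrence through the automaton's states); the scoring values are arbitrary illustrations.

        # Sketch: score-only sequence alignment in O(MN) time and O(N) space.
        def align_score(a, b, match=1, mismatch=-1, gap=-2):
            prev = [j * gap for j in range(len(b) + 1)]   # empty prefix of a
            for i, ca in enumerate(a, 1):
                cur = [i * gap]                           # a-prefix vs empty b
                for j, cb in enumerate(b, 1):
                    cur.append(max(
                        prev[j - 1] + (match if ca == cb else mismatch),
                        prev[j] + gap,                    # gap in b
                        cur[j - 1] + gap))                # gap in a
                prev = cur
            return prev[-1]

        print(align_score("ACGT", "AGT"))                 # 3 matches, 1 gap -> 1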

  5. Improvement of Reynolds-Stress and Triple-Product Lag Models

    NASA Technical Reports Server (NTRS)

    Olsen, Michael E.; Lillard, Randolph P.

    2017-01-01

    The Reynolds-stress and triple-product Lag models were created with a normal stress distribution defined by a 4:3:2 ratio of streamwise, spanwise, and wall-normal stresses, and a shear-stress ratio of -u'w' = 0.3k in the log layer region of high Reynolds number flat plate flow. With the normal stresses summing to 2k, this gives R11 = (4/9)(2k) = 8k/9 and hence R11(+) = 4/((9/2)(0.3)) ≈ 2.96. More recent measurements show a more complex picture of the log layer region at high Reynolds numbers. A first cut at improving these models, along with the direction for future refinements, is described. Comparison with recent high Reynolds number data shows areas where further work is needed, but also shows that inclusion of the modeled turbulent transport terms improves the prediction where they influence the solution. Additional work is needed to make the model better match experiment, but there is significant improvement in many of the details of the log layer behavior.

  6. Accounting for measurement error in log regression models with applications to accelerated testing.

    PubMed

    Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M

    2018-01-01

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
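
    Iteratively Re-weighted Least Squares itself is a short loop: refit weighted least squares with weights recomputed from the current residuals until the coefficients stabilize. The Python sketch below uses a generic inverse-squared-residual weighting as a stand-in; the paper's measurement-error-specific weights would replace that one line.

        import numpy as np

        # Sketch of IRLS with a generic residual-based weighting scheme
        # (a stand-in, not the paper's measurement-error weights).
        def irls(X, y, n_iter=20):
            w = np.ones(len(y))
            beta = np.zeros(X.shape[1])
            for _ in range(n_iter):
                sw = np.sqrt(w)
                beta_new, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y,
                                               rcond=None)
                w = 1.0 / (1e-8 + (y - X @ beta_new) ** 2)  # reweight
                if np.allclose(beta, beta_new, atol=1e-8):
                    break
                beta = beta_new
            return beta

        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(200), rng.uniform(0, 1, 200)])
        y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(200)
        print(irls(X, y))                                   # approx [1.0, 2.0]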

  7. Distribution of Animal Drugs between Skim Milk and Milk Fat Fractions in Spiked Whole Milk: Understanding the Potential Impact on Commercial Milk Products.

    PubMed

    Hakk, Heldur; Shappell, Nancy W; Lupton, Sara J; Shelver, Weilin L; Fanaselle, Wendy; Oryang, David; Yeung, Chi Yuen; Hoelzer, Karin; Ma, Yinqing; Gaalswyk, Dennis; Pouillot, Régis; Van Doren, Jane M

    2016-01-13

    Seven animal drugs [penicillin G (PENG), sulfadimethoxine (SDMX), oxytetracycline (OTET), erythromycin (ERY), ketoprofen (KETO), thiabendazole (THIA), and ivermectin (IVR)] were used to evaluate the drug distribution between milk fat and skim milk fractions of cow milk. More than 90% of the radioactivity was distributed into the skim milk fraction for ERY, KETO, OTET, PENG, and SDMX, approximately 80% for THIA, and 13% for IVR. The distribution of drug between milk fat and skim milk fractions was significantly correlated with the drug's lipophilicity (partition coefficient, log P, or distribution coefficient, log D, which includes ionization). Data were fit with linear mixed effects models; the best fit was obtained within this data set with log D versus observed drug distribution ratios. These candidate empirical models can assist in predicting the distribution and concentration of these drugs in a variety of milk and milk products.

  8. The algorithm for automatic detection of the calibration object

    NASA Astrophysics Data System (ADS)

    Artem, Kruglov; Irina, Ugfeld

    2017-06-01

    The problem of automatic image calibration is considered in this paper. The most challenging task in automatic calibration is proper detection of the calibration object. Solving this problem requires digital image processing methods and algorithms such as morphology, filtering, edge detection, and shape approximation. The step-by-step development of the algorithm and its adaptation to the specific conditions of log cuts in the image background is presented. Testing of the automatic calibration module was carried out under the production conditions of a logging enterprise. In these tests, the calibration object was isolated automatically in 86.1% of cases on average, with no type 1 errors. The algorithm was implemented in the automatic calibration module within mobile software for log deck volume measurement.
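
    A generic version of such a pipeline (blur, edge detection, contour extraction, shape test) can be sketched with OpenCV as below. This is our illustration of the general approach, not the authors' algorithm: the thresholds, minimum area, and circularity test are guesses, and a round calibration disc is assumed.

        import cv2
        import numpy as np

        # Sketch: find a round calibration disc among log cuts by filtering,
        # edge detection, and a circularity test on contours.
        def find_calibration_disc(image_bgr):
            gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
            blurred = cv2.GaussianBlur(gray, (9, 9), 2)
            edges = cv2.Canny(blurred, 50, 150)
            contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            best = None
            for c in contours:
                area, perim = cv2.contourArea(c), cv2.arcLength(c, True)
                if perim == 0 or area < 500:          # skip tiny/degenerate
                    continue
                circularity = 4 * np.pi * area / perim ** 2  # 1.0 = circle
                if circularity > 0.8 and (best is None
                                          or area > cv2.contourArea(best)):
                    best = c
            return best                               # contour or None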

  9. Improved one-dimensional area law for frustration-free systems

    NASA Astrophysics Data System (ADS)

    Arad, Itai; Landau, Zeph; Vazirani, Umesh

    2012-05-01

    We present a new proof of the 1D area law for frustration-free systems with a constant gap, which exponentially improves the entropy bound in Hastings' 1D area law and which is tight to within a polynomial factor. For particles of dimension d, spectral gap ε > 0, and interaction strength at most J, our entropy bound is S_1D ≤ O(1)·X^3 log^8 X, where X := (J log d)/ε. Our proof is completely combinatorial, combining the detectability lemma with basic tools from approximation theory. In higher dimensions, when the bipartitioning area is |∂L|, we use additional local structure in the proof and show that S ≤ O(1)·|∂L|^2 log^6 |∂L| · X^3 log^8 X. This is at the cusp of being nontrivial in the 2D case, in the sense that any further improvement would yield a subvolume law.

  10. Standardized likelihood ratio test for comparing several log-normal means and confidence interval for the common mean.

    PubMed

    Krishnamoorthy, K; Oral, Evrim

    2017-12-01

    Standardized likelihood ratio test (SLRT) for testing the equality of means of several log-normal distributions is proposed. The properties of the SLRT and an available modified likelihood ratio test (MLRT) and a generalized variable (GV) test are evaluated by Monte Carlo simulation and compared. Evaluation studies indicate that the SLRT is accurate even for small samples, whereas the MLRT could be quite liberal for some parameter values, and the GV test is in general conservative and less powerful than the SLRT. Furthermore, a closed-form approximate confidence interval for the common mean of several log-normal distributions is developed using the method of variance estimate recovery, and compared with the generalized confidence interval with respect to coverage probabilities and precision. Simulation studies indicate that the proposed confidence interval is accurate and better than the generalized confidence interval in terms of coverage probabilities. The methods are illustrated using two examples.
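
    As background for the interval construction, the classic Cox approximate confidence interval for a single log-normal mean is sketched below in Python; it is a simpler relative of the paper's MOVER-based interval for the common mean of several samples, not a reimplementation of it.

        import numpy as np
        from scipy import stats

        # Sketch: Cox approximate CI for the mean exp(mu + sigma^2/2) of a
        # single log-normal sample.
        def cox_ci_lognormal_mean(x, level=0.95):
            y = np.log(np.asarray(x))
            n, ybar, s2 = y.size, y.mean(), y.var(ddof=1)
            z = stats.norm.ppf(0.5 + level / 2)
            half = z * np.sqrt(s2 / n + s2 ** 2 / (2 * (n - 1)))
            return np.exp(ybar + s2 / 2 - half), np.exp(ybar + s2 / 2 + half)

        x = np.random.default_rng(0).lognormal(1.0, 0.5, size=40)  # fake sample
        print(cox_ci_lognormal_mean(x))            # CI for E[X], not exp(mu)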

  11. Mechanisms, predictors, and trends of electrical failure of Riata leads.

    PubMed

    Cheung, Jim W; Al-Kazaz, Mohamed; Thomas, George; Liu, Christopher F; Ip, James E; Bender, Seth R; Siddiqi, Faisal K; Markowitz, Steven M; Lerman, Bruce B

    2013-10-01

    Riata and Riata ST implantable cardioverter-defibrillator leads have been shown to be prone to structural and electrical failure. To determine predictors, mechanisms, and temporal patterns of Riata/ST lead electrical failure, all 314 patients who underwent Riata/ST lead implantation at our institution with at least 90 days of follow-up were studied. Kaplan-Meier analysis of lead survival was performed, and results from the returned-product analysis of explanted leads with electrical lead failure were recorded. During a median follow-up of 4.1 years, the Riata lead electrical failure rate was 6.6%. The rate of externalized conductors among failed leads was 57%. The engineering analysis of 10 explanted leads revealed 5 (50%) leads with electrical failure owing to breach of the ethylene tetrafluoroethylene conductor coating. Female gender (hazard ratio 2.7; 95% confidence interval 1.1-6.7; P = .04) and age (hazard ratio 0.95; 95% confidence interval 0.92-0.97; P < .001) were multivariate predictors of lead failure. Using log-log analysis, we noted that the rate of Riata lead failure initially increased as a power law with exponent 2.1, but leads surviving past 4 years had a linear pattern of lead failure with a power of 1.0. Younger age and female gender are independent predictors of Riata lead failure. Loss of integrity of conductor cables with ethylene tetrafluoroethylene coating is an important mode of electrical failure of the Riata lead. Further study of Riata lead failure trends is warranted to guide lead management.

  12. Confined compressive strength analysis can improve PDC bit selection. [Polycrystalline Diamond Compact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fabain, R.T.

    1994-05-16

    A rock strength analysis program, through intensive log analysis, can quantify rock hardness in terms of confined compressive strength to identify intervals suited for drilling with polycrystalline diamond compact (PDC) bits. Additionally, knowing the confined compressive strength helps determine the optimum PDC bit for the intervals. Computing rock strength as confined compressive strength can more accurately characterize a rock's actual hardness downhole than other methods. The information can be used to improve bit selections and to help adjust drilling parameters to reduce drilling costs. Empirical data compiled from numerous field strength analyses have provided a guide to selecting PDC drill bits. A computer analysis program has been developed to aid in PDC bit selection. The program more accurately defines rock hardness in terms of confined strength, which approximates the in situ rock hardness downhole. Unconfined compressive strength is rock hardness at atmospheric pressure. The program uses sonic and gamma ray logs as well as numerous input data from mud logs. Within the range of lithologies for which the program is valid, rock hardness can be determined with improved accuracy. The program's output is typically graphed in a log format displaying raw data traces from well logs, computer-interpreted lithology, the calculated values of confined compressive strength, and various optional rock mechanic outputs.
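
    As a hedged sketch of the confined-strength idea, one relation often quoted in the drilling literature (assumed here for illustration; not necessarily the article's correlation) adds the effect of differential (confining) pressure to the unconfined strength via the rock's internal friction angle:

    ```python
    import math

    def confined_compressive_strength(ucs_psi, dp_psi, friction_angle_deg):
        """Illustrative relation (assumption, not the article's program):
        CCS = UCS + Dp + 2 * Dp * sin(FA) / (1 - sin(FA)),
        where Dp is the differential (confining) pressure and FA the
        rock's internal angle of friction."""
        s = math.sin(math.radians(friction_angle_deg))
        return ucs_psi + dp_psi + 2.0 * dp_psi * s / (1.0 - s)

    # Hypothetical interval: UCS 8,000 psi, 1,500 psi differential pressure
    print(confined_compressive_strength(8000.0, 1500.0, 30.0))  # 12500.0 psi
    ```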

  13. Influence of environmental factors on activity patterns of Incisitermes minor (Isoptera: Kalotermitidae) in naturally infested logs.

    PubMed

    Lewis, Vernard R; Leighton, Shawn; Tabuchi, Robin; Baldwin, James A; Haverty, Michael I

    2013-02-01

    Acoustic emission (AE) activity patterns were measured from seven loquat [Eriobotrya japonica (Thunb.) Lindl.] logs, five containing live western drywood termite [Incisitermes minor (Hagen)] infestations, and two without an active drywood termite infestation. AE activity, as well as temperature, were monitored every 3 min under unrestricted ambient conditions in a small wooden building, under unrestricted ambient conditions but in constant darkness, or in a temperature-controlled cabinet under constant darkness. Logs with active drywood termite infestations displayed similar diurnal cycles of AE activity that closely followed temperature, with a peak of AE activity late in the afternoon (1700-1800 hours). When light was excluded from the building, a circadian pattern continued and apparently was driven by temperature. When the seven logs were kept at a relatively constant temperature (approximately 23 +/- 0.9 degrees C) and constant darkness, the pattern of activity was closely correlated with temperature, even with minimal changes in temperature. Temperature is the primary driver of activity of these drywood termites, but the effects are different when temperature is increasing or decreasing. At constant temperature, AE activity was highly correlated with the number of termites in the logs. The possible implications of these findings on our understanding of drywood termite biology and how this information may affect inspections and posttreatment evaluations are discussed.

  14. Use of alternative carrier materials in AOAC Official Method 2008.05, efficacy of liquid sporicides against spores of Bacillus subtilis on a hard, nonporous surface, quantitative three-step method.

    PubMed

    Tomasino, Stephen F; Rastogi, Vipin K; Wallace, Lalena; Smith, Lisa S; Hamilton, Martin A; Pines, Rebecca M

    2010-01-01

    The quantitative Three-Step Method (TSM) for testing the efficacy of liquid sporicides against spores of Bacillus subtilis on a hard, nonporous surface (glass) was adopted as AOAC Official Method 2008.05 in May 2008. The TSM uses 5 x 5 x 1 mm coupons (carriers) upon which spores have been inoculated and which are introduced into liquid sporicidal agent contained in a microcentrifuge tube. Following exposure of inoculated carriers and neutralization, spores are removed from carriers in three fractions (gentle washing, fraction A; sonication, fraction B; and gentle agitation, fraction C). Liquid from each fraction is serially diluted and plated on a recovery medium for spore enumeration. The counts are summed over the three fractions to provide the density (viable spores per carrier), which is log10-transformed to arrive at the log density. The log reduction is calculated by subtracting the mean log density for treated carriers from the mean log density for control carriers. This paper presents a single-laboratory investigation conducted to evaluate the applicability of using two porous carrier materials (ceramic tile and untreated pine wood) and one alternative nonporous material (stainless steel). Glass carriers were included in the study as the reference material. Inoculated carriers were evaluated against three commercially available liquid sporicides (sodium hypochlorite, a combination of peracetic acid and hydrogen peroxide, and glutaraldehyde), each at two levels of presumed efficacy (medium and high) to provide data for assessing the responsiveness of the TSM. Three coupons of each material were evaluated across three replications at each level; three replications of a control were required. Even though all carriers were inoculated with approximately the same number of spores, the observed counts of recovered spores were consistently higher for the nonporous carriers. For control carriers, the mean log densities for the four materials ranged from 6.63 for wood to 7.14 for steel. The pairwise differences between mean log densities, except for glass minus steel, were statistically significant (P < 0.001). The repeatability standard deviations (Sr) for the mean control log density per test were similar for the four materials, ranging from 0.08 for wood to 0.13 for tile. Spore recovery from the carrier materials ranged from approximately 20 to 70%: 20% (pine wood), 40% (ceramic tile), 55% (glass), and 70% (steel). Although the percent spore recovery from pine wood was significantly lower than that from other materials, the performance data indicate that the TSM provides a repeatable and responsive test for determining the efficacy of liquid sporicides on both porous and nonporous materials.
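
    The log-reduction arithmetic is straightforward to reproduce; a minimal sketch with hypothetical per-fraction counts (fractions A, B, C as described above):

    ```python
    import numpy as np

    def log_density(fractions):
        """Sum viable spore counts over fractions A, B, C for one carrier,
        then log10-transform, per the convention described above."""
        return np.log10(sum(fractions))

    # Hypothetical counts (viable spores recovered per fraction), illustration only
    control = [log_density(c) for c in [(2e6, 5e6, 1e6), (3e6, 4e6, 2e6), (2e6, 6e6, 1e6)]]
    treated = [log_density(c) for c in [(1e3, 2e3, 5e2), (8e2, 3e3, 7e2), (1e3, 1e3, 1e3)]]

    # Log reduction = mean control log density - mean treated log density
    print(round(np.mean(control) - np.mean(treated), 2))
    ```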

  15. Contributions of Optical and Non-Optical Blur to Variation in Visual Acuity

    PubMed Central

    McAnany, J. Jason; Shahidi, Mahnaz; Applegate, Raymond A.; Zelkha, Ruth; Alexander, Kenneth R.

    2011-01-01

    Purpose To determine the relative contributions of optical and non-optical sources of intrinsic blur to variations in visual acuity (VA) among normally sighted subjects. Methods Best-corrected VA of sixteen normally sighted subjects was measured using briefly presented (59 ms) tumbling E optotypes that were either unblurred or blurred through convolution with Gaussian functions of different widths. A standard model of intrinsic blur was used to estimate each subject’s equivalent intrinsic blur (σint) and VA for the unblurred tumbling E (MAR0). For 14 subjects, a radially averaged optical point spread function due to higher-order aberrations was derived by Shack-Hartmann aberrometry and fit with a Gaussian function. The standard deviation of the best-fit Gaussian function defined optical blur (σopt). An index of non-optical blur (η) was defined as: 1-σopt/σint. A control experiment was conducted on 5 subjects to evaluate the effect of stimulus duration on MAR0 and σint. Results Log MAR0 for the briefly presented E was correlated significantly with log σint (r = 0.95, p < 0.01), consistent with previous work. However, log MAR0 was not correlated significantly with log σopt (r = 0.46, p = 0.11). For subjects with log MAR0 equivalent to approximately 20/20 or better, log MAR0 was independent of log η, whereas for subjects with larger log MAR0 values, log MAR0 was proportional to log η. The control experiment showed a statistically significant effect of stimulus duration on log MAR0 (p < 0.01) but a non-significant effect on σint (p = 0.13). Conclusions The relative contributions of optical and non-optical blur to VA varied among the subjects, and were related to the subject’s VA. Evaluating optical and non-optical blur may be useful for predicting changes in VA following procedures that improve the optics of the eye in patients with both optical and non-optical sources of VA loss. PMID:21460756
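
    The non-optical blur index is a one-line computation; a minimal sketch using the definition above (the numerical values are illustrative only):

    ```python
    def non_optical_blur_index(sigma_opt, sigma_int):
        """eta = 1 - sigma_opt / sigma_int, as defined above: eta near 0
        means intrinsic blur is mostly optical; eta near 1 means
        non-optical sources dominate."""
        return 1.0 - sigma_opt / sigma_int

    print(non_optical_blur_index(sigma_opt=0.5, sigma_int=2.0))  # 0.75 (illustrative)
    ```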

  16. Time-Lapse Measurement of Wellbore Integrity

    NASA Astrophysics Data System (ADS)

    Duguid, A.

    2017-12-01

    Well integrity is becoming more important as wells are used longer or repurposed. For CO2, shale gas, and other projects it has become apparent that wells represent the most likely unintended migration pathway for fluids out of the reservoir. Comprehensive logging programs have been employed to determine the condition of legacy wells in North America. These studies provide examples of assessment technologies. Logging programs have included pulsed neutron logging, ultrasonic well mapping, and cement bond logging. While these studies provide examples of what can be measured, they have only conducted a single round of logging and cannot show if the well has changed over time. Recent experience with time-lapse logging of three monitoring wells at a US Department of Energy sponsored CO2 project has shown the full value of similar tools. Time-lapse logging has shown that well integrity changes over time can be identified. It has also shown that the inclusion and location of monitoring technologies in the well and the choice of construction materials must be carefully considered. Two of the wells were approximately eight years old at the time of study; they were constructed with steel and fiberglass casing sections and had lines on the outside of the casing running to the surface. The third well was 68 years old when it was studied and was originally constructed as a production well. Repeat logs were collected six or eight years after initial logging. Time-lapse logging showed the evolution of the wells. The results identified locations where cement degraded over time and locations that showed little change. The ultrasonic well maps show clearly that the lines used to connect the monitoring technology to the surface are visible and have a local effect on cement isolation. Testing and sampling were conducted along with logging, providing insight into changes identified in the time-lapse log results. Point permeability testing was used to provide an in-situ point estimate of the cement's isolating capacity. Cased-hole sidewall cores in the steel and fiberglass casing sections allowed analysis of bulk cement and of the cement at the casing and formation interfaces. This presentation will cover how time-lapse logging was conducted, how the results may be applicable to other wells, and how monitoring well design may affect wellbore integrity.

  17. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso.

    PubMed

    Kong, Shengchun; Nan, Bin

    2014-01-01

    We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive the non-asymptotic oracle inequalities for the lasso penalized Cox regression using pointwise arguments to tackle the difficulties caused by lacking iid Lipschitz losses.
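
    A minimal sketch of the negative log partial likelihood (assuming no tied event times), which makes the non-iid coupling through risk sets explicit:

    ```python
    import numpy as np

    def neg_log_partial_likelihood(beta, X, time, event):
        """Cox negative log partial likelihood, no ties assumed.
        Each summand involves a log-sum-exp over the risk set
        {j : t_j >= t_i}, so summands are neither iid nor Lipschitz."""
        eta = X @ beta
        order = np.argsort(time)
        eta, event = eta[order], np.asarray(event)[order].astype(bool)
        # rev_cumlse[i] = log sum_{j >= i} exp(eta_j)  (risk-set denominator)
        rev_cumlse = np.logaddexp.accumulate(eta[::-1])[::-1]
        return -np.sum((eta - rev_cumlse)[event])

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    print(neg_log_partial_likelihood(np.zeros(3), X,
                                     rng.exponential(size=50),
                                     rng.integers(0, 2, size=50)))
    ```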

  18. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso

    PubMed Central

    Kong, Shengchun; Nan, Bin

    2013-01-01

    We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive the non-asymptotic oracle inequalities for the lasso penalized Cox regression using pointwise arguments to tackle the difficulties caused by lacking iid Lipschitz losses. PMID:24516328

  19. Empirical behavior of a world stock index from intra-day to monthly time scales

    NASA Astrophysics Data System (ADS)

    Breymann, W.; Lüthi, D. R.; Platen, E.

    2009-10-01

    Most of the papers that study the distributional and fractal properties of financial instruments focus on stock prices or foreign exchange rates. This typically leads to mixed results concerning the distributions of log-returns and some multi-fractal properties of exchange rates, stock prices, and regional indices. This paper uses a well diversified world stock index as the central object of analysis. Such an index approximates the growth optimal portfolio; as demonstrated under the benchmark approach, it is the ideal reference unit for studying basic securities. When denominating this world index in units of a given currency, one measures the movements of the currency against the entire market. This provides a least disturbed observation of the currency dynamics. In this manner, one can expect to disentangle, e.g., the superposition of the two currencies involved in an exchange rate. This benchmark approach to the empirical analysis of financial data allows us to establish remarkable stylized facts. Most important is the observation that the repeatedly documented multi-fractal appearance of financial time series is very weak and much less pronounced than the deviation of the mono-scaling properties from Brownian-motion type scaling. The generalized Hurst exponent H(2) assumes typical values between 0.55 and 0.6. Accordingly, autocorrelations of log-returns decay according to a power law, and the quadratic variation vanishes when going to vanishing observation time step size. Furthermore, one can identify the Student t distribution as the log-return distribution of a well-diversified world stock index for long time horizons when a long enough data series is used for estimation. The study of dependence properties, finally, reveals that jumps at the daily horizon originate primarily in the stock market, while at the 5 min horizon they originate in the foreign exchange market. The principal message of the empirical analysis is that there is evidence that a diffusion model without multi-scaling could reasonably well model the dynamics of a broadly diversified world stock index.
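
    A minimal sketch of estimating the generalized Hurst exponent H(2) from the scaling of second moments of log-price increments (illustrated on Brownian motion, which should give H(2) close to 0.5):

    ```python
    import numpy as np

    def generalized_hurst_h2(log_price, taus=range(1, 20)):
        """Estimate H(2) from second-moment scaling of log-price increments:
        E|x(t+tau) - x(t)|^2 ~ tau^(2*H(2))."""
        taus = np.array(list(taus))
        m2 = [np.mean((log_price[t:] - log_price[:-t]) ** 2) for t in taus]
        slope, _ = np.polyfit(np.log(taus), np.log(m2), 1)
        return slope / 2

    # Brownian motion should give H(2) close to 0.5
    rng = np.random.default_rng(0)
    print(generalized_hurst_h2(np.cumsum(rng.normal(size=100_000))))
    ```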

  20. Electrical Resistivity Tomography and Ground Penetrating Radar for locating buried petrified wood sites: a case study in the natural monument of the Petrified Forest of Evros, Greece

    NASA Astrophysics Data System (ADS)

    Vargemezis, George; Diamanti, Nectaria; Tsourlos, Panagiotis; Fikos, Ilias

    2014-05-01

    A geophysical survey was carried out in the Petrified Forest of Evros, the northernmost regional unit of Greece. This collection of petrified wood has an age of approximately 35 million years and it is the oldest in Greece (i.e., older than the well-known Petrified Forest of Lesvos island in the North Aegean Sea, which is possibly the largest of the petrified forests worldwide). Protection, development and maintenance projects still need to be carried out at the area despite fears regarding the forest's fate, since many petrified logs remain exposed both to weather conditions (leading to erosion) and to the public. This survey was conducted as part of a more extensive framework regarding the development and protection of this natural monument. Geophysical surveying has been chosen as a non-destructive investigation method since the area of application is both a natural ecosystem and part of cultural heritage. Along with electrical resistivity tomography (ERT), ground penetrating radar (GPR) surveys have been carried out for investigating possible locations of buried fossilized tree trunks. The geoelectrical sections derived from ERT data in combination with the GPR profiles provided a broad view of the subsurface. Two and three dimensional subsurface geophysical images of the surveyed area have been constructed, pointing out probable locations of petrified logs. Regarding ERT, petrified trunks have been detected as highly resistive bodies, while lower resistivity values were more related to the surrounding geological materials. GPR surveying has also indicated buried petrified log locations. As these two geophysical methods are affected in different ways by the subsurface conditions, the combined use of both techniques enhanced our ability to produce more reliable interpretations of the subsurface. After the completion of the geophysical investigations of this first stage, petrified trunks were revealed by subsequent excavation at indicated locations. Moreover, we identified possible buried petrified targets at locations yet to be excavated.

  1. Phylogenetic analyses suggest that diversification and body size evolution are independent in insects.

    PubMed

    Rainford, James L; Hofreiter, Michael; Mayhew, Peter J

    2016-01-08

    Skewed body size distributions and the high relative richness of small-bodied taxa are a fundamental property of a wide range of animal clades. The evolutionary processes responsible for generating these distributions are well described in vertebrate model systems but have yet to be explored in detail for other major terrestrial clades. In this study, we explore the macro-evolutionary patterns of body size variation across families of Hexapoda (insects and their close relatives), using recent advances in phylogenetic understanding, with an aim to investigate the link between size and diversity within this ancient and highly diverse lineage. The maximum, minimum and mean-log body lengths of hexapod families are all approximately log-normally distributed, consistent with previous studies at lower taxonomic levels, and contrasting with the skewed distributions typical of vertebrate groups. After taking phylogeny and within-tip variation into account, we find no evidence for a negative relationship between diversification rate and body size, suggesting decoupling of the forces controlling these two traits. Likelihood-based modeling of the log-mean body size identifies distinct processes operating within Holometabola and Diptera compared with other hexapod groups, consistent with accelerating rates of size evolution within these clades, while as a whole, hexapod body size evolution is found to be dominated by neutral processes including significant phylogenetic conservatism. Based on our findings, we suggest that the use of models derived from well-studied but atypical clades, such as vertebrates, may lead to misleading conclusions when applied to other major terrestrial lineages. Our results indicate that within hexapods, and within the limits of current systematic and phylogenetic knowledge, insect diversification is generally unfettered by size-biased macro-evolutionary processes, and that these processes over large timescales tend to converge on apparently neutral evolutionary processes. We also identify limitations in the available data within the clade and in modeling approaches for trees of higher taxa, whose resolution may collectively enhance our understanding of this key component of terrestrial ecosystems.

  2. The South’s timber industry - an assessment of timber product output and use, 2009

    Treesearch

    Tony G. Johnson; James W. Bentley; Mike Howell

    2011-01-01

    In 2009, industrial roundwood output from the South’s forests totaled 6.56 billion cubic feet, 20 percent less than in 2007. Pulpwood was the leading roundwood product at 3.43 billion cubic feet; saw logs ranked second at 2.26 billion cubic feet; veneer logs were third at 384.4 million cubic feet. Total receipts declined 21 percent to 6.55 billion cubic feet. Mill...

  3. Meta-Analysis of the Reduction of Norovirus and Male-Specific Coliphage Concentrations in Wastewater Treatment Plants.

    PubMed

    Pouillot, Régis; Van Doren, Jane M; Woods, Jacquelina; Plante, Daniel; Smith, Mark; Goblick, Gregory; Roberts, Christopher; Locas, Annie; Hajen, Walter; Stobo, Jeffrey; White, John; Holtzman, Jennifer; Buenaventura, Enrico; Burkhardt, William; Catford, Angela; Edwards, Robyn; DePaola, Angelo; Calci, Kevin R

    2015-07-01

    Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Wastewater treatment plant (WWTP) effluents impacting bivalve mollusk-growing areas are potential sources of NoV contamination. We have developed a meta-analysis that evaluates WWTP influent concentrations and log10 reductions of NoV genotype I (NoV GI; in numbers of genome copies per liter [gc/liter]), NoV genotype II (NoV GII; in gc/liter), and male-specific coliphage (MSC; in number of PFU per liter), a proposed viral surrogate for NoV. The meta-analysis included relevant data (2,943 measurements) reported in the scientific literature through September 2013 and previously unpublished surveillance data from the United States and Canada. Model results indicated that the mean WWTP influent concentration of NoV GII (3.9 log10 gc/liter; 95% credible interval [CI], 3.5, 4.3 log10 gc/liter) is larger than the value for NoV GI (1.5 log10 gc/liter; 95% CI, 0.4, 2.4 log10 gc/liter), with large variations occurring from one WWTP to another. For WWTPs with mechanical systems and chlorine disinfection, mean log10 reductions were -2.4 log10 gc/liter (95% CI, -3.9, -1.1 log10 gc/liter) for NoV GI, -2.7 log10 gc/liter (95% CI, -3.6, -1.9 log10 gc/liter) for NoV GII, and -2.9 log10 PFU per liter (95% CI, -3.4, -2.4 log10 PFU per liter) for MSCs. Comparable values for WWTPs with lagoon systems and chlorine disinfection were -1.4 log10 gc/liter (95% CI, -3.3, 0.5 log10 gc/liter) for NoV GI, -1.7 log10 gc/liter (95% CI, -3.1, -0.3 log10 gc/liter) for NoV GII, and -3.6 log10 PFU per liter (95% CI, -4.8, -2.4 PFU per liter) for MSCs. Within WWTPs, correlations exist between mean NoV GI and NoV GII influent concentrations and between the mean log10 reduction in NoV GII and the mean log10 reduction in MSCs. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  4. A Trust-Based Secure Routing Scheme Using the Traceback Approach for Energy-Harvesting Wireless Sensor Networks.

    PubMed

    Tang, Jiawei; Liu, Anfeng; Zhang, Jian; Xiong, Neal N; Zeng, Zhiwen; Wang, Tian

    2018-03-01

    The Internet of things (IoT) is composed of billions of sensing devices that are subject to threats stemming from increasing reliance on communications technologies. A Trust-Based Secure Routing (TBSR) scheme using the traceback approach is proposed to improve the security of data routing and maximize the use of available energy in Energy-Harvesting Wireless Sensor Networks (EHWSNs). The main contributions of TBSR are: (a) the source nodes send data and notification to sinks through disjoint paths, separately; in such a mechanism, the data and notification can be verified independently to ensure their security; (b) furthermore, the data and notification adopt a dynamic probability of marking and logging approach during the routing. Therefore, when attacked, the network will adopt the traceback approach to locate and clear malicious nodes to ensure security. The probability of marking is determined based on the level of battery remaining; when nodes harvest more energy, the probability of marking is higher, which can improve network security. This is because a higher probability of marking places more marked nodes on the data packet routing path, so the sink is more likely to trace back the routing path and locate malicious nodes from this notification. When data packets are routed again, they tend to bypass these malicious nodes, which makes the success rate of routing higher and leads to improved network security. When the battery level is low, the probability of marking is decreased, which saves energy. For logging, when the battery level is high, the network adopts a larger probability of marking and a smaller probability of logging to transmit notification to the sink, which reserves enough storage space to meet the storage demand for periods when the battery is low; when the battery level is low, increasing the probability of logging can reduce energy consumption. Once the remaining battery level is high enough, nodes send the previously logged notifications to the sink. Compared with past solutions, our results indicate that the performance of the TBSR scheme has been improved comprehensively; it can effectively increase the quantity of notifications received by the sink by 20%, increase energy efficiency by 11%, reduce the maximum storage capacity needed by nodes by 33.3% and improve the success rate of routing by approximately 16.30%.
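
    A hedged sketch of the battery-dependent marking/logging rule described above; the abstract gives only the monotone trends, so the linear form and the numeric bounds used here are assumptions:

    ```python
    def marking_and_logging_probability(battery_level):
        """Hypothetical monotone rule for the TBSR idea described above:
        more remaining battery -> mark more (better traceback), log less;
        less battery -> mark less, log more. The linear form and the
        0.2-0.8 bounds are assumptions, not from the paper."""
        b = min(max(battery_level, 0.0), 1.0)   # battery fraction in [0, 1]
        p_mark = 0.2 + 0.6 * b
        p_log = 0.8 - 0.6 * b
        return p_mark, p_log

    print(marking_and_logging_probability(0.9))  # high battery: mark more, log less
    ```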

  5. A Trust-Based Secure Routing Scheme Using the Traceback Approach for Energy-Harvesting Wireless Sensor Networks

    PubMed Central

    Tang, Jiawei; Zhang, Jian; Zeng, Zhiwen; Wang, Tian

    2018-01-01

    The Internet of things (IoT) is composed of billions of sensing devices that are subject to threats stemming from increasing reliance on communications technologies. A Trust-Based Secure Routing (TBSR) scheme using the traceback approach is proposed to improve the security of data routing and maximize the use of available energy in Energy-Harvesting Wireless Sensor Networks (EHWSNs). The main contributions of TBSR are: (a) the source nodes send data and notification to sinks through disjoint paths, separately; in such a mechanism, the data and notification can be verified independently to ensure their security; (b) furthermore, the data and notification adopt a dynamic probability of marking and logging approach during the routing. Therefore, when attacked, the network will adopt the traceback approach to locate and clear malicious nodes to ensure security. The probability of marking is determined based on the level of battery remaining; when nodes harvest more energy, the probability of marking is higher, which can improve network security. This is because a higher probability of marking places more marked nodes on the data packet routing path, so the sink is more likely to trace back the routing path and locate malicious nodes from this notification. When data packets are routed again, they tend to bypass these malicious nodes, which makes the success rate of routing higher and leads to improved network security. When the battery level is low, the probability of marking is decreased, which saves energy. For logging, when the battery level is high, the network adopts a larger probability of marking and a smaller probability of logging to transmit notification to the sink, which reserves enough storage space to meet the storage demand for periods when the battery is low; when the battery level is low, increasing the probability of logging can reduce energy consumption. Once the remaining battery level is high enough, nodes send the previously logged notifications to the sink. Compared with past solutions, our results indicate that the performance of the TBSR scheme has been improved comprehensively; it can effectively increase the quantity of notifications received by the sink by 20%, increase energy efficiency by 11%, reduce the maximum storage capacity needed by nodes by 33.3% and improve the success rate of routing by approximately 16.30%. PMID:29494561

  6. The SAMI Galaxy Survey: gas content and interaction as the drivers of kinematic asymmetry

    NASA Astrophysics Data System (ADS)

    Bloom, J. V.; Croom, S. M.; Bryant, J. J.; Schaefer, A. L.; Bland-Hawthorn, J.; Brough, S.; Callingham, J.; Cortese, L.; Federrath, C.; Scott, N.; van de Sande, J.; D'Eugenio, F.; Sweet, S.; Tonini, C.; Allen, J. T.; Goodwin, M.; Green, A. W.; Konstantopoulos, I. S.; Lawrence, J.; Lorente, N.; Medling, A. M.; Owers, M. S.; Richards, S. N.; Sharp, R.

    2018-05-01

    In order to determine the causes of kinematic asymmetry in the Hα gas in the SAMI (Sydney-AAO Multi-object IFS) Galaxy Survey sample, we investigate the comparative influences of environment and intrinsic properties of galaxies on perturbation. We use spatially resolved Hα velocity fields from the SAMI Galaxy Survey to quantify kinematic asymmetry (v_asym) in nearby galaxies and environmental and stellar mass data from the Galaxy And Mass Assembly survey. We find that local environment, measured as distance to nearest neighbour, is inversely correlated with kinematic asymmetry for galaxies with log (M*/M⊙) > 10.0, but there is no significant correlation for galaxies with log (M*/M⊙) < 10.0. Moreover, low-mass galaxies [log (M*/M⊙) < 9.0] have greater kinematic asymmetry at all separations, suggesting a different physical source of asymmetry is important in low-mass galaxies. We propose that secular effects derived from gas fraction and gas mass may be the primary causes of asymmetry in low-mass galaxies. High gas fraction is linked to high σ_m/V (where σ_m is the Hα velocity dispersion and V the rotation velocity), which is strongly correlated with v_asym, and galaxies with log (M*/M⊙) < 9.0 have mean σ_m/V offset from the rest of the sample. Further, asymmetry as a fraction of dispersion decreases for galaxies with log (M*/M⊙) < 9.0. Gas mass and asymmetry are also inversely correlated in our sample. We propose that low gas masses in dwarf galaxies may lead to asymmetric distribution of gas clouds, leading to increased relative turbulence.

  7. Log-layer mismatch and modeling of the fluctuating wall stress in wall-modeled large-eddy simulations

    NASA Astrophysics Data System (ADS)

    Yang, Xiang I. A.; Park, George Ilhwan; Moin, Parviz

    2017-10-01

    Log-layer mismatch (LLM) refers to a chronic problem found in wall-modeled large-eddy simulation (WMLES) or detached-eddy simulation, where the modeled wall-shear stress deviates from the true one by approximately 15%. Many efforts have been made to resolve this mismatch. The often-used fixes, which are generally ad hoc, include modifying subgrid-scale stress models, adding a stochastic forcing, and moving the LES-wall-model matching location away from the wall. An analysis motivated by the integral wall-model formalism suggests that log-layer mismatch is resolved by the built-in physics-based temporal filtering. In this work we investigate in detail the effects of local filtering on log-layer mismatch. We show that both local temporal filtering and local wall-parallel filtering resolve log-layer mismatch without moving the LES-wall-model matching location away from the wall. Additionally, we look into the momentum balance in the near-wall region to provide an alternative explanation of how LLM occurs, one that does not necessarily rely on the numerical-error argument. While filtering resolves log-layer mismatch, the quality of the wall-shear stress fluctuations predicted by WMLES does not improve with our remedy. The wall-shear stress fluctuations are highly underpredicted due to the implied use of LES filtering. However, good agreement can be found when the WMLES data are compared to direct numerical simulation data filtered at the corresponding WMLES resolutions.
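
    As a sketch of local temporal filtering of the LES input fed to a wall model, here is a first-order exponential running filter; the filter form and time scale are assumptions for illustration, not the paper's formulation:

    ```python
    import numpy as np

    def temporal_filter(u_les, dt, t_filter):
        """First-order (exponential) running-time filter of the wall-model
        input signal; alpha -> 1 recovers the unfiltered signal."""
        alpha = dt / (t_filter + dt)
        u_f = np.empty_like(u_les)
        u_f[0] = u_les[0]
        for n in range(1, len(u_les)):
            u_f[n] = (1.0 - alpha) * u_f[n - 1] + alpha * u_les[n]
        return u_f

    # Noisy signal around a mean of 1.0, filtered over ~100 time steps
    rng = np.random.default_rng(0)
    u = 1.0 + 0.3 * rng.normal(size=1000)
    print(temporal_filter(u, dt=1e-3, t_filter=0.1)[-5:])
    ```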

  8. Disturbance impacts on understory plant communities of the Colorado Front Range

    Treesearch

    Paula J. Fornwalt

    2009-01-01

    Pinus ponderosa - Pseudotsuga menziesii (ponderosa pine - Douglas-fir) forests of the Colorado Front Range have experienced a range of disturbances since they were settled by European-Americans approximately 150 years ago, including settlement-era logging and domestic grazing, and more recently, wildfire. In this dissertation, I...

  9. Real-Time PCR Quantification Using A Variable Reaction Efficiency Model

    PubMed Central

    Platts, Adrian E.; Johnson, Graham D.; Linnemann, Amelia K.; Krawetz, Stephen A.

    2008-01-01

    Quantitative real-time PCR remains a cornerstone technique in gene expression analysis and sequence characterization. Despite the importance of the approach to experimental biology, the confident assignment of reaction efficiency to the early cycles of real-time PCR reactions remains problematic. Considerable noise may be generated where few cycles in the amplification are available to estimate peak efficiency. An alternate approach that uses data from beyond the log-linear amplification phase is explored with the aim of reducing noise and adding confidence to efficiency estimates. PCR reaction efficiency is regressed to estimate the per-cycle profile of an asymptotically departed peak efficiency, even when this is not closely approximated in the measurable cycles. The process can be repeated over replicates to develop a robust estimate of peak reaction efficiency. This leads to an estimate of the maximum reaction efficiency that may be considered primer-design specific. Using a series of biological scenarios, we demonstrate that this approach can provide an accurate estimate of initial template concentration. PMID:18570886

  10. Observed, unknown distributions of clinical chemical quantities should be considered to be log-normal: a proposal.

    PubMed

    Haeckel, Rainer; Wosniok, Werner

    2010-10-01

    The distributions of many quantities in laboratory medicine are considered to be Gaussian if they are symmetric, although, theoretically, a Gaussian distribution is not plausible for quantities that can attain only non-negative values. If a distribution is skewed, further specification of the type is required, which may be difficult to provide. Skewed (non-Gaussian) distributions found in clinical chemistry usually show only moderately large positive skewness (e.g., the log-normal and χ² distributions). The degree of skewness depends on the magnitude of the empirical biological variation (CV(e)), as demonstrated using the log-normal distribution. A Gaussian distribution with a small CV(e) (e.g., for plasma sodium) is very similar to a log-normal distribution with the same CV(e). In contrast, a relatively large CV(e) (e.g., plasma aspartate aminotransferase) leads to distinct differences between a Gaussian and a log-normal distribution. If the type of an empirical distribution is unknown, it is proposed that a log-normal distribution be assumed in such cases. This avoids distributional assumptions that are not plausible and does not contradict the observation that distributions with small biological variation look very similar to a Gaussian distribution.
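
    A minimal sketch of the point about biological variation: matching a log-normal to a given arithmetic mean and CV shows near-Gaussian symmetry for a small CV and distinct positive skewness for a larger one (the numeric values below are illustrative):

    ```python
    import numpy as np
    from scipy.stats import skew

    def lognormal_params_from_cv(mean, cv):
        """Log-normal (mu, sigma) matching a given arithmetic mean and CV:
        sigma^2 = ln(1 + CV^2), mu = ln(mean) - sigma^2 / 2."""
        sigma2 = np.log(1.0 + cv ** 2)
        return np.log(mean) - sigma2 / 2, np.sqrt(sigma2)

    rng = np.random.default_rng(0)
    for cv in (0.01, 0.30):   # small CV (sodium-like) vs larger CV (enzyme-like)
        mu, sigma = lognormal_params_from_cv(140.0, cv)
        print(cv, round(float(skew(rng.lognormal(mu, sigma, 200_000))), 2))
    ```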

  11. The Quasar Fraction in Low-Frequency Selected Complete Samples and Implications for Unified Schemes

    NASA Technical Reports Server (NTRS)

    Willott, Chris J.; Rawlings, Steve; Blundell, Katherine M.; Lacy, Mark

    2000-01-01

    Low-frequency radio surveys are ideal for selecting orientation-independent samples of extragalactic sources because the sample members are selected by virtue of their isotropic steep-spectrum extended emission. We use the new 7C Redshift Survey along with the brighter 3CRR and 6C samples to investigate the fraction of objects with observed broad emission lines - the 'quasar fraction' - as a function of redshift and of radio and narrow emission line luminosity. We find that the quasar fraction is more strongly dependent upon luminosity (both narrow line and radio) than it is on redshift. Above a narrow [OII] emission line luminosity of log10(L_[OII]/W) ≳ 35 [or radio luminosity log10(L_151/W Hz⁻¹ sr⁻¹) ≳ 26.5], the quasar fraction is virtually independent of redshift and luminosity; this is consistent with a simple unified scheme with an obscuring torus with a half-opening angle θ_trans ≈ 53°. For objects with less luminous narrow lines, the quasar fraction is lower. We show that this is not due to the difficulty of detecting lower-luminosity broad emission lines in a less luminous, but otherwise similar, quasar population. We discuss evidence which supports at least two probable physical causes for the drop in quasar fraction at low luminosity: (i) a gradual decrease in θ_trans and/or a gradual increase in the fraction of lightly-reddened (0 ≲ A_V ≲ 5) lines-of-sight with decreasing quasar luminosity; and (ii) the emergence of a distinct second population of low luminosity radio sources which, like M87, lack a well-fed quasar nucleus and may well lack a thick obscuring torus.
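
    Under the simple unified-scheme geometry mentioned above, the expected quasar fraction for randomly oriented sources follows from the torus half-opening angle; a minimal sketch:

    ```python
    import math

    def quasar_fraction(theta_trans_deg):
        """Fraction of randomly oriented sources viewed inside a torus of
        half-opening angle theta_trans: f = 1 - cos(theta_trans)."""
        return 1.0 - math.cos(math.radians(theta_trans_deg))

    print(round(quasar_fraction(53.0), 2))  # ~0.40 for the angle quoted above
    ```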

  12. Assessment of the hygienic performances of hamburger patty production processes.

    PubMed

    Gill, C O; Rahn, K; Sloan, K; McMullen, L M

    1997-05-20

    The hygienic conditions of the hamburger patties collected from three patty manufacturing plants and six retail outlets were examined. At each manufacturing plant a sample from newly formed, chilled patties and one from frozen patties were collected from each of 25 batches of patties selected at random. At three, two or one retail outlet, respectively, 25 samples from frozen, chilled or both frozen and chilled patties were collected at random. Each sample consisted of 30 g of meat obtained from five or six patties. Total aerobic, coliform and Escherichia coli counts per gram were enumerated for each sample. The mean log (x) and standard deviation (s) were calculated for the log10 values for each set of 25 counts, on the assumption that the distribution of counts approximated the log normal. A value for the log10 of the arithmetic mean (log A) was calculated for each set from the values of x and s. A χ² statistic was calculated for each set as a test of the assumption of the log normal distribution. The χ² statistic was calculable for 32 of the 39 sets. Four of the sets gave χ² values indicative of gross deviation from log normality. On inspection of those sets, distributions obviously differing from the log normal were apparent in two. Log A values for total, coliform and E. coli counts for chilled patties from manufacturing plants ranged from 4.4 to 5.1, 1.7 to 2.3 and 0.9 to 1.5, respectively. Log A values for frozen patties from manufacturing plants were between < 0.1 and 0.5 log10 units less than the equivalent values for chilled patties. Log A values for total, coliform and E. coli counts for frozen patties on retail sale ranged from 3.8 to 8.5, < 0.5 to 3.6 and < 0 to 1.9, respectively. The equivalent ranges for chilled patties on retail sale were 4.8 to 8.5, 1.8 to 3.7 and 1.4 to 2.7, respectively. The findings indicate that the general hygienic condition of hamburger patties could be improved by their being manufactured from only manufacturing beef of superior hygienic quality, and by the better management of chilled patties at retail outlets.
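
    Under the log-normal assumption stated above, log A follows from x and s in closed form; a minimal sketch (log A = x + (ln 10 / 2)·s², with x and s on the log10 scale):

    ```python
    import math

    def log_arithmetic_mean(xbar, s):
        """log A = xbar + (ln 10 / 2) * s^2, with xbar and s the mean and SD
        of the log10 counts, under the log-normal assumption."""
        return xbar + (math.log(10) / 2) * s ** 2

    print(round(log_arithmetic_mean(4.0, 0.6), 2))  # hypothetical x = 4.0, s = 0.6
    ```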

  13. Automated Detection of Selective Logging in Amazon Forests Using Airborne Lidar Data and Pattern Recognition Algorithms

    NASA Astrophysics Data System (ADS)

    Keller, M. M.; d'Oliveira, M. N.; Takemura, C. M.; Vitoria, D.; Araujo, L. S.; Morton, D. C.

    2012-12-01

    Selective logging, the removal of several valuable timber trees per hectare, is an important land use in the Brazilian Amazon and may degrade forests through long term changes in structure, loss of forest carbon and species diversity. Similar to deforestation, the annual area affected by selective logging has declined significantly in the past decade. Nonetheless, this land use affects several thousand km² per year in Brazil. We studied a 1000 ha area of the Antimary State Forest (FEA) in the State of Acre, Brazil (9.304°S, 68.281°W) that has a basal area of 22.5 m² ha⁻¹ and an above-ground biomass of 231 Mg ha⁻¹. Logging intensity was low, approximately 10 to 15 m³ ha⁻¹. We collected small-footprint airborne lidar data using an Optech ALTM 3100EA over the study area once each in 2010 and 2011. The study area contained both recent and older logging that used both conventional and technologically advanced logging techniques. Lidar return density averaged over 20 m⁻² for both collection periods with estimated horizontal and vertical precision of 0.30 and 0.15 m. A relative density model comparing returns from 0 to 1 m elevation to returns in the 1-5 m elevation range revealed the pattern of roads and skid trails. These patterns were confirmed by ground-based GPS survey. A GIS model of the road and skid network was built using lidar and ground data. We tested and compared two pattern recognition approaches used to automate logging detection. Both commercial eCognition segmentation and a Frangi filter algorithm identified the road and skid trail network when compared against the GIS model. We report on the effectiveness of these two techniques.
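
    A sketch of the relative density model described above; the exact ratio used by the authors is not given in the abstract, so the ground-fraction form below is an assumption:

    ```python
    import numpy as np

    def relative_density(heights, ground_band=(0.0, 1.0), veg_band=(1.0, 5.0)):
        """Fraction of returns in the 0-1 m band relative to the 0-1 m plus
        1-5 m bands; computed per grid cell, this highlights roads and skid
        trails (bare ground, little low vegetation)."""
        h = np.asarray(heights)
        n_ground = np.count_nonzero((h >= ground_band[0]) & (h < ground_band[1]))
        n_veg = np.count_nonzero((h >= veg_band[0]) & (h < veg_band[1]))
        return n_ground / (n_ground + n_veg) if (n_ground + n_veg) else np.nan

    print(relative_density([0.1, 0.4, 0.8, 2.0, 3.5]))  # 0.6: mostly near-ground hits
    ```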

  14. Solubility enhancement of dioxins and PCBs by surfactant monomers and micelles quantified with polymer depletion techniques.

    PubMed

    Schacht, Veronika J; Grant, Sharon C; Escher, Beate I; Hawker, Darryl W; Gaus, Caroline

    2016-06-01

    Partitioning of super-hydrophobic organic contaminants (SHOCs) to dissolved or colloidal materials such as surfactants can alter their behaviour by enhancing apparent aqueous solubility. Relevant partition constants are, however, challenging to quantify with reasonable accuracy. Partition constants to colloidal surfactants can be measured by introducing a polymer (PDMS) as third phase with known PDMS-water partition constant in combination with the mass balance approach. We quantified partition constants of PCBs and PCDDs (log KOW 5.8-8.3) between water and sodium dodecyl sulphate monomers (KMO) and micelles (KMI). A refined, recently introduced swelling-based polymer loading technique allowed highly precise (4.5-10% RSD) and fast (<24 h) loading of SHOCs into PDMS, and due to the miniaturisation of batch systems equilibrium was reached in <5 days for KMI and <3 weeks for KMO. SHOC losses to experimental surfaces were substantial (8-26%) in monomer solutions, but had a low impact on KMO (0.10-0.16 log units). Log KMO for PCDDs (4.0-5.2) were approximately 2.6 log units lower than respective log KMI, which ranged from 5.2 to 7.0 for PCDDs and 6.6-7.5 for PCBs. The linear relationship between log KMI and log KOW was consistent with more polar and moderately hydrophobic compounds. Apparent solubility increased with increasing hydrophobicity and was highest in micelle solutions. However, this solubility enhancement was also considerable in monomer solutions, up to 200 times for OCDD. Given the pervasive presence of surfactant monomers in typical field scenarios, these data suggest that low surfactant concentrations may be effective long-term facilitators for subsurface transport of SHOCs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Efficacy of chlorine, acidic electrolyzed water and aqueous chlorine dioxide solutions to decontaminate Escherichia coli O157:H7 from lettuce leaves.

    PubMed

    Keskinen, Lindsey A; Burke, Angela; Annous, Bassam A

    2009-06-30

    This study compared the efficacy of chlorine (20-200 ppm), acidic electrolyzed water (50 ppm chlorine, pH 2.6), acidified sodium chlorite (20-200 ppm chlorite ion concentration, Sanova), and aqueous chlorine dioxide (20-200 ppm chlorite ion concentration, TriNova) washes in reducing populations of Escherichia coli O157:H7 on artificially inoculated lettuce. Fresh-cut leaves of Romaine or Iceberg lettuce were inoculated by immersion in water containing E. coli O157:H7 (8 log CFU/ml) for 5 min and dried in a salad spinner. Leaves (25 g) were then washed for 2 min, immediately or following 24 h of storage at 4 degrees C. The washing treatments containing chlorite ion concentrations of 100 and 200 ppm were the most effective against E. coli O157:H7 populations on Iceberg lettuce, with log reductions as high as 1.25 log CFU/g and 1.05 log CFU/g for TriNova and Sanova wash treatments, respectively. All other wash treatments resulted in population reductions of less than 1 log CFU/g. Chlorine (200 ppm), TriNova, Sanova, and acidic electrolyzed water were all equally effective against E. coli O157:H7 on Romaine, with log reductions of approximately 1 log CFU/g. The 20 ppm chlorine wash was as effective as the deionized water wash in reducing populations of E. coli O157:H7 on Romaine and Iceberg lettuce. Scanning electron microscopy indicated that E. coli O157:H7 that was incorporated into biofilms or located in damaged lettuce tissue remained on the lettuce leaf, while individual cells on undamaged leaf surfaces were more likely to be washed away.

  16. Salvage logging, ecosystem processes, and biodiversity conservation.

    PubMed

    Lindenmayer, D B; Noss, R F

    2006-08-01

    We summarize the documented and potential impacts of salvage logging--a form of logging that removes trees and other biological material from sites after natural disturbance. Such operations may reduce or eliminate biological legacies, modify rare postdisturbance habitats, influence populations, alter community composition, impair natural vegetation recovery, facilitate the colonization of invasive species, alter soil properties and nutrient levels, increase erosion, modify hydrological regimes and aquatic ecosystems, and alter patterns of landscape heterogeneity. These impacts can be assigned to three broad and interrelated effects: (1) altered stand structural complexity; (2) altered ecosystem processes and functions; and (3) altered populations of species and community composition. Some impacts may be different from or additional to the effects of traditional logging that is not preceded by a large natural disturbance because the conditions before, during, and after salvage logging may differ from those that characterize traditional timber harvesting. The potential impacts of salvage logging often have been overlooked, partly because the processes of ecosystem recovery after natural disturbance are still poorly understood and partly because potential cumulative effects of natural and human disturbance have not been well documented. Ecologically informed policies regarding salvage logging are needed prior to major natural disturbances so that when they occur, ad hoc and crisis-mode decision making can be avoided. These policies should lead to salvage-exemption zones and limits on the amounts of disturbance-derived biological legacies (e.g., burned trees, logs) that are removed where salvage logging takes place. Finally, we believe new terminology is needed. The word salvage implies that something is being saved or recovered, whereas from an ecological perspective this is rarely the case.

  17. Separate-channel analysis of two-channel microarrays: recovering inter-spot information.

    PubMed

    Smyth, Gordon K; Altman, Naomi S

    2013-05-26

    Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intraspot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses. The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
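
    The M-value/A-value transform referred to above is standard for two-channel arrays; a minimal sketch with hypothetical intensities:

    ```python
    import numpy as np

    def ma_transform(red, green):
        """Per-spot M and A values: M = log2(R/G), A = (log2 R + log2 G) / 2.
        The log-ratio analysis uses only M; the separate-channel analysis
        described above also models A."""
        r, g = np.log2(red), np.log2(green)
        return r - g, (r + g) / 2

    M, A = ma_transform(np.array([1500.0, 800.0, 4000.0]),
                        np.array([1000.0, 900.0, 1000.0]))
    print(M, A)
    ```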

  18. Crowded letter and crowded picture logMAR acuity in children with amblyopia: a quantitative comparison.

    PubMed

    O'Boyle, Cathy; Chen, Sean I; Little, Julie-Anne

    2017-04-01

    Clinically, picture acuity tests are thought to overestimate visual acuity (VA) compared with letter tests, but this has not been systematically investigated in children with amblyopia. This study compared VA measurements with the LogMAR Crowded Kay Picture test to the LogMAR Crowded Keeler Letter acuity test in a group of young children with amblyopia. 58 children (34 male) with amblyopia (22 anisometropic, 18 strabismic and 18 with both strabismic/anisometropic amblyopia) aged 4-6 years (mean=68.7, range=48-83 months) underwent VA measurements. VA chart testing order was randomised, but the amblyopic eye was tested before the fellow eye. All participants wore up-to-date refractive correction. The Kay Picture test significantly overestimated VA by 0.098 logMAR (95% limits of agreement (LOA), 0.13) in the amblyopic eye and 0.088 logMAR (95% LOA, 0.13) in the fellow eye, respectively (p<0.001). No interactions were found from occlusion therapy, refractive correction or type of amblyopia on VA results (p>0.23). For both the amblyopic and fellow eyes, Bland-Altman plots demonstrated a systematic and predictable difference between Kay Picture and Keeler Letter charts across the range of acuities tested (Keeler acuity: amblyopic eye 0.75 to -0.05 logMAR; fellow eye 0.45 to -0.15 logMAR). Linear regression analysis (p<0.00001) and also slope values close to one (amblyopic 0.98, fellow 0.86) demonstrate that there is no proportional bias. The Kay Picture test consistently overestimated VA by approximately 0.10 logMAR when compared with the Keeler Letter test in young children with amblyopia. Due to the predictable difference found between both crowded logMAR acuity tests, it is reasonable to adjust Kay Picture acuity thresholds by +0.10 logMAR to compute expected Keeler Letter acuity scores. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
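
    The suggested adjustment is a constant offset; a one-line sketch:

    ```python
    def expected_keeler_from_kay(kay_logmar, offset=0.10):
        """Apply the mean overestimate reported above:
        expected Keeler ~ Kay + 0.10 logMAR."""
        return kay_logmar + offset

    print(expected_keeler_from_kay(0.20))  # 0.30 logMAR
    ```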

  19. Bias and Variance Approximations for Estimators of Extreme Quantiles

    DTIC Science & Technology

    1988-11-01


  20. Analysis of the Hessian for Inverse Scattering Problems. Part 3. Inverse Medium Scattering of Electromagnetic Waves in Three Dimensions

    DTIC Science & Technology

    2012-08-01

    An implication of the compactness of the Hessian is that for small data noise and model error, the discrete Hessian can be approximated by a low-rank matrix. This in turn enables fast solution of an appropriately... The probability distribution is given by the inverse of the Hessian of the negative log likelihood function. For Gaussian data noise and model error, this...

  1. Evaluation of the effects of ultraviolet light on bacterial contaminants inoculated into whole milk and colostrum, and on colostrum immunoglobulin G

    PubMed Central

    Pereira, R. V.; Bicalho, M. L.; Machado, V. S.; Lima, S.; Teixeira, A. G.; Warnick, L. D.; Bicalho, R. C.

    2015-01-01

    Raw milk and colostrum can harbor dangerous micro-organisms that can pose serious health risks for animals and humans. According to the USDA, more than 58% of calves in the United States are fed unpasteurized milk. The aim of this study was to evaluate the effect of UV light on reduction of bacteria in milk and colostrum, and on colostrum IgG. A pilot-scale UV light continuous (UVC) flow-through unit (45 J/cm2) was used to treat milk and colostrum. Colostrum and sterile whole milk were inoculated with Listeria innocua, Mycobacterium smegmatis, Salmonella serovar Typhimurium, Escherichia coli, Staphylococcus aureus, Streptococcus agalactiae, and Acinetobacter baumannii before being treated with UVC. During UVC treatment, samples were collected at 5 time points and bacteria were enumerated using selective media. The effect of UVC on IgG was evaluated using raw colostrum from a nearby dairy farm without the addition of bacteria. For each colostrum batch, samples were collected at several different time points and IgG was measured using ELISA. The UVC treatment of milk resulted in a significant final count (log cfu/mL) reduction of Listeria monocytogenes (3.2 ± 0.3 log cfu/mL reduction), Salmonella spp. (3.7 ± 0.2 log cfu/mL reduction), Escherichia coli (2.8 ± 0.2 log cfu/mL reduction), Staph. aureus (3.4 ± 0.3 log cfu/mL reduction), Streptococcus spp. (3.4 ± 0.4 log cfu/mL reduction), and A. baumannii (2.8 ± 0.2 log cfu/mL reduction). The UVC treatment of milk did not result in a significant final count (log cfu/mL) reduction for M. smegmatis (1.8 ± 0.5 log cfu/mL reduction). The UVC treatment of colostrum was significantly associated with a final reduction of bacterial count (log cfu/mL) of Listeria spp. (1.4 ± 0.3 log cfu/mL reduction), Salmonella spp. (1.0 ± 0.2 log cfu/mL reduction), and Acinetobacter spp. (1.1 ± 0.3 log cfu/mL reduction), but not of E. coli (0.5 ± 0.3 log cfu/mL reduction), Strep. agalactiae (0.8 ± 0.2 log cfu/mL reduction), and Staph. aureus (0.4 ± 0.2 log cfu/mL reduction). The UVC treatment of colostrum significantly decreased the IgG concentration, with an observed final mean IgG reduction of approximately 50%. Development of new methods to reduce bacterial contaminants in colostrum must take into consideration the barriers imposed by its opacity and organic components, and account for the incidental damage to IgG caused by manipulating colostrum. PMID:24582452

  2. Response Strength in Extreme Multiple Schedules

    PubMed Central

    McLean, Anthony P; Grace, Randolph C; Nevin, John A

    2012-01-01

    Four pigeons were trained in a series of two-component multiple schedules. Reinforcers were scheduled with random-interval schedules. The ratio of arranged reinforcer rates in the two components was varied over 4 log units, a much wider range than previously studied. When performance appeared stable, prefeeding tests were conducted to assess resistance to change. Contrary to the generalized matching law, logarithms of response ratios in the two components were not a linear function of log reinforcer ratios, implying a failure of parameter invariance. Over a 2 log unit range, the function appeared linear and indicated undermatching, but in conditions with more extreme reinforcer ratios, approximate matching was observed. A model suggested by McLean (1991), originally for local contrast, predicts these changes in sensitivity to reinforcer ratios somewhat better than models by Herrnstein (1970) and by Williams and Wixted (1986). Prefeeding tests of resistance to change were conducted at each reinforcer ratio, and relative resistance to change was also a nonlinear function of log reinforcer ratios, again contrary to conclusions from previous work. Instead, the function suggests that resistance to change in a component may be determined partly by the rate of reinforcement and partly by the ratio of reinforcers to responses. PMID:22287804
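
    For reference, the generalized matching law fit mentioned above is a straight line in log-log coordinates; a minimal sketch with hypothetical data showing undermatching (sensitivity a < 1):

    ```python
    import numpy as np

    def fit_generalized_matching(reinforcer_ratios, response_ratios):
        """Least-squares fit of log(B1/B2) = a * log(r1/r2) + log(b);
        a is sensitivity (a < 1: undermatching), b is bias."""
        x, y = np.log10(reinforcer_ratios), np.log10(response_ratios)
        a, logb = np.polyfit(x, y, 1)
        return a, 10 ** logb

    # Hypothetical data spanning 2 log units, showing undermatching (a ~ 0.6)
    r = np.array([0.1, 0.33, 1.0, 3.0, 10.0])
    B = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
    print(fit_generalized_matching(r, B))
    ```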

  3. Comparative study of the failure rates among 3 implantable defibrillator leads.

    PubMed

    van Malderen, Sophie C H; Szili-Torok, Tamas; Yap, Sing C; Hoeks, Sanne E; Zijlstra, Felix; Theuns, Dominic A M J

    2016-12-01

    After the introduction of the Biotronik Linox S/SD high-voltage lead, several cases of early failure have been observed. The purpose of this article was to assess the performance of the Linox S/SD lead in comparison to 2 other contemporary leads. We used the prospective Erasmus MC ICD registry to identify all implanted Linox S/SD (n = 408), Durata (St. Jude Medical, model 7122) (n = 340), and Endotak Reliance (Boston Scientific, models 0155, 0138, and 0158) (n = 343) leads. Lead failure was defined by low- or high-voltage impedance, failure to capture, sense or defibrillate, or the presence of nonphysiological signals not due to external interference. During a median follow-up of 5.1 years, 24 Linox (5.9%), 5 Endotak (1.5%), and 5 Durata (1.5%) leads failed. At 5-year follow-up, the cumulative failure rate of Linox leads (6.4%) was higher than that of Endotak (0.4%; P < .0001) and Durata (2.0%; P = .003) leads. The incidence rate was higher in Linox leads (1.3 per 100 patient-years) than in Endotak and Durata leads (0.2 and 0.3 per 100 patient-years, respectively; P < .001). A log-log analysis of the cumulative hazard for Linox leads functioning at 3-year follow-up revealed a stable failure rate of 3% per year. The majority of failures consisted of noise (62.5%) and abnormal impedance (33.3%). This study demonstrates a higher failure rate of Linox S/SD high-voltage leads compared to contemporary leads. Although the mechanism of lead failure is unclear, the majority presents with abnormal electrical parameters. Comprehensive monitoring of Linox S/SD high-voltage leads includes remote monitoring to facilitate early detection of lead failure.

  4. EUVE observations of Algol: Detection of a continuum and implications for the coronal (Fe/H) abundance

    NASA Technical Reports Server (NTRS)

    Stern, Robert A.; Lemen, James R.; Schmitt, Jurgen H. M. M.; Pye, John P.

    1995-01-01

    We report results from the first extreme ultraviolet spectrum of the prototypical eclipsing binary Algol (beta Per), obtained with the spectrometers on the Extreme Ultraviolet Explorer (EUVE). The Algol spectrum in the 80-350 Å range is dominated by emission lines of Fe XVI-XXIV, and the He II 304 Å line. The Fe emission is characteristic of high-temperature plasma at temperatures up to at least log T approximately 7.3 K. We have successfully modeled the observed quiescent spectrum using a continuous emission measure distribution with the bulk of the emitting material at log T greater than 6.5. We are able to adequately fit both the coronal lines and continuum data with a cosmic abundance plasma, but only if Algol's quiescent corona is dominated by material at log T greater than 7.5, which is physically ruled out by prior X-ray observations of the quiescent Algol spectrum. Since the coronal (Fe/H) abundance is the principal determinant of the line-to-continuum ratio in the EUV, allowing the abundance to be a free parameter results in models with a range of best-fit abundances of approximately 15%-40% of solar photospheric (Fe/H). Since Algol's photospheric (Fe/H) appears to be near-solar, the anomalous EUV line-to-continuum ratio could either be the result of element segregation in the coronal formation process, or other, less likely mechanisms that may enhance the continuum with respect to the lines.

  5. Case-Deletion Diagnostics for Nonlinear Structural Equation Models

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Lu, Bin

    2003-01-01

    In this article, a case-deletion procedure is proposed to detect influential observations in a nonlinear structural equation model. The key idea is to develop the diagnostic measures based on the conditional expectation of the complete-data log-likelihood function in the EM algorithm. A one-step pseudo approximation is proposed to reduce the…

  6. Dynamic predictive model for growth of Bacillus cereus from spores in cooked beans

    USDA-ARS?s Scientific Manuscript database

    Kinetic growth data of Bacillus cereus from spores in cooked beans at several isothermal conditions (between 10 and 49°C) were collected. Samples were inoculated with approximately 2 log CFU/g of heat-shocked (80°C/10 min) spores and stored at isothermal temperatures. B. cereus populations were deter...

  7. Inference with minimal Gibbs free energy in information field theory.

    PubMed

    Ensslin, Torsten A; Weig, Cornelius

    2010-11-01

    Non-linear and non-Gaussian signal inference problems are difficult to tackle. Renormalization techniques permit us to construct good estimators for the posterior signal mean within information field theory (IFT), but the approximations and assumptions made are not very obvious. Here we introduce the simple concept of minimal Gibbs free energy to IFT, and show that previous renormalization results emerge naturally. They can be understood as being the Gaussian approximation to the full posterior probability, which has maximal cross information with it. We derive optimized estimators for three applications, to illustrate the usage of the framework: (i) reconstruction of a log-normal signal from Poissonian data with background counts and point spread function, as it is needed for gamma ray astronomy and for cosmography using photometric galaxy redshifts, (ii) inference of a Gaussian signal with unknown spectrum, and (iii) inference of a Poissonian log-normal signal with unknown spectrum, the combination of (i) and (ii). Finally we explain how Gaussian knowledge states constructed by the minimal Gibbs free energy principle at different temperatures can be combined into a more accurate surrogate of the non-Gaussian posterior.
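
    Schematically, the construction approximates the posterior by a Gaussian \mathcal{G}(s - m, D) whose mean m and covariance D minimize a Gibbs free energy of the usual form (a sketch of the idea, not the paper's exact expressions):

        G[m, D] \;=\; \big\langle H(d, s) \big\rangle_{\mathcal{G}(s-m,\,D)} \;-\; T\,\mathcal{S}\big[\mathcal{G}(s-m,\,D)\big]

    where H is the information Hamiltonian, \mathcal{S} the entropy of the Gaussian, and T the temperature; at T = 1, minimizing G is equivalent to minimizing the Kullback-Leibler divergence of the Gaussian from the full posterior.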

  8. Geothermal state and fluid flow within ODP Hole 843B: results from wireline logging

    NASA Astrophysics Data System (ADS)

    Wiggins, Sean M.; Hildebrand, John A.; Gieskes, Joris M.

    2002-02-01

    Borehole fluid temperatures were measured with a wireline re-entry system in Ocean Drilling Program Hole 843B, the site of the Ocean Seismic Network Pilot Experiment. These temperature data, recorded more than 7 years after drilling, are compared to temperature data logged during Leg 136, approximately 1 day after drilling had ceased. Qualitative interpretations of the temperature data suggest that fluid flowed slowly downward in the borehole immediately following drilling, and flowed slowly upward 7 years after drilling. Quantitative analysis suggests that the upward fluid flow rate in the borehole is approximately 1 m/h. Slow fluid flow interpreted from temperature data only, however, requires estimates of other unmeasured physical properties. If fluid flows upward in Hole 843B, it may have led to undesirable noise for the borehole seismometer emplaced in this hole as part of the Ocean Seismic Network Pilot Experiment. Estimates of conductive heat flow from ODP Hole 843B are 51 mW/m² for the sediment and the basalt. These values are lower than the most recent Hawaiian Arch seafloor heat flow studies.

  9. Transformation Model Choice in Nonlinear Regression Analysis of Fluorescence-based Serial Dilution Assays

    PubMed Central

    Fong, Youyi; Yu, Xuesong

    2016-01-01

    Many modern serial dilution assays are based on fluorescence intensity (FI) readouts. We study optimal transformation model choice for fitting five parameter logistic curves (5PL) to FI-based serial dilution assay data. We first develop a generalized least squares-pseudolikelihood type algorithm for fitting heteroscedastic logistic models. Next we show that the 5PL and log 5PL functions can approximate each other well. We then compare four 5PL models with different choices of log transformation and variance modeling through a Monte Carlo study and real data. Our findings are that the optimal choice depends on the intended use of the fitted curves. PMID:27642502
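
    A minimal sketch of the kind of model being compared, assuming the common 5PL parameterization and SciPy's least-squares fitter; the concentration and FI values below are invented for illustration:

        import numpy as np
        from scipy.optimize import curve_fit

        def five_pl(x, a, d, c, b, g):
            # a: response at zero dose, d: response at infinite dose,
            # c: mid-range concentration, b: slope, g: asymmetry
            return d + (a - d) / (1.0 + (x / c) ** b) ** g

        conc = np.array([1e-2, 1e-1, 1.0, 10.0, 100.0, 1000.0])
        fi = np.array([30.0, 55.0, 210.0, 900.0, 2100.0, 2600.0])

        # One of the transformation choices: fit the 5PL on the log(FI) scale.
        popt, _ = curve_fit(
            lambda x, *p: np.log(five_pl(x, *p)),
            conc, np.log(fi),
            p0=[20.0, 3000.0, 20.0, 1.0, 1.0], maxfev=20000,
        )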

  10. Lognormal-like statistics of a stochastic squeeze process

    NASA Astrophysics Data System (ADS)

    Shapira, Dekel; Cohen, Doron

    2017-10-01

    We analyze the full statistics of a stochastic squeeze process. The model's two parameters are the bare stretching rate w and the angular diffusion coefficient D. We carry out an exact analysis to determine the drift and the diffusion coefficient of log(r), where r is the radial coordinate. The results go beyond the heuristic lognormal description that is implied by the central limit theorem. Contrary to the common "quantum Zeno" approximation, the radial diffusion is not simply D_r = (1/8) w²/D but has a nonmonotonic dependence on w/D. Furthermore, the calculation of the radial moments is dominated by the far non-Gaussian tails of the log(r) distribution.
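
    In the heuristic central-limit picture referenced above, log(r) becomes approximately Gaussian with a linear drift, and the naive "quantum Zeno" value of the radial diffusion coefficient that the exact analysis corrects is

        D_r \;=\; \frac{w^2}{8D},

    whereas the exact result is a nonmonotonic function of w/D.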

  11. Influence of host age on critical fitness parameters of Spathius galinae (Hymenoptera: Braconidae), a new parasitoid of the emerald ash borer (Coleoptera: Buprestidae).

    PubMed

    Watt, Timothy J; Duan, Jian J

    2014-08-01

    Spathius galinae Belokobylskij and Strazenac (Hymenoptera: Braconidae) is a recently discovered gregarious idiobiont larval ectoparasitoid currently being evaluated for biological control against the invasive emerald ash borer, Agrilus planipennis Fairmaire (Coleoptera: Buprestidae) in the United States. To aid in the development of laboratory rearing protocols, we assessed the influence of various emerald ash borer stages on critical fitness parameters of S. galinae. We exposed gravid S. galinae females to emerald ash borer host larvae of various ages (3.5, 5, 7, and 10 wk post egg oviposition) that were reared naturally in tropical (evergreen) ash (Fraxinus uhdei (Wenzig) Lingelsh) logs, or to field-collected, late-stage emerald ash borers (nonfeeding J-shaped larvae termed "J-larvae," prepupae, and pupae) that were artificially inserted into green ash logs. When exposed to larvae in tropical ash logs, S. galinae attacked 5 and 7 wk hosts more frequently (68-76%) than 3.5 wk (23%) and 10 wk (12%) hosts. Subsample dissections of these logs revealed that 3.5, 5, 7 and 10 wk host logs contained mostly second, third, fourth, and J-larvae, respectively, that had already bored into the sapwood for diapause. No J-larvae were attacked by S. galinae when naturally reared in tropical ash logs. When parasitized by S. galinae, 7 and 10 wk hosts produced the largest broods (approximately 6.7 offspring per parasitized host), and the progenies that emerged from these logs had larger anatomical measurements and more female-biased sex ratios. When exposed to emerald ash borer J-larvae, prepupae, or pupae artificially inserted into green ash logs, S. galinae attacked 53% of J-larvae, but did not attack any prepupae or pupae. We conclude that large (fourth instar) emerald ash borer larvae should be used to rear S. galinae.

  12. Objective straylight assessment of the human eye with a novel device

    NASA Astrophysics Data System (ADS)

    Schramm, Stefan; Schikowski, Patrick; Lerm, Elena; Kaeding, André; Klemm, Matthias; Haueisen, Jens; Baumgarten, Daniel

    2016-03-01

    Forward scattered light from the anterior segment of the human eye can be measured by Shack-Hartmann (SH) wavefront aberrometers with limited visual angle. We propose a novel Point Spread Function (PSF) reconstruction algorithm based on SH measurements with a novel measurement device to overcome these limitations. In our optical setup, we use a Digital Mirror Device as a variable field stop, which is conventionally a pinhole suppressing scatter and reflections. Images with 21 different stop diameters were captured, and from each image the average subaperture image intensity and the average intensity of the pupil were computed. The 21 intensities represent integral values of the PSF, which is consequently reconstructed by derivation with respect to the visual angle. A generalized form of the Stiles-Holladay approximation is fitted to the PSF, resulting in a straylight parameter Log(IS). Additionally, the transmission loss of the eye is computed. For the proof of principle, a study on 13 healthy young volunteers was carried out. Scatter filters were positioned in front of the volunteer's eye during C-Quant and scatter measurements to generate straylight emulating scatter in the lens. The straylight parameter is compared to the C-Quant measurement parameter Log(ISC) and the scatter density of the filters SDF with a partial correlation. Log(IS) shows significant correlation with the SDF and Log(ISC). The correlation is more prominent between Log(IS) combined with the transmission loss and the SDF and Log(ISC). Our novel measurement and reconstruction technique allows for objective straylight analysis of visual angles up to 4 degrees.

  13. Physiological and morphological responses of pine and willow saplings to post-fire salvage logging

    NASA Astrophysics Data System (ADS)

    Millions, E. L.; Letts, M. G.; Harvey, T.; Rood, S. B.

    2015-12-01

    With global warming, forest fires may be increasing in frequency, and post-fire salvage logging may become more common. The ecophysiological impacts of this practice on tree saplings remain poorly understood. In this study, we examined the physiological and morphological impacts of increased light intensity, due to post-fire salvage logging, on the conifer Pinus contorta (pine) and deciduous broadleaf Salix lucida (willow) tree and shrub species in the Crowsnest Pass region of southern Alberta. Photosynthetic gas-exchange and plant morphological measurements were taken throughout the summer of 2013 on approximately ten-year-old saplings of both species. Neither species exhibited photoinhibition, but different strategies were observed to acclimate to increased light availability. Willow saplings were able to slightly elevate their light-saturated rate of net photosynthesis (Amax) when exposed to higher photosynthetic photon flux density (PPFD), thus increasing their growth rate. Willow also exhibited increased leaf inclination angles and leaf mass per unit area (LMA), to decrease light interception in the salvage-logged plot. By contrast, pine, which exhibited lower Amax and transpiration (E), but higher water-use efficiency (WUE = Amax/E) than willow, increased the rate at which electrons were moved through and away from the photosynthetic apparatus in order to avoid photoinhibition. Acclimation indices were higher in willow saplings, consistent with the hypothesis that species with short-lived foliage exhibit greater acclimation. LMA was higher in pine saplings growing in the logged plot, but whole-plant and branch-level morphological acclimation was limited and more consistent with a response to decreased competition in the logged plot, which had much lower stand density.

  14. Turbidity Responses from Timber Harvesting, Wildfire, and Post-Fire Logging in the Battle Creek Watershed, Northern California.

    PubMed

    Lewis, Jack; Rhodes, Jonathan J; Bradley, Curtis

    2018-04-11

    The Battle Creek watershed in northern California was historically important for its Chinook salmon populations, now at remnant levels due to land and water uses. Privately owned portions of the watershed are managed primarily for timber production, which has intensified since 1998, when clearcutting became widespread. Turbidity has been monitored by citizen volunteers at 13 locations in the watershed. Approximately 2000 grab samples were collected in the 5-year analysis period as harvesting progressed, a severe wildfire burned 11,200 ha, and most of the burned area was salvage logged. The data reveal strong associations of turbidity with the proportion of area harvested in watersheds draining to the measurement sites. Turbidity increased significantly over the measurement period in 10 watersheds and decreased at one. Some of these increases may be due to the influence of wildfire, logging roads and haul roads. However, turbidity continued trending upwards in six burned watersheds that were logged after the fire, while decreasing or remaining the same in two that escaped the fire and post-fire logging. Unusually high turbidity measurements (more than seven times the average value for a given flow condition) were very rare (0.0% of measurements) before the fire but began to appear in the first year after the fire (5.0% of measurements) and were most frequent (11.6% of measurements) in the first 9 months after salvage logging. Results suggest that harvesting contributes to road erosion and that current management practices do not fully protect water quality.

  15. SWIFT BAT Survey of AGN

    NASA Technical Reports Server (NTRS)

    Tueller, J.; Mushotzky, R. F.; Barthelmy, S.; Cannizzo, J. K.; Gehrels, N.; Markwardt, C. B.; Skinner, G. K.; Winter, L. M.

    2008-01-01

    We present the results of the analysis of the first 9 months of data of the Swift BAT survey of AGN in the 14-195 keV band. Using archival X-ray data or follow-up Swift XRT observations, we have identified 129 (103 AGN) of 130 objects detected at |b| > 15° and with significance > 4.8σ. One source remains unidentified. These same X-ray data have allowed measurement of the X-ray properties of the objects. We fit a power law to the log N-log S distribution, and find the slope to be 1.42 ± 0.14. Characterizing the differential luminosity function data as a broken power law, we find a break luminosity log L* (erg/s) = 43.85 ± 0.26. We obtain a mean photon index 1.98 in the 14-195 keV band, with an rms spread of 0.27. Integration of our luminosity function gives a local volume density of AGN above 10⁴¹ erg/s of 2.4 × 10⁻³ Mpc⁻³, which is about 10% of the total luminous local galaxy density above M* = -19.75. We have obtained X-ray spectra from the literature and from Swift XRT follow-up observations. These show that the distribution of log nH is essentially flat from nH = 10²⁰ cm⁻² to 10²⁴ cm⁻², with 50% of the objects having column densities of less than 10²² cm⁻². BAT Seyfert galaxies have a median redshift of 0.03, a maximum log luminosity of 45.1, and approximately half have log nH > 22.
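
    A slope such as 1.42 ± 0.14 for the log N-log S relation is typically obtained from a straight-line fit in log-log space; a minimal Python sketch with synthetic fluxes (not the BAT catalog):

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic 14-195 keV fluxes from a power law; for a cumulative
        # distribution N(>S) ∝ S**(-alpha), the i-th brightest source has N = i.
        flux = (rng.pareto(1.4, size=130) + 1.0) * 1e-11
        flux = np.sort(flux)[::-1]
        n_gt_s = np.arange(1, flux.size + 1)

        slope, _ = np.polyfit(np.log10(flux), np.log10(n_gt_s), 1)
        alpha = -slope  # recovers roughly the input slope of 1.4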

  16. Determination of polydimethylsiloxane–water partition coefficients for ten 1-chloro-4-[2,2,2-trichloro-1-(4-chlorophenyl)ethyl]benzene-related compounds and twelve polychlorinated biphenyls using gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Eganhouse, Robert P.

    2016-01-01

    Polymer-water partition coefficients (Kpw) of ten DDT-related compounds were determined in pure water at 25 °C using commercial polydimethylsiloxane-coated optical fiber. Analyte concentrations were measured by thermal desorption-gas chromatography/full scan mass spectrometry (TD–GC/MS-FS; fibers) and liquid injection-gas chromatography/selected ion monitoring mass spectrometry (LI–GC/MS-SIM; water). Equilibrium was approached from two directions (fiber uptake and depletion) as a means of assessing data concordance. Measured compound-specific log Kpw values ranged from 4.8 to 6.1 with an average difference in log Kpw between the two approaches of 0.05 log units (∼12% of Kpw). Comparison of the experimentally-determined log Kpw values with previously published data confirmed the consistency of the results and the reliability of the method. A second experiment was conducted with the same ten DDT-related compounds and twelve selected PCB (polychlorinated biphenyl) congeners under conditions characteristic of a coastal marine field site (viz., seawater, 11 °C) that is currently under investigation for DDT and PCB contamination. Equilibration at lower temperature and higher ionic strength resulted in an increase in log Kpw for the DDT-related compounds of 0.28–0.49 log units (61–101% of Kpw), depending on the analyte. The increase in Kpw would have the effect of reducing by approximately half the calculated freely dissolved pore-water concentrations (Cfree). This demonstrates the importance of determining partition coefficients under conditions as they exist in the field.
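
    The roughly factor-of-two reduction in Cfree quoted above is direct arithmetic on the partitioning relation: with the polymer-phase concentration C_p held fixed,

        C_{\mathrm{free}} \;=\; \frac{C_p}{K_{pw}}, \qquad \Delta\log K_{pw} \approx 0.3 \;\Longrightarrow\; K_{pw} \to 10^{0.3}\,K_{pw} \approx 2\,K_{pw} \;\Longrightarrow\; C_{\mathrm{free}} \to C_{\mathrm{free}}/2.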

  17. Hydrostatigraphic characterization of coastal aquifer by geophysical log analysis, Cape Cod National Seashore, Massachusetts

    USGS Publications Warehouse

    Morin, Roger H.; Urish, Daniel W.

    1995-01-01

    The Cape Cod National Seashore comprises part of Provincetown, Massachusetts, which lies at the northern tip of Cape Cod. The hydrologic regime in this area consists of unconsolidated sand-and-gravel deposits that constitute a highly permeable aquifer within which is a freshwater lens floating on denser sea water. A network of wells was installed into this aquifer to monitor a leachate plume emanating from the Provincetown landfill. Wells were located along orthogonal transects perpendicular to and parallel to the general groundwater flow path from the landfill to the seashore approximately 1,000 m to the southeast. Temperature, epithermal neutron, natural gamma, and electromagnetic induction logs were obtained in five wells to depths ranging from 23 to 37 m. These logs identify the primary contamination and show that its movement is controlled by and confined within a dominant hydrostratigraphic unit about 2 to 5 m thick that exhibits low porosity, large representative grain size, and high relative permeability. A relation is also found between the temperature-gradient logs and water quality, with the gradient traces serving as effective delineators of the contaminant plume in wells nearest the landfill. Contamination is not detectable in the well nearest the seashore and farthest from the landfill, and the induction log from this well clearly identifies the freshwater/seawater transition zone at a depth of about 18 m. The geophysical logs provide fundamental information concerning the spatial distribution of aquifer properties near the landfill and lend valuable insight into how these properties influence the migration of the leachate plume to the sea.

  18. Ion-pair partition of quaternary ammonium drugs: the influence of counter ions of different lipophilicity, size, and flexibility.

    PubMed

    Takács-Novák, K; Szász, G

    1999-10-01

    The ion-pair partition of quaternary ammonium (QA) pharmacons with organic counter ions of different lipophilicity, size, shape and flexibility was studied to elucidate relationships between ion-pair formation and chemical structure. The apparent partition coefficient (P') of 4 QAs was measured in octanol/pH 7.4 phosphate buffer system by the shake-flask method as a function of molar excess of ten counter ions (Y), namely: mesylate (MES), acetate (AC), pyruvate (PYRU), nicotinate (NIC), hydrogenfumarate (HFUM), hydrogenmaleate (HMAL), p-toluenesulfonate (PTS), caproate (CPR), deoxycholate (DOC) and prostaglandin E1 anion (PGE1). Based on 118 highly precise log P' values (SD < 0.05), the intrinsic lipophilicity (without external counter ions) and the ion-pair partition of QAs (with different counter ions) were characterized. Linear correlation was found between the log P' of ion-pairs and the size of the counter ions described by the solvent accessible surface area (SASA). The lipophilicity-increasing effect of the counter ions was quantified and the following order was established: DOC ≈ PGE1 > CPR ≈ PTS > NIC ≈ HMAL > PYRU ≈ AC ≈ MES ≈ HFUM. Analyzing the lipophilicity/molar ratio (QA:Y) profile, the differences in ion-pair formation were shown and attributed to the differences in the flexibility/rigidity and size of both QA and Y. Since the largest (on average, 300×) lipophilicity enhancement was found under the influence of DOC and PGE1 and a considerable (on average, 40×) increase was observed with CPR and PTS, it was concluded that bile acids and prostaglandin anions may play a significant role in the ion-pair transport of quaternary ammonium drugs, and that caproic acid and p-toluenesulfonic acid may be useful salt-forming agents to improve the pharmacokinetics of hydrophilic drugs.

  19. Nut Production in Bertholletia excelsa across a Logged Forest Mosaic: Implications for Multiple Forest Use

    PubMed Central

    Rockwell, Cara A.; Guariguata, Manuel R.; Menton, Mary; Arroyo Quispe, Eriks; Quaedvlieg, Julia; Warren-Thomas, Eleanor; Fernandez Silva, Harol; Jurado Rojas, Edwin Eduardo; Kohagura Arrunátegui, José Andrés Hideki; Meza Vega, Luis Alberto; Revilla Vera, Olivia; Valera Tito, Jonatan Frank; Villarroel Panduro, Betxy Tabita; Yucra Salas, Juan José

    2015-01-01

    Although many examples of multiple-use forest management may be found in tropical smallholder systems, few studies provide empirical support for the integration of selective timber harvesting with non-timber forest product (NTFP) extraction. Brazil nut (Bertholletia excelsa, Lecythidaceae) is one of the world’s most economically-important NTFP species extracted almost entirely from natural forests across the Amazon Basin. An obligate out-crosser, Brazil nut flowers are pollinated by large-bodied bees, a process resulting in a hard round fruit that takes up to 14 months to mature. As many smallholders turn to the financial security provided by timber, Brazil nut fruits are increasingly being harvested in logged forests. We tested the influence of tree and stand-level covariates (distance to nearest cut stump and local logging intensity) on total nut production at the individual tree level in five recently logged Brazil nut concessions covering about 4000 ha of forest in Madre de Dios, Peru. Our field team accompanied Brazil nut harvesters during the traditional harvest period (January-April 2012 and January-April 2013) in order to collect data on fruit production. Three hundred and ninety-nine (approximately 80%) of the 499 trees included in this study were at least 100 m from the nearest cut stump, suggesting that concessionaires avoid logging near adult Brazil nut trees. Yet even for those trees on the edge of logging gaps, distance to nearest cut stump and local logging intensity did not have a statistically significant influence on Brazil nut production at the applied logging intensities (typically 1–2 timber trees removed per ha). In one concession where at least 4 trees ha⁻¹ were removed, however, the logging intensity covariate resulted in a marginally significant P value (0.09), highlighting a potential risk for a drop in nut production at higher intensities. While we do not suggest that logging activities should be completely avoided in Brazil nut rich forests, when a buffer zone cannot be observed, low logging intensities should be implemented. The sustainability of this integrated management system will ultimately depend on a complex series of socioeconomic and ecological interactions. Yet we submit that our study provides an important initial step in understanding the compatibility of timber harvesting with a high value NTFP, potentially allowing for diversification of forest use strategies in Amazonian Peru. PMID:26271042

  20. Effects of bilimbi (Averrhoa bilimbi L.) and tamarind (Tamarindus indica L.) juice on Listeria monocytogenes Scott A and Salmonella Typhimurium ATCC 14028 and the sensory properties of raw shrimps.

    PubMed

    Norhana, M N Wan; Azman, Mohd Nor A; Poole, Susan E; Deeth, Hilton C; Dykes, Gary A

    2009-11-30

    The potential of using juice of bilimbi (Averrhoa bilimbi L.) and tamarind (Tamarindus indica L.) to reduce Listeria monocytogenes Scott A and Salmonella Typhimurium ATCC 14028 populations on raw shrimps after washing and during storage (4°C) was investigated. The uninoculated raw shrimps and those inoculated with approximately 9 log cfu/ml of L. monocytogenes Scott A and S. Typhimurium ATCC 14028 were washed (dipped or rubbed) in distilled water (SDW) (control), bilimbi or tamarind juice at 1:4 (w/v) concentrations for 10 and 5 min. Naturally occurring aerobic bacteria (APC), L. monocytogenes Scott A and S. Typhimurium ATCC 14028 counts, pH values and sensory analysis of washed shrimps were determined immediately after washing (day 0), and on days 3 and 7 of storage. Compared to SDW, bilimbi and tamarind juice significantly (p<0.05) reduced APC (0.40-0.70 log cfu/g), L. monocytogenes Scott A (0.84-1.58 log cfu/g) and S. Typhimurium ATCC 14028 (1.03-2.00 log cfu/g) populations immediately after washing (day 0). There was a significant difference (p<0.05) in bacterial reduction between the dipping (0.40-0.41 log for APC; 0.84 for L. monocytogenes Scott A and 1.03-1.09 log for S. Typhimurium ATCC 14028) and rubbing (0.68-0.70 log for APC; 1.34-1.58 for L. monocytogenes Scott A and 1.67-2.00 log for S. Typhimurium ATCC 14028) methods. Regardless of washing treatments or methods, populations of S. Typhimurium ATCC 14028 decreased slightly (5.10-6.29 log cfu/g on day 7 of storage) while populations of L. monocytogenes Scott A (8.74-9.20 log cfu/g) and APC (8.68-8.92 log cfu/g) increased significantly during refrigerated storage. The pH of experimental shrimps was significantly (p<0.05) decreased by 0.15-0.22 pH units after washing with bilimbi and tamarind juice. The control, bilimbi- or tamarind-washed shrimps did not differ in sensory panellist acceptability (p>0.05) throughout the storage except for odour (p<0.05) at day 0, when an acidic or lemony smell was noticed in bilimbi- and tamarind-washed shrimps but not in control shrimps.

  2. Adaptive estimation of the log fluctuating conductivity from tracer data at the Cape Cod Site

    USGS Publications Warehouse

    Deng, F.W.; Cushman, J.H.; Delleur, J.W.

    1993-01-01

    An adaptive estimation scheme is used to obtain the integral scale and variance of the log-fluctuating conductivity at the Cape Cod site based on the fast Fourier transform/stochastic model of Deng et al. (1993) and a Kalman-like filter. The filter incorporates prior estimates of the unknown parameters with tracer moment data to adaptively obtain improved estimates as the tracer evolves. The results show that significant improvement in the prior estimates of the conductivity can lead to substantial improvement in the ability to predict plume movement. The structure of the covariance function of the log-fluctuating conductivity can be identified from the robustness of the estimation. Both the longitudinal and transverse spatial moment data are important to the estimation.
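
    The Kalman-like filtering step can be pictured with the scalar textbook update below; this is a generic sketch under assumed names, not the authors' implementation:

        def kalman_update(theta_hat, p, z, h, r):
            """One scalar Kalman-type update of a parameter estimate.

            theta_hat: prior estimate (e.g., of the log-conductivity variance)
            p:         prior estimate variance
            z:         new observation (e.g., a tracer spatial moment)
            h:         sensitivity of the observation to the parameter
            r:         observation error variance
            """
            k = p * h / (h * p * h + r)              # gain
            theta_new = theta_hat + k * (z - h * theta_hat)
            p_new = (1.0 - k * h) * p                # reduced uncertainty
            return theta_new, p_new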

  3. Snake River Plain FORGE Well Data for INEL-1

    DOE Data Explorer

    Robert Podgorney

    1979-03-01

    Well data for the INEL-1 well located in eastern Snake River Plain, Idaho. This data collection includes caliper logs, lithology reports, borehole logs, temperature at depth data, neutron density and gamma data, full color logs, fracture analysis, photos, and rock strength parameters for the INEL-1 well. This collection of data has been assembled as part of the site characterization data used to develop the conceptual geologic model for the Snake River Plain site in Idaho, as part of phase 1 of the Frontier Observatory for Research in Geothermal Energy (FORGE) initiative. They were assembled by the Snake River Geothermal Consortium (SRGC), a team of collaborators that includes members from national laboratories, universities, industry, and federal agencies, led by the Idaho National Laboratory (INL).

  4. Identifying high-risk small business industries for occupational safety and health interventions.

    PubMed

    Okun, A; Lentz, T J; Schulte, P; Stayner, L

    2001-03-01

    Approximately one-third (32%) of U.S. workers are employed in small business industries (those with 80% of workers in establishments with fewer than 100 employees), and approximately 53 million persons in private industry work in small business establishments. This study was performed to identify small business industries at high risk for occupational injuries, illnesses, and fatalities. Small business industries were identified from among all three- and four-digit Standard Industrial Classification (SIC) codes and ranked using Bureau of Labor Statistics (BLS) data by rates and numbers of occupational injuries, illnesses, and fatalities. Both incidence rates and numbers of injury, illness, and fatality cases were evaluated. The 253 small business industries identified accounted for 1,568 work-related fatalities (34% of all private industry). Transportation incidents and violent acts were the leading causes of these fatalities. Detailed injury and illness data were available for 105 small business industries, which accounted for 1,476,400 work-related injuries and 55,850 occupational illnesses. Many of the small business industries had morbidity and mortality rates exceeding the average rates for all private industry. The highest-risk small business industries, based on a combined morbidity and mortality index, included logging, cut stone and stone products, truck terminals, and roofing, siding, and sheet metal work. Identification of high-risk small business industries indicates priorities for those interested in developing targeted prevention programs.

  5. The effects of the catalytic converter and fuel sulfur level on motor vehicle particulate matter emissions: light duty diesel vehicles.

    PubMed

    Maricq, M Matti; Chase, Richard E; Xu, Ning; Laing, Paul M

    2002-01-15

    Wind tunnel measurements and direct tailpipe particulate matter (PM) sampling are utilized to examine how the combination of oxidation catalyst and fuel sulfur content affects the nature and quantity of PM emissions from the exhaust of a light duty diesel truck. When low sulfur fuel (4 ppm) is used, or when high sulfur (350 ppm) fuel is employed without an active catalyst present, a single log-normal distribution of exhaust particles is observed with a number mean diameter in the range of 70-83 nm. In the absence of the oxidation catalyst, the high sulfur level has at most a modest effect on particle emissions (<50%) and a minor effect on particle size (<5%). In combination with the active oxidation catalyst tested, high sulfur fuel can lead to a second, nanoparticle, mode, which appears at approximately 20 nm during high speed operation (70 mph), but is not present at low speed (40 mph). A thermodenuder significantly reduces the nanoparticle mode when set to temperatures above approximately 200°C, suggesting that these particles are semivolatile in nature. Because they are observed only when the catalyst is present and the sulfur level is high, this mode likely originates from the nucleation of sulfates formed over the catalyst, although the composition may also include hydrocarbons.

  6. Spatial contrast sensitivity at twilight: luminance, monocularity, and oxygenation.

    PubMed

    Connolly, Desmond M

    2010-05-01

    Visual performance in dim light is compromised by lack of oxygen (hypoxia). The possible influence of altered oxygenation on foveal contrast sensitivity under mesopic (twilight) viewing conditions is relevant to aircrew flying at night, including when using night vision devices, but is poorly documented. Foveal contrast sensitivity was measured binocularly and monocularly in 12 subjects at 7 spatial frequencies, ranging from 0.5 to approximately 16 cycles per degree, using sinusoidal Gabor patch gratings. Hypoxic performance breathing 14.1% oxygen, equivalent to altitude exposure at 3048 m (10,000 ft), was compared with breathing air at sea level (normoxia) at low photopic (28 cd·m⁻²), borderline upper mesopic (approximately 2.1 cd·m⁻²) and midmesopic (approximately 0.26 cd·m⁻²) luminance. Mesopic performance was also assessed breathing 100% oxygen (hyperoxia). Typical 'inverted U' log/log plots of the contrast sensitivity function were obtained, with elevated thresholds (reduced sensitivity) at lower luminance. Binocular viewing enhanced sensitivity by a factor approximating square root of 2 for most conditions, supporting neural summation of the contrast signal, but had greater influence at the lowest light level and highest spatial frequencies (8.26 and 16.51 cpd). Respiratory challenges had no effect. Contrast sensitivity is poorer when viewing monocularly and especially at midmesopic luminance, with relevance to night flying. The foveal contrast sensitivity function is unaffected by respiratory disturbance when twilight conditions favor cone vision, despite known effects on retinal illumination (pupil size). The resilience of the contrast sensitivity function belies the vulnerability of foveal low contrast acuity to mild hypoxia at mesopic luminance.
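
    The near-square-root-of-2 binocular advantage is what the standard quadratic (neural) summation model predicts for two equally sensitive eyes (a textbook relation, not a formula from this study):

        S_{\mathrm{bin}} \;=\; \sqrt{S_L^2 + S_R^2} \;=\; \sqrt{2}\,S_{\mathrm{mono}} \qquad (S_L = S_R = S_{\mathrm{mono}})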

  7. Consensual pupillary light response in the red-eared slider turtle (Trachemys scripta elegans).

    PubMed

    Dearworth, James R; Sipe, Grayson O; Cooper, Lori J; Brune, Erin E; Boyd, Angela L; Riegel, Rhae A L

    2010-03-17

    The purpose of this study was to determine if the turtle has a consensual pupillary light response (cPLR), and if so, to compare it to its direct pupillary light response (dPLR). One eye was illuminated with different intensities of light over a four log range while keeping the other eye in darkness. In the eye directly illuminated, pupil diameter was reduced by as much as approximately 31%. In the eye not stimulated by light, pupil diameter was also reduced, but less, to approximately 11%. When compared to the directly illuminated eye, this generated a ratio, cPLR:dPLR, equal to 0.35. The ratio of slopes for log/linear fits to plots of pupil changes versus retinal irradiance for non-illuminated (-1.27) to illuminated (-3.94) eyes closely matched at 0.32. The cPLR had time constants ranging from 0.60 to 1.20 min; however, they were comparable and not statistically different from those of the dPLR, which ranged from 1.41 to 2.00 min. Application of mydriatic drugs to the directly illuminated eye also supported the presence of a cPLR. Drugs reduced pupil constriction by approximately 9% for the dPLR and slowed its time constant to 9.58 min while simultaneously enhancing constriction by approximately 6% for the cPLR. The time constant for the cPLR at 1.75 min, however, was not changed. The results support that the turtle possesses a cPLR, although it is weaker than its dPLR.

  8. Mobilized plasma lead as an index of lead body burden and its relation to the heme-related indices.

    PubMed

    Sakai, T; Ushio, K; Ikeya, Y

    1998-07-01

    Plasma lead (Pb-P) from workers was distributed in two main fractions: a protein-bound fraction and low molecular weight fractions. Lead mobilized into plasma by CaEDTA was mainly observed in the low molecular weight fraction corresponding to lead disodium ethylenediamine tetraacetate (PbEDTA). The peak levels of Pb-P were attained around 1.5 and 2.5 hours after the start of CaEDTA injection. Pb-P and blood lead levels (Pb-B) at 2 h after the injection were 4.26 (± 2.84) and 0.96 (± 0.27) fold the initial levels just before the injection. Pb-P concentrations at 2 hours after the start of CaEDTA injection (MPb-P) were well correlated (r = 0.740) with amounts of lead excreted in urine for 24 h thereafter (MPb-U). Both log MPb-P and log MPb-U were correlated with Pb-B (r = 0.765 and 0.817, respectively). Correlation coefficients of lead body burden (MPb-P or MPb-U) vs the logarithms of the effect indices (delta-aminolevulinic acid (ALA) dehydratase, ALA in urine, coproporphyrin in urine, and erythrocyte zinc protoporphyrin) were higher than the correlation coefficients of exposure indices (Pb-B or Pb-U) vs the logarithms of the effect indices. Thus biological effect monitoring is significant and reliable for evaluating the functional components of lead body burden (MPb-P or MPb-U).

  9. Relationship between the lipophilicity of gallic acid n-alkyl esters' derivatives and both myeloperoxidase activity and HOCl scavenging.

    PubMed

    Rosso, Rober; Vieira, Tiago O; Leal, Paulo C; Nunes, Ricardo J; Yunes, Rosendo A; Creczynski-Pasa, Tânia B

    2006-09-15

    Gallic acid and several n-alkyl gallates with the same number of hydroxyl substituents, varying only in the length of the side carbon chain and hence in lipophilicity (defined through C log P), were studied. The study evidenced a structure-activity relationship for inhibition of myeloperoxidase activity and for the hypochlorous acid scavenging property, as well as low toxicity in rat hepatic tissue. The gallates with C log P below 3.0 (compounds 2-7) were more active against the enzyme activity, which means that the addition of 1-6 carbons (C log P between 0.92 and 2.92) at the side chain increased the gallic acid effect by approximately 50%. However, a relationship between the HOCl scavenging capability and the lipophilicity was not observed. With these results it is possible to suggest that the gallates protect the HOCl targets through two mechanisms: inhibiting its production by the enzyme and scavenging the reactive species.

  10. Contribution of waste water treatment plants to pesticide toxicity in agriculture catchments.

    PubMed

    Le, Trong Dieu Hien; Scharmüller, Andreas; Kattwinkel, Mira; Kühne, Ralph; Schüürmann, Gerrit; Schäfer, Ralf B

    2017-11-01

    Pesticide residues are frequently found in water bodies and may threaten freshwater ecosystems and biodiversity. In addition to runoff or leaching from treated agricultural fields, pesticides may enter streams via effluents from wastewater treatment plants (WWTPs). We compared the pesticide toxicity, in terms of log maximum Toxic Unit (log mTU), of sampling sites in small agricultural streams of Germany with and without WWTPs in the upstream catchments. We found an approximately half log unit higher pesticide toxicity for sampling sites with WWTPs (p < 0.001). Compared to fungicides and insecticides, herbicides contributed most to the total pesticide toxicity in streams with WWTPs. A few compounds (diuron, terbuthylazine, isoproturon, terbutryn and metazachlor) dominated the herbicide toxicity. Pesticide toxicity was not correlated with upstream distance to WWTP (Spearman's rank correlation, rho = -0.11, p > 0.05), suggesting that other context variables are more important to explain WWTP-driven pesticide toxicity. Our results suggest that WWTPs contribute to pesticide toxicity in German streams.

  11. Efficacy of chlorine and calcinated calcium treatment of alfalfa seeds and sprouts to eliminate Salmonella.

    PubMed

    Gandhi, Megha; Matthews, Karl R

    2003-11-01

    The efficacy of a 20,000 ppm calcium hypochlorite treatment of alfalfa seeds artificially contaminated with Salmonella was studied. Salmonella populations reached >7.0 log on sprouts grown from seeds artificially contaminated with Salmonella and then treated with 20,000 ppm Ca(OCl)₂. The efficacy of spray application of chlorine (100 ppm) to eliminate Salmonella during germination and growth of alfalfa was assessed. Alfalfa seed artificially contaminated with Salmonella was treated at germination, on day 2 or day 4, or for the duration of the growth period. Spray application of 100 ppm chlorine at germination, day 2, or day 4 of growth was minimally effective, resulting in approximately a 0.5-log decrease in the population of Salmonella. Treatment on each of the 4 days of growth reduced populations of Salmonella by only 1.5 log. Combined treatment of seeds with 20,000 ppm Ca(OCl)₂ followed by 100 ppm chlorine or calcinated calcium during germination and sprout growth did not eliminate Salmonella.

  12. Unsplittable Flow in Paths and Trees and Column-Restricted Packing Integer Programs

    NASA Astrophysics Data System (ADS)

    Chekuri, Chandra; Ene, Alina; Korula, Nitish

    We consider the unsplittable flow problem (UFP) and the closely related column-restricted packing integer programs (CPIPs). In UFP we are given an edge-capacitated graph G = (V, E) and k request pairs R_1, ..., R_k, where each R_i consists of a source-destination pair (s_i, t_i), a demand d_i and a weight w_i. The goal is to find a maximum-weight subset of requests that can be routed unsplittably in G. Most previous work on UFP has focused on the no-bottleneck case in which the maximum demand of the requests is at most the smallest edge capacity. Inspired by the recent work of Bansal et al. [3] on UFP on a path without the above assumption, we consider UFP on paths as well as trees. We give a simple O(log n) approximation for UFP on trees when all weights are identical; this yields an O(log² n) approximation for the weighted case. These are the first non-trivial approximations for UFP on trees. We develop an LP relaxation for UFP on paths that has an integrality gap of O(log² n); previously there was no relaxation with o(n) gap. We also consider UFP in general graphs and CPIPs without the no-bottleneck assumption and obtain new and useful results.

  13. Time-dependent disk accretion in X-ray Nova MUSCAE 1991

    NASA Astrophysics Data System (ADS)

    Mineshige, Shin; Hirano, Akira; Kitamoto, Shunji; Yamada, Tatsuya T.; Fukue, Jun

    1994-05-01

    We propose a new model for X-ray spectral fitting of binary black hole candidates. In this model, it is assumed that X-ray spectra are composed of a Comptonized blackbody (hard component) and a disk blackbody spectrum (soft component), in which the temperature gradient of the disk, q ≡ -d log T/d log r, is left as a fitting parameter. With this model, we have fitted X-ray spectra of X-ray Nova Muscae 1991 obtained by Ginga. The fitting shows that a hot cloud, which Compton up-scatters soft photons from the disk, gradually shrank and became transparent after the main peak. The temperature gradient turns out to be fairly constant at q ≈ 0.75, the value expected for a Newtonian disk model. To reproduce this value with a relativistic disk model, a small inclination angle, i ≈ 0°-15°, is required. It seems, however, that the q-value temporarily decreased below 0.75 at the main flare, and q increased in a transient fashion at the second peak (or the reflare) occurring approximately 70 days after the main peak. Although statistics are poor, these results, if real, would indicate that the disk brightenings responsible for the main and secondary peaks were initiated in the relatively inner portions of the disk.
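
    The fitted gradient corresponds to a power-law temperature profile, and the q = 0.75 benchmark quoted for a Newtonian disk is the standard steady thin-disk result:

        T(r) \;\propto\; r^{-q}, \qquad q \;\equiv\; -\frac{d\log T}{d\log r} \;=\; \frac{3}{4}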

  14. Effective field theory approach to heavy quark fragmentation

    DOE PAGES

    Fickinger, Michael; Fleming, Sean; Kim, Chul; ...

    2016-11-17

    Using an approach based on Soft Collinear Effective Theory (SCET) and Heavy Quark Effective Theory (HQET) we determine the b-quark fragmentation function from electron-positron annihilation data at the Z-boson peak at next-to-next-to-leading order with next-to-next-to-leading-log resummation of DGLAP logarithms, and next-to-next-to-next-to-leading-log resummation of endpoint logarithms. This analysis improves, by one order, the previous extraction of the b-quark fragmentation function. We find that while the addition of the next order in the calculation does not much shift the extracted form of the fragmentation function, it does reduce theoretical errors, indicating that the expansion is converging. Using an approach based on effective field theory allows us to systematically control theoretical errors. Furthermore, while the fits of theory to data are generally good, the fits seem to be hinting that higher order corrections from HQET may be needed to explain the b-quark fragmentation function at smaller values of momentum fraction.
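
    The resummation orders quoted here follow the usual counting of a large logarithm L in the perturbative series; in one common convention (schematic, not specific to this analysis),

        \text{LL}: \sum_n c_n (\alpha_s L)^n, \qquad \text{NLL}: \alpha_s \sum_n c_n' (\alpha_s L)^n, \qquad \text{NNLL}: \alpha_s^2 \sum_n c_n'' (\alpha_s L)^n

    so each additional order controls one more subleading tower of logarithms.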

  15. Outcomes of Anti-Vascular Endothelial Growth Factor Treatment for Choroidal Neovascularization in Fellow Eyes of Previously Treated Patients With Neovascular Age-Related Macular Degeneration.

    PubMed

    Stem, Maxwell S; Moinuddin, Omar; Kline, Noah; Thanos, Aristomenis; Rao, Prethy; Williams, George A; Hassan, Tarek S

    2018-05-10

    Neovascular age-related macular degeneration (nvAMD) is a leading cause of vision loss. The optimal screening protocol to detect choroidal neovascularization (CNV) in fellow eyes of patients undergoing treatment for unilateral CNV has not been determined. The aim was to compare the visual outcomes of index eyes with established, active nvAMD with those of fellow eyes that subsequently developed CNV during the management protocol. In this retrospective single-center case series conducted at a private vitreoretinal practice, data were collected for all patients treated for bilateral nvAMD between October 1, 2015, and October 1, 2016, for whom the date of index eye and fellow eye conversion to nvAMD could be determined (n = 1600). Per institutional protocol, patients were screened for new CNV in the fellow eye at every office visit. Patients were excluded if they had a condition that could result in marked asymmetric vision loss. The exposure of interest was development of nvAMD, and the main outcome was visual acuity (VA) at the time of diagnosis of nvAMD and at equivalent time points following conversion to nvAMD for both index eyes and fellow eyes. A total of 264 patients met the inclusion criteria; 197 (74.6%) were women and 253 (95.8%) were white, and the mean (SD) age was 79.1 (8.2) years at time of index eye conversion to nvAMD and 80.6 (8.2) years at time of fellow eye conversion to nvAMD. Fellow eyes presented with better VA (mean VA, 20/50 [0.40 logMAR]) compared with index eyes (mean VA, 20/90 [0.67 logMAR]) at the time of conversion (difference, 14 letters [0.27 logMAR]; 95% CI, 10-17 [0.20-0.34]; P < .001). Index eyes did not achieve the same level of VA as fellow eyes after an equivalent postconversion follow-up of approximately 20 months (mean VA: index eye, 20/70 [0.56 logMAR]; fellow eye, 20/50 [0.40 logMAR]; difference, 8 letters [0.15 logMAR]; 95% CI, 4-11 [0.08-0.22]; P < .001). No difference was detected between the mean number of anti-vascular endothelial growth factor injections received by fellow eyes and index eyes (9.7 vs 10.0 injections, respectively). This retrospective study suggests that fellow eyes of previously treated patients with nvAMD may achieve better VA than their index eye counterparts after an equivalent amount of follow-up. This may be because the CNV was detected and treated earlier and at a better level of VA, although it is unknown whether the frequent office visits, VA measurements, or optical coherence tomography testing was responsible for the detection at a better level of VA.

  16. Liquid egg white pasteurization using a centrifugal UV irradiator.

    PubMed

    Geveke, David J; Torres, Daniel

    2013-03-01

    Studies on UV nonthermal pasteurization of liquid egg white (LEW) are limited. The objective of this study was to inactivate Escherichia coli using a UV irradiator that centrifugally formed a thin film of LEW on the inside of a rotating cylinder. The LEW was inoculated with E. coli K12 to approximately 8 log cfu/ml and was processed at the following conditions: UV intensity 1.5 to 9.0 mW/cm²; cylinder rotational speed 450 to 750 RPM; cylinder inclination angle 15° to 45°; flow rate 300 to 900 ml/min; and treatment time 1.1 to 3.2 s. Appropriate dilutions of the samples were pour-plated with tryptic soy agar (TSA). Sublethal injury was determined using TSA+4% NaCl. The regrowth of surviving E. coli during refrigerated storage for 28 days was investigated. The electrical energy of the UV process was also determined. The results demonstrated that UV processing of LEW at a dose of 29 mJ/cm² at 10°C reduced E. coli by 5 log cfu/ml. Inactivation significantly increased with increasing UV dose and decreasing flow rate. The results at cylinder inclination angles of 30° and 45° were similar and were significantly better than those at 15°. The cylinder rotational speed had no significant effect on inactivation. The occurrence of sublethal injury was detected. Storage of UV-processed LEW at 4°C and 10°C for 21 days further reduced the population of E. coli to approximately 1 log cfu/ml, where it remained for an additional 7 days. The UV energy applied to the LEW to obtain a 5 log reduction of E. coli was 3.9 J/ml. These results suggest that LEW may be efficiently pasteurized, albeit at low flow rates, using a nonthermal UV device that centrifugally forms a thin film.

  17. Effects of pomegranate and pomegranate-apple blend juices on the growth characteristics of Alicyclobacillus acidoterrestris DSM 3922 type strain vegetative cells and spores.

    PubMed

    Molva, Celenk; Baysal, Ayse Handan

    2015-05-04

    The present study examined the growth characteristics of Alicyclobacillus acidoterrestris DSM 3922 vegetative cells and spores after inoculation into apple, pomegranate and pomegranate-apple blend juices (10, 20, 40 and 80%, v/v). Also, the effect of sporulation medium was tested using mineral-containing media [Bacillus acidoterrestris agar (BATA) and Bacillus acidocaldarius agar (BAA)] and non-mineral-containing media [potato dextrose agar (PDA) and malt extract agar (MEA)]. The juice samples were inoculated separately with approximately 10⁵ CFU/mL cells or spores from different sporulation media and then incubated at 37°C for 336 h. The number of cells decreased significantly with increasing pomegranate juice concentration in the blend juices and storage time (p<0.001). Based on the results, 3.17, 3.53, and 3.72 log cell reductions were observed in 40%, 80% blend and pomegranate juices, respectively, while the cell counts attained approximately 7.17 log CFU/mL in apple juice after 336 h. On the other hand, the cell growth was inhibited for a certain time, and then the numbers started to increase after 72 and 144 h in 10% and 20% blend juices, respectively. After 336 h, total population among spores produced on PDA, BATA, BAA and MEA indicated 1.49, 1.65, 1.67, and 1.28 log reductions in pomegranate juice; and 1.51, 1.38, 1.40 and 1.16 log reductions in 80% blend juice, respectively. The inhibitory effects of 10%, 20% and 40% blend juices varied depending on the sporulation media used. The results obtained in this study suggested that pomegranate and pomegranate-apple blend juices could inhibit the growth of A. acidoterrestris DSM 3922 vegetative cells and spores.

  18. The effects on the microbiological condition of product of carcass dressing, cooling, and portioning processes at a poultry packing plant.

    PubMed

    Gill, C O; Moza, L F; Badoni, M; Barbut, S

    2006-07-15

    The log mean numbers of aerobes, coliforms, Escherichia coli and presumptive staphylococci plus listerias on chicken carcasses and carcass portions at various stages of processing at a poultry packing plant were estimated from the numbers of those bacteria recovered from groups of 25 randomly selected product units. The fractions of listerias in the presumptive staphylococci plus listerias groups of organisms were also estimated. Samples were obtained from carcasses by excising a strip of skin measuring approximately 5 × 2 cm from a randomly selected site on each selected carcass, or by rinsing each selected carcass portion. The log mean numbers of aerobes, coliforms, E. coli and presumptive staphylococci plus listerias on carcasses after scalding at 58 degrees C and plucking were about 4.4, 2.5, 2.2 and 1.4 log cfu/cm(2), respectively. The numbers of bacteria on eviscerated carcasses were similar. After the series of operations for removing the crop, lungs, kidneys and neck, the numbers of aerobes were about 1 log unit less than on eviscerated carcasses, but the numbers of the other bacteria were not substantially reduced. After cooling in water, the numbers of coliforms and E. coli were about 1 log unit less and the numbers of presumptive staphylococci plus listerias were about 0.5 log unit less than the numbers on dressed carcasses, but the numbers of aerobes were not reduced. The numbers of aerobes were 1 log unit more on boneless breasts, and 0.5 log unit more on skin-on thighs and breasts that had been tumbled with brine, than on cooled carcasses; and presumptive staphylococci plus listerias were 0.5 log unit more on thighs than on cooled carcasses. Otherwise, the numbers of bacteria on the product were not substantially affected by processing. Listerias were <20% of the presumptive staphylococci plus listerias group of organisms recovered from product at each point in the process except after breasts were tumbled with brine, when >40% of the organisms were listerias.

  19. Identifying, counting, and characterizing superfine activated-carbon particles remaining after coagulation, sedimentation, and sand filtration.

    PubMed

    Nakazawa, Yoshifumi; Matsui, Yoshihiko; Hanamura, Yusuke; Shinno, Koki; Shirasaki, Nobutaka; Matsushita, Taku

    2018-07-01

    Superfine powdered activated carbon (SPAC; particle diameter ∼1 μm) has greater adsorptivity for organic molecules than conventionally sized powdered activated carbon (PAC). Although SPAC is currently used in the pretreatment to membrane filtration at drinking water purification plants, it is not used in conventional water treatment consisting of coagulation-flocculation, sedimentation, and rapid sand filtration (CSF), because it is unclear whether CSF can adequately remove SPAC from the water. In this study, we therefore investigated the residual SPAC particles in water after CSF treatment. First, we developed a method to detect and quantify trace concentrations of carbon particles in the sand filtrate. This method consisted of 1) sampling particles with a membrane filter and then 2) using image analysis software to manipulate a photomicrograph of the filter so that black spots with a diameter >0.2 μm (considered to be carbon particles) could be visualized. Use of this method revealed that CSF removed a very high percentage of SPAC: approximately 5-log in terms of particle number concentrations and approximately 6-log in terms of particle volume concentrations. When waters containing 7.5-mg/L SPAC and 30-mg/L PAC, concentrations that achieved the same adsorption performance, were treated, the removal rate of SPAC was somewhat superior to that of PAC, and the residual particle number concentrations for SPAC and PAC were at the same low level (100-200 particles/mL). Together, these results suggest that SPAC can be used in place of PAC in CSF treatment without compromising the quality of the filtered water in terms of particulate matter contamination. However, it should be noted that the activated carbon particles after sand filtration were smaller in terms of particle size and were charge-neutralized to a lesser extent than the activated carbon particles before sand filtration. Therefore, the tendency of small particles to escape in the filtrate would appear to be related to the fact that their small size leads to a low destabilization rate during the coagulation process and a low collision rate during the flocculation and filtration processes. Copyright © 2018 Elsevier Ltd. All rights reserved.
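
    The detection step, thresholding a photomicrograph and keeping dark spots larger than 0.2 μm, can be illustrated with scikit-image. This is a hypothetical re-creation; the file name, Otsu thresholding choice, and pixel calibration are assumptions, not the authors' software:

    ```python
    from skimage import io, measure
    from skimage.filters import threshold_otsu

    UM_PER_PIXEL = 0.05  # assumed microscope calibration

    # Threshold a grayscale photomicrograph of the membrane filter; carbon
    # particles appear as dark (black) spots.
    image = io.imread("filter_micrograph.png", as_gray=True)
    dark = image < threshold_otsu(image)

    # Label connected dark regions and keep those with equivalent diameter > 0.2 um.
    labels = measure.label(dark)
    particles = [
        r for r in measure.regionprops(labels)
        if r.equivalent_diameter * UM_PER_PIXEL > 0.2
    ]
    print(f"{len(particles)} candidate carbon particles > 0.2 um")
    ```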

  20. Make the most of your samples: Bayes factor estimators for high-dimensional models of sequence evolution.

    PubMed

    Baele, Guy; Lemey, Philippe; Vansteelandt, Stijn

    2013-03-06

    Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model's marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. We here assess the original 'model-switch' path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model's marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation.
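
    As background for the methods compared here: the generic stepping-stone estimator of a single model's log marginal likelihood can be written in a few lines, given log-likelihood values of MCMC draws from each power posterior. This is a textbook sketch; the paper's contribution, estimating the Bayes factor directly along a path between two models, is more involved:

    ```python
    import numpy as np
    from scipy.special import logsumexp

    def stepping_stone_log_ml(log_lik_draws, betas):
        """Stepping-stone estimate of a log marginal likelihood.

        log_lik_draws[k] holds log-likelihood values of MCMC draws from the
        power posterior p(D|theta)^beta * p(theta) at inverse temperature
        betas[k]; betas runs from 0.0 (prior) up to, but not including, 1.0.
        """
        log_ml = 0.0
        for k, beta in enumerate(betas):
            next_beta = betas[k + 1] if k + 1 < len(betas) else 1.0
            ll = np.asarray(log_lik_draws[k])
            # Ratio of normalizing constants between adjacent stones:
            log_ml += logsumexp((next_beta - beta) * ll) - np.log(len(ll))
        return log_ml

    # A log Bayes factor is then a difference of two such estimates:
    # log_bf = stepping_stone_log_ml(draws_m1, betas) - stepping_stone_log_ml(draws_m0, betas)
    ```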

  1. Make the most of your samples: Bayes factor estimators for high-dimensional models of sequence evolution

    PubMed Central

    2013-01-01

    Background Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model’s marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. Results We here assess the original ‘model-switch’ path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model’s marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. Conclusions We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation. PMID:23497171

  2. Evaluation of the effects of ultraviolet light on bacterial contaminants inoculated into whole milk and colostrum, and on colostrum immunoglobulin G.

    PubMed

    Pereira, R V; Bicalho, M L; Machado, V S; Lima, S; Teixeira, A G; Warnick, L D; Bicalho, R C

    2014-05-01

    Raw milk and colostrum can harbor dangerous microorganisms that can pose serious health risks for animals and humans. According to the USDA, more than 58% of calves in the United States are fed unpasteurized milk. The aim of this study was to evaluate the effect of UV light on reduction of bacteria in milk and colostrum, and on colostrum IgG. A pilot-scale UV light continuous (UVC) flow-through unit (45 J/cm(2)) was used to treat milk and colostrum. Colostrum and sterile whole milk were inoculated with Listeria innocua, Mycobacterium smegmatis, Salmonella serovar Typhimurium, Escherichia coli, Staphylococcus aureus, Streptococcus agalactiae, and Acinetobacter baumannii before being treated with UVC. During UVC treatment, samples were collected at 5 time points and bacteria were enumerated using selective media. The effect of UVC on IgG was evaluated using raw colostrum from a nearby dairy farm without the addition of bacteria. For each colostrum batch, samples were collected at several different time points and IgG was measured using ELISA. The UVC treatment of milk resulted in a significant final count (log cfu/mL) reduction of Listeria monocytogenes (3.2 ± 0.3 log cfu/mL reduction), Salmonella spp. (3.7 ± 0.2 log cfu/mL reduction), Escherichia coli (2.8 ± 0.2 log cfu/mL reduction), Staph. aureus (3.4 ± 0.3 log cfu/mL reduction), Streptococcus spp. (3.4 ± 0.4 log cfu/mL reduction), and A. baumannii (2.8 ± 0.2 log cfu/mL reduction). The UVC treatment of milk did not result in a significant final count (log cfu/mL) reduction for M. smegmatis (1.8 ± 0.5 log cfu/mL reduction). The UVC treatment of colostrum was significantly associated with a final reduction of bacterial count (log cfu/mL) of Listeria spp. (1.4 ± 0.3 log cfu/mL reduction), Salmonella spp. (1.0 ± 0.2 log cfu/mL reduction), and Acinetobacter spp. (1.1 ± 0.3 log cfu/mL reduction), but not of E. coli (0.5 ± 0.3 log cfu/mL reduction), Strep. agalactiae (0.8 ± 0.2 log cfu/mL reduction), and Staph. aureus (0.4 ± 0.2 log cfu/mL reduction). The UVC treatment of colostrum significantly decreased the IgG concentration, with an observed final mean IgG reduction of approximately 50%. Development of new methods to reduce bacterial contaminants in colostrum must take into consideration the barriers imposed by its opacity and organic components, and account for the incidental damage to IgG caused by manipulating colostrum. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  3. 78 FR 35357 - Bridgestone Americas Tire Operations, LLC, Grant of Petition for Decision of Inconsequential...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-12

    ... the petition and all supporting documents log onto the Federal Docket Management System (FDMS) Web... approximately 467 Firestone brand Transforce AT, size LT265/70R17, light truck replacement tires manufactured... participants had knowledge of tire labeling beyond the tire brand name, tire size, and tire pressure. Since...

  4. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  5. Bridging the gap between ecosystem theory and forest watershed management

    Treesearch

    Jackson Webster; Wayne Swank; James Vose; Jennifer Knoepp; Katherine Elliott

    2014-01-01

    The history of forests and logging in North America provides a backdrop for our study of Watershed (WS) 7. Prior to European settlement, potentially commercial forests covered approximately 45% of North America, but not all of it was the pristine, ancient forest that some have imagined. Prior to 1492, Native Americans had extensive settlements throughout eastern...

  6. Performance of Low-Density Parity-Check Coded Modulation

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2010-01-01

    This paper reports the simulated performance of each of the nine accumulate-repeat-4-jagged-accumulate (AR4JA) low-density parity-check (LDPC) codes [3] when used in conjunction with binary phase-shift keying (BPSK), quadrature PSK (QPSK), 8-PSK, 16-ary amplitude PSK (16-APSK), and 32-APSK. We also report the performance under various mappings of bits to modulation symbols, 16-APSK and 32-APSK ring scalings, log-likelihood ratio (LLR) approximations, and decoder variations. One of the simple and well-performing LLR approximations can be expressed in a general equation that applies to all of the modulation types.
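
    The specific LLR approximation is not reproduced in this record; one simple, general-purpose candidate with the same flavor is the max-log rule, sketched here for an arbitrary complex constellation (the labeling and noise model are illustrative):

    ```python
    import numpy as np

    def max_log_llr(r, points, labels, noise_var):
        """Max-log LLR approximation for each bit of a received complex symbol r.

        points: complex constellation points; labels: their integer bit labels.
        The exact LLR is a log-sum-exp over all points; max-log keeps only the
        nearest point for each bit value (positive LLR favors bit = 0).
        """
        d2 = np.abs(r - points) ** 2 / noise_var      # scaled squared distances
        nbits = int(np.max(labels)).bit_length()
        llrs = []
        for b in range(nbits):
            bit = (labels >> b) & 1
            llrs.append(np.min(d2[bit == 1]) - np.min(d2[bit == 0]))
        return llrs

    # QPSK with natural (binary) labeling, unit average energy:
    pts = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
    print(max_log_llr(0.9 + 0.8j, pts, np.arange(4), noise_var=0.5))
    ```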

  7. Constructing compact and effective graphs for recommender systems via node and edge aggregations

    DOE PAGES

    Lee, Sangkeun; Kahng, Minsuk; Lee, Sang-goo

    2014-12-10

    Exploiting graphs for recommender systems has great potential to flexibly incorporate heterogeneous information for producing better recommendation results. As our baseline approach, we first introduce a naive graph-based recommendation method, which operates with a heterogeneous log-metadata graph constructed from user log and content metadata databases. Although the naive graph-based recommendation method is simple, it allows us to take advantage of heterogeneous information and shows promising flexibility and recommendation accuracy. However, it often leads to extensive processing time due to the sheer size of the graphs constructed from entire user log and content metadata databases. In this paper, we propose node and edge aggregation approaches to constructing compact and effective graphs called Factor-Item bipartite graphs by aggregating nodes and edges of a log-metadata graph. Furthermore, experimental results using real world datasets indicate that our approach can significantly reduce the size of graphs exploited for recommender systems without sacrificing the recommendation quality.
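
    Very loosely, the two aggregations can be illustrated with networkx on a toy log; the node names and the personalized-PageRank scoring below are assumptions made for illustration, not the paper's exact Factor-Item construction or ranking method:

    ```python
    import networkx as nx

    events = [("u1", "i1"), ("u1", "i1"), ("u1", "i2"), ("u2", "i2")]  # raw user log
    genre = {"i1": "jazz", "i2": "rock"}                               # item metadata

    g = nx.Graph()
    # Edge aggregation: repeated log events collapse into one weighted edge.
    for user, item in events:
        w = g.get_edge_data(user, item, {"weight": 0})["weight"]
        g.add_edge(user, item, weight=w + 1)
    # Node aggregation: metadata values become factor nodes linked to items,
    # leaving a compact factor-item bipartite core instead of per-event nodes.
    for item, gen in genre.items():
        g.add_edge(f"genre:{gen}", item, weight=1)

    # Rank nodes for user u1 with a personalized random walk.
    scores = nx.pagerank(g, personalization={"u1": 1.0}, weight="weight")
    print(sorted(scores, key=scores.get, reverse=True)[:3])
    ```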

  8. An artificial intelligence approach to lithostratigraphic correlation using geophysical well logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olea, R.A.; Davis, J.C.

    1986-01-01

    Computer programs for lithostratigraphic correlation of well logs have achieved limited success. Their algorithms are based on an oversimplified view of the manual process used by analysts to establish geologically correct correlations. The programs experience difficulties if the correlated rocks deviate from an ideal geometry of perfectly homogeneous, parallel layers of infinite extent. Artificial intelligence provides a conceptual basis for formulating the task of lithostratigraphic correlation, leading to more realistic procedures. A prototype system using the 'production rule' approach of expert systems successfully correlates well logs in areas of stratigraphic complexity. Two digitized logs are used per well, one for curve matching and the other for lithologic comparison. The software has been successfully used to correlate more than 100,000 ft (30 480 m) of section, through clastic sequences in Louisiana and through carbonate sequences in Kansas. Correlations have been achieved even in the presence of faults, unconformities, facies changes, and lateral variations in bed thickness.

  9. Development of Enabling Scientific Tools to Characterize the Geologic Subsurface at Hanford

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenna, Timothy C.; Herron, Michael M.

    2014-07-08

    This final report to the Department of Energy provides a summary of activities conducted under our exploratory grant, funded through the U.S. DOE Subsurface Biogeochemical Research Program in the category of enabling scientific tools, which covers the period from July 15, 2010 to July 14, 2013. The main goal of this exploratory project is to determine the parameters necessary to translate existing borehole log data into reservoir properties following scientifically sound petrophysical relationships. For this study, we focused on samples and Ge-based spectral gamma logging system (SGLS) data collected from wells located in the Hanford 300 Area. The main activities consisted of 1) the analysis of available core samples for a variety of mineralogical, chemical, and physical properties; 2) evaluation of selected spectral gamma logs, environmental corrections, and calibration; and 3) development of algorithms and a proposed workflow that permits translation of log responses into useful reservoir properties such as lithology, matrix density, porosity, and permeability. These techniques have been successfully employed in the petroleum industry; however, the approach is relatively new when applied to subsurface remediation. This exploratory project has been successful in meeting its stated objectives. We have demonstrated that our approach can lead to an improved interpretation of existing well log data. The algorithms we developed can utilize available log data, in particular gamma and spectral gamma logs, and continued optimization will improve their application to ERSP goals of understanding subsurface properties.

  10. Bin Packing, Number Balancing, and Rescaling Linear Programs

    NASA Astrophysics Data System (ADS)

    Hoberg, Rebecca

    This thesis deals with several important algorithmic questions using techniques from diverse areas including discrepancy theory, machine learning, and lattice theory. In Chapter 2, we construct an improved approximation algorithm for a classical NP-complete problem, the bin packing problem. In this problem, the goal is to pack items of sizes s_i ∈ [0,1] into as few bins as possible, where a set of items fits into a bin provided the sum of the item sizes is at most one. We give a polynomial-time rounding scheme for a standard linear programming relaxation of the problem, yielding a packing that uses at most OPT + O(log OPT) bins. This makes progress towards one of the "10 open problems in approximation algorithms" stated in the book of Shmoys and Williamson. In fact, based on related combinatorial lower bounds, Rothvoss conjectures that Θ(log OPT) may be a tight bound on the additive integrality gap of this LP relaxation. In Chapter 3, we give a new polynomial-time algorithm for linear programming. Our algorithm is based on the multiplicative weights update (MWU) method, which is a general framework that is currently of great interest in theoretical computer science. An algorithm for linear programming based on MWU was known previously, but was not polynomial time; we remedy this by alternating between a MWU phase and a rescaling phase. The rescaling methods we introduce improve upon previous methods by reducing the number of iterations needed until one can rescale, and they can be used for any algorithm with a similar rescaling structure. Finally, we note that the MWU phase of the algorithm has a simple interpretation as gradient descent of a particular potential function, and we show we can speed up this phase by walking in a direction that decreases both the potential function and its gradient. In Chapter 4, we show that an approximate oracle for Minkowski's Theorem gives an approximate oracle for the number balancing problem (NBP), and conversely. Number balancing is the problem of minimizing |〈a,x〉| over x ∈ {-1,0,1}^n \ {0}, given a ∈ [0,1]^n. While an application of the pigeonhole principle shows that there always exists x with |〈a,x〉| ≤ O(√n/2^n), the best known algorithm only guarantees |〈a,x〉| ≤ n^(-Θ(log n)). We show that an oracle for Minkowski's Theorem with approximation factor ρ would give an algorithm for NBP that guarantees |〈a,x〉| ≤ 2^(-n^(Θ(1/ρ))). In particular, this would beat the bound of Karmarkar and Karp provided ρ ≤ O(log n / log log n). In the other direction, we prove that any polynomial-time algorithm for NBP that guarantees a solution of difference at most 2^(√n)/2^n would give a polynomial approximation for Minkowski's Theorem as well as a polynomial factor approximation algorithm for the Shortest Vector Problem.
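
    For context, the Karmarkar–Karp bound cited above comes from the largest-differencing heuristic, sketched below for the two-sided variant (signs in {-1,+1}, i.e., partitioning); it returns the achievable difference, not the sign vector, and is an illustration rather than anything from the thesis:

    ```python
    import heapq

    def karmarkar_karp_difference(a):
        """Largest-differencing heuristic: repeatedly replace the two largest
        values with their difference. The final value is an achievable
        |<a, x>| for some signs x in {-1, +1}^n."""
        heap = [-v for v in a]          # max-heap via negation
        heapq.heapify(heap)
        while len(heap) > 1:
            x = -heapq.heappop(heap)
            y = -heapq.heappop(heap)
            heapq.heappush(heap, -(x - y))
        return -heap[0]

    print(karmarkar_karp_difference([0.31, 0.62, 0.44, 0.75, 0.29]))  # -> 0.03
    ```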

  11. Effectiveness of lemon juice in the elimination of Salmonella Typhimurium in stuffed mussels.

    PubMed

    Kişla, Duygu

    2007-12-01

    Street foods are becoming more and more prominent in countries all over the world. There are many reports of disease due to consumption of street foods contaminated by pathogens. With the modern trend toward more natural preservatives, the use of organic acids can achieve good microbiological safety in food. In the present study, stuffed mussels were inoculated with Salmonella Typhimurium suspension to provide initial populations of approximately 6 and 3 log CFU/g. After inoculation, samples were treated with fresh lemon juice and lemon dressing for 0, 5, and 15 min, and pathogens were enumerated by direct plating on brilliant green agar. Treatment of stuffed mussels inoculated at the high inoculum level with lemon juice and lemon dressing for different exposure times caused reductions ranging between 0.25 and 0.56 log CFU/g and 0.5 and 0.69 log CFU/g, respectively, whereas in stuffed-mussel samples inoculated at the low level, lemon juice and lemon dressing caused 0.08 to 0.25 log CFU/g and 0.22 to 0.78 log CFU/g reductions, respectively. Results of the study showed that both lemon juice and lemon dressing, used as flavoring and acidifying agents for stuffed mussels, caused a slight immediate decrease in Salmonella Typhimurium, and this effect increased with time. However, treatment of stuffed mussels with the inhibitors for up to 15 min is not enough to prevent Salmonella Typhimurium outbreaks related to stuffed mussels.

  12. Results of well-bore flow logging for six water-production wells completed in the Santa Fe Group aquifer system, Albuquerque, New Mexico, 1996-98

    USGS Publications Warehouse

    Thorn, Conde R.

    2000-01-01

    Over the last several years, an improved conceptual understanding of the aquifer system in the Albuquerque area, New Mexico, has led to better knowledge about the location and extent of the aquifer system. This information will aid with the refinement of ground-water simulation and with the location of sites for future water-production wells. With an impeller-type flowmeter, well-bore flow was logged under pumping conditions along the screened interval of the well bore in six City of Albuquerque water-production wells: the Ponderosa 3, Love 6, Volcano Cliffs 1, Gonzales 2, Zamora 2, and Gonzales 3 wells. From each of these six wells, a well-bore flow log was collected that represents the cumulative upward well-bore flow. Evaluation of the well-bore flow log for each well allowed delineation of the more productive zones supplying water to the well along the logged interval. Yields from the more productive zones in the six wells ranged from about 70 to 880 gallons per minute. The lithology of these zones is predominantly gravel and sand with varying amounts of sandy clay.

  13. Inactivation of Escherichia coli O157:H7, Salmonella typhimurium DT104, and Listeria monocytogenes on inoculated alfalfa seeds with a fatty acid-based sanitizer.

    PubMed

    Pierre, Pascale M; Ryser, Elliot T

    2006-03-01

    Alfalfa seeds were inoculated with a three-strain cocktail of Escherichia coli O157:H7, Salmonella enterica subsp. enterica serovar Typhimurium DT104, or Listeria monocytogenes by immersion to contain approximately 6 to 8 log CFU/g and then treated with a fatty acid-based sanitizer containing 250 ppm of peroxyacid, 1,000 ppm of caprylic and capric acids (Emery 658), 1,000 ppm of lactic acid, and 500 ppm of glycerol monolaurate at a reference concentration of 1X. Inoculated seeds were immersed at sanitizer concentrations of 5X, 10X, and 15X for 1, 3, 5, and 10 min and then assessed for pathogen survivors by direct plating. The lowest concentration that decreased all three pathogens by >5 log was 15X. After a 3-min exposure to the 15X concentration, populations of E. coli O157:H7, Salmonella Typhimurium DT104, and L. monocytogenes decreased by >5.45, >5.62, and >6.92 log, respectively, with no sublethal injury and no significant loss in seed germination rate or final sprout yield. The components of this 15X concentration (treatment A) were assessed independently and in various combinations to optimize antimicrobial activity. With inoculated seeds, treatment C (15,000 ppm of Emery 658, 15,000 ppm of lactic acid, and 7,500 ppm of glycerol monolaurate) decreased Salmonella Typhimurium, E. coli O157:H7, and L. monocytogenes by 6.23 and 5.57 log, 4.77 and 6.29 log, and 3.86 and 4.21 log after 3 and 5 min of exposure, respectively. Treatment D (15,000 ppm of Emery 658 and 15,000 ppm of lactic acid) reduced Salmonella Typhimurium by >6.90 log regardless of exposure time and E. coli O157:H7 and L. monocytogenes by 4.60 and >5.18 log and 3.55 and 3.14 log after 3 and 5 min, respectively. No significant differences (P > 0.05) were found between treatments A, C, and D. Overall, treatment D, which contained Emery 658 and lactic acid as active ingredients, reduced E. coli O157:H7, Salmonella Typhimurium, and L. monocytogenes populations by 3.55 to >6.90 log and may provide a viable alternative to the recommended 20,000 ppm of chlorine for sanitizing alfalfa seeds.

  14. A scale-invariant keypoint detector in log-polar space

    NASA Astrophysics Data System (ADS)

    Tao, Tao; Zhang, Yun

    2017-02-01

    The scale-invariant feature transform (SIFT) algorithm is devised to detect keypoints via the difference of Gaussian (DoG) images. However, the DoG data lacks the high-frequency information, which can lead to a performance drop of the algorithm. To address this issue, this paper proposes a novel log-polar feature detector (LPFD) to detect scale-invariant blobs (keypoints) in log-polar space, which, in contrast, can retain all the image information. The algorithm consists of three components, viz. keypoint detection, descriptor extraction, and descriptor matching. In addition, the algorithm is evaluated in detecting keypoints from the INRIA dataset by comparing with the SIFT algorithm and one of its fast versions, the speeded-up robust features (SURF) algorithm, in terms of correspondences, repeatability, correct matches, and matching score.
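
    The core of any log-polar detector is the resampling that turns scale changes into shifts. A numpy-only sketch with nearest-neighbor sampling; all parameters are illustrative and this is not the LPFD implementation:

    ```python
    import numpy as np

    def log_polar_sample(img, center, n_r=64, n_theta=64):
        """Resample a grayscale image onto a (log r, theta) grid.

        Scaling the image about `center` shifts the result along the log-r
        axis; rotation shifts it along the theta axis.
        """
        cy, cx = center
        r_max = min(cy, cx, img.shape[0] - 1 - cy, img.shape[1] - 1 - cx)
        log_r = np.linspace(0.0, np.log(r_max), n_r)
        theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
        rr = np.exp(log_r)[:, None]                   # radii, shape (n_r, 1)
        ys = (cy + rr * np.sin(theta)).astype(int)    # broadcast to (n_r, n_theta)
        xs = (cx + rr * np.cos(theta)).astype(int)
        return img[ys, xs]

    img = np.random.rand(128, 128)
    patch = log_polar_sample(img, center=(64, 64))
    print(patch.shape)  # (64, 64)
    ```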

  15. Lithium in giant stars in NGC 752 and M67

    NASA Astrophysics Data System (ADS)

    Pilachowski, Catherine; Saha, A.; Hobbs, L. M.

    1988-04-01

    Spectra of giant stars in the intermediate-age galactic cluster NGC 752 and in the old cluster M67 have been examined for the presence of Li I λ6707. The lithium feature is not present in any of the M67 giants observed, leading to upper-limit abundances of log ɛ(Li) ≤ -1.0 to 0.3. While lithium is not present in most NGC 752 giants, the feature is strong in two giants, Heinemann 77 and 208, log ɛ(Li) = +1.1 and +1.4, respectively. In the remaining giants in NGC 752, log ɛ(Li) < 0.5. The absence of lithium in M67 giants may be because these giants evolve from progenitors in the region of the main-sequence lithium dip.

  16. Check-off logs for routine equipment maintenance.

    PubMed

    Brewster, M A; Carver, P H; Randolph, B

    1995-12-01

    The regulatory requirement for appropriate routine instrument maintenance documentation is approached by laboratories in numerous ways. Standard operating procedures (SOPs) may refer to maintenance listed in instrument manuals, may indicate "periodic" performance of an action, or may indicate specific tasks to be performed at certain frequencies. The Quality Assurance Unit (QAU) task of assuring the performance of these indicated maintenance tasks can be extremely laborious if these records are merged with other analysis records. Further, the lack of written maintenance schedules often leads to omission of infrequently performed tasks. We recommend creation of routine maintenance check-off logs for instruments with tasks grouped by frequency of expected performance. Usage of such logs should result in better laboratory compliance with SOPs and the compliance can be readily monitored by QAU or by regulatory agencies.

  17. Fermi-edge singularity and the functional renormalization group

    NASA Astrophysics Data System (ADS)

    Kugler, Fabian B.; von Delft, Jan

    2018-05-01

    We study the Fermi-edge singularity, describing the response of a degenerate electron system to optical excitation, in the framework of the functional renormalization group (fRG). Results for the (interband) particle-hole susceptibility from various implementations of fRG (one- and two-particle-irreducible, multi-channel Hubbard–Stratonovich, flowing susceptibility) are compared to the summation of all leading logarithmic (log) diagrams, achieved by a (first-order) solution of the parquet equations. For the (zero-dimensional) special case of the x-ray-edge singularity, we show that the leading log formula can be analytically reproduced in a consistent way from a truncated, one-loop fRG flow. However, reviewing the underlying diagrammatic structure, we show that this derivation relies on fortuitous partial cancellations special to the form of and accuracy applied to the x-ray-edge singularity and does not generalize.

  18. Growth and survival of Escherichia coli O157:H7 and Listeria monocytogenes in egg products held at different temperatures.

    PubMed

    Yang, S E; Chou, C C

    2000-07-01

    Growth and survival of Escherichia coli O157:H7 and Listeria monocytogenes in steamed eggs and scrambled eggs held at different temperatures (5, 18, 22, 37, 55, and 60 degrees C) were investigated in the present study. Among the holding temperatures tested, both pathogens multiplied best at 37 degrees C, followed by 22, 18, and 5 degrees C. In general, E. coli O157:H7 grew better in the egg products than L. monocytogenes did at all the storage temperatures tested except at 5 degrees C. E. coli O157:H7 did not grow in steamed eggs and scrambled eggs held at 5 degrees C. L. monocytogenes showed a slight population increase of approximately 0.6 to 0.9 log CFU/g in these egg products at the end of the 36-h storage period at 5 degrees C. The population of both pathogens detected in the egg products was affected by the initial population, holding temperature, and length of the holding period. It was also noted that L. monocytogenes was more susceptible than E. coli O157:H7 in steamed eggs held at 60 degrees C. After holding at 60 degrees C for 1 h, no detectable viable cells of L. monocytogenes, corresponding to a population reduction of 5.4 log CFU/g, were observed in steamed eggs, whereas a lower population reduction of only approximately 0.5 log CFU/ml was noted for E. coli O157:H7.

  19. Analytical approximations for effective relative permeability in the capillary limit

    NASA Astrophysics Data System (ADS)

    Rabinovich, Avinoam; Li, Boxiao; Durlofsky, Louis J.

    2016-10-01

    We present an analytical method for calculating two-phase effective relative permeability, k_rj^eff, where j designates phase (here CO2 and water), under steady-state and capillary-limit assumptions. These effective relative permeabilities may be applied in experimental settings and for upscaling in the context of numerical flow simulations, e.g., for CO2 storage. An exact solution for effective absolute permeability, k_eff, in two-dimensional log-normally distributed isotropic permeability (k) fields is the geometric mean. We show that this does not hold for k_rj^eff, since log normality is not maintained in the capillary-limit phase permeability field (K_j = k·k_rj) when capillary pressure, and thus the saturation field, is varied. Nevertheless, the geometric mean is still shown to be suitable for approximating k_rj^eff when the variance of ln k is low. For high-variance cases, we apply a correction to the geometric-average gas effective relative permeability using a Winsorized mean, which neglects large and small K_j values symmetrically. The analytical method is extended to anisotropically correlated log-normal permeability fields using power-law averaging. In these cases, the Winsorized-mean treatment is applied to the gas curves for cases described by negative power-law exponents (flow across incomplete layers). The accuracy of our analytical expressions for k_rj^eff is demonstrated through extensive numerical tests, using low-variance and high-variance permeability realizations with a range of correlation structures. We also present integral expressions for geometric-mean and power-law-average k_rj^eff for the systems considered, which enable derivation of closed-form series solutions for k_rj^eff without generating permeability realizations.
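
    The three averaging operators the method relies on (geometric mean, power-law mean, Winsorized mean) are compact in code. A sketch on a synthetic log-normal field; the exponent and trimming limits are arbitrary illustrations, not values from the paper:

    ```python
    import numpy as np
    from scipy.stats.mstats import winsorize

    rng = np.random.default_rng(0)
    k = np.exp(rng.normal(loc=0.0, scale=1.0, size=10_000))  # log-normal k field

    # Geometric mean: the exact k_eff for 2D isotropic log-normal fields.
    k_geo = np.exp(np.mean(np.log(k)))

    def power_law_average(vals, p):
        """Power-law (generalized) mean; p -> 0 recovers the geometric mean."""
        return np.mean(vals ** p) ** (1.0 / p)

    k_aniso = power_law_average(k, p=-0.5)  # illustrative negative exponent

    # Winsorized mean: symmetrically clip the smallest/largest 5% before
    # averaging, as in the paper's high-variance gas-phase correction.
    k_winsor = np.mean(winsorize(k, limits=(0.05, 0.05)))

    print(k_geo, k_aniso, k_winsor)
    ```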

  20. Effects of lactic acid and commercial chilling processes on survival of Salmonella, Yersinia enterocolitica, and Campylobacter coli in pork variety meats.

    PubMed

    King, Amanda M; Miller, Rhonda K; Castillo, Alejandro; Griffin, Davey B; Hardin, Margaret D

    2012-09-01

    Current industry chilling practices with and without the application of 2% L-lactic acid were compared for their effectiveness at reducing levels of Salmonella, Yersinia enterocolitica, and Campylobacter coli on pork variety meats. Pork variety meats (livers, intestines, hearts, and stomachs) were inoculated individually with one of the three pathogens and subjected to five different treatment combinations that included one or more of the following: water wash (25°C), lactic acid spray (2%, 40 to 50°C), chilling (4°C), and freezing (-15°C). Samples were analyzed before treatment, after each treatment step, and after 2, 4, and 6 months of frozen storage. Results showed that when a lactic acid spray was used in combination with water spray, immediate reductions were approximately 0.5 log CFU per sample of Salmonella, 0.8 log CFU per sample of Y. enterocolitica, and 1.1 log CFU per sample of C. coli. Chilling, both alone and in combination with spray treatments, had little effect on pathogens, while freezing resulted in additional 0.5-log CFU per sample reductions in levels of Salmonella and Y. enterocolitica, and an additional 1.0-log CFU per sample reduction in levels of C. coli. While reductions of at least 1 log CFU per sample were observed on variety meats treated with only a water wash and subsequently frozen, samples treated with lactic acid had greater additional reductions than those treated with only a water spray throughout frozen storage. The results of this study suggest that the use of lactic acid as a decontamination intervention, when used in combination with good manufacturing practices during processing, causes significant reductions in levels of Salmonella, Y. enterocolitica, and C. coli on pork variety meats.

  1. Lethal photosensitization of periodontal pathogens by a red-filtered Xenon lamp in vitro.

    PubMed

    Matevski, Donco; Weersink, Robert; Tenenbaum, Howard C; Wilson, Brian; Ellen, Richard P; Lépine, Guylaine

    2003-08-01

    The ability of Helium-Neon (He-Ne) laser irradiation of a photosensitizer to induce localized phototoxic effects that kill periodontal pathogens is well documented and is termed photodynamic therapy (PDT). We investigated the potential of a conventional light source (red-filtered Xenon lamp) to activate toluidine blue O (TBO) in vitro and determined in vitro model parameters that may be used in future in vivo trials. Porphyromonas gingivalis 381 was used as the primary test bacterium. Treatment with a 2.2 J/cm² light dose and 50 µg/ml TBO concentration resulted in a bacterial kill of 2.43 ± 0.39 logs with the He-Ne laser control and 3.34 ± 0.24 logs with the lamp, a near 10-fold increase (p = 0.028). Increases in light intensity produced significantly higher killing (p = 0.012) that plateaued at 25 mW/cm². There was a linear relationship between light dose and bacterial killing (r² = 0.916); as light dose was increased, bacterial survival decreased. No such relationship was found for the drug concentrations tested. Addition of serum or blood at 50% v/v to the P. gingivalis suspension prior to irradiation diminished killing from approximately 5 logs to 3 logs at 10 J/cm². When serum was washed off, killing returned to 5 logs for all species tested except Bacteroides forsythus (3.92 ± 0.68 logs kill). The data indicate that PDT utilizing a conventional light source is at least as effective as laser-induced treatment in vitro. Furthermore, PDT achieves significant bactericidal activity in the presence of serum and blood when used with the set parameters of 10 J/cm², 100 mW/cm² and 12.5 µg/ml TBO.

  2. Hydrologic properties of coal beds in the Powder River Basin, Montana I. Geophysical log analysis

    USGS Publications Warehouse

    Morin, R.H.

    2005-01-01

    As part of a multidisciplinary investigation designed to assess the implications of coal-bed methane development on water resources for the Powder River Basin of southeastern Montana, six wells were drilled through Paleocene-age coal beds along a 31-km east-west transect within the Tongue River drainage basin. Analysis of geophysical logs obtained in these wells provides insight into the hydrostratigraphic characteristics of the coal and interbedded siliciclastic rocks and their possible interaction with the local stress field. Natural gamma and electrical resistivity logs were effective in distinguishing individual coal beds. Full-waveform sonic logs were used to determine elastic properties of the coal, and an attendant estimate of aquifer storage is in reasonable agreement with that computed from a pumping test. Inspection of magnetically oriented images of the borehole walls generated from both acoustic and optical televiewers and comparison with coal cores infer a face cleat orientation of approximately N33°E, in close agreement with regional lineament patterns and the northeast trend of the nearby Tongue River. The local tectonic stress field in this physiographic province, as inferred from a nearby 1984 earthquake, denotes an oblique strike-slip faulting regime with dominant east-west compression and north-south extension. These stress directions are coincident with those of the primary fracture sets identified from the televiewer logs and also with the principal axes of the drawdown ellipse produced from a complementary aquifer test, but oblique to apparent cleat orientation. Consequently, examination of these geophysical logs within the context of local hydrologic characteristics indicates that transverse transmissivity anisotropy in these coals is predominantly controlled by bedding configuration and perhaps a mechanical response to the contemporary stress field rather than solely by cleat structure.

  3. Alternative hand contamination technique to compare the activities of antimicrobial and nonantimicrobial soaps under different test conditions.

    PubMed

    Fuls, Janice L; Rodgers, Nancy D; Fischler, George E; Howard, Jeanne M; Patel, Monica; Weidner, Patrick L; Duran, Melani H

    2008-06-01

    Antimicrobial hand soaps provide a greater bacterial reduction than nonantimicrobial soaps. However, the link between greater bacterial reduction and a reduction of disease has not been definitively demonstrated. Confounding factors, such as compliance, soap volume, and wash time, may all influence the outcomes of studies. The aim of this work was to examine the effects of wash time and soap volume on the relative activities and the subsequent transfer of bacteria to inanimate objects for antimicrobial and nonantimicrobial soaps. Increasing the wash time from 15 to 30 seconds increased reduction of Shigella flexneri from 2.90 to 3.33 log(10) counts (P = 0.086) for the antimicrobial soap, while nonantimicrobial soap achieved reductions of 1.72 and 1.67 log(10) counts (P > 0.6). Increasing soap volume increased bacterial reductions for both the antimicrobial and the nonantimicrobial soaps. When the soap volume was normalized based on weight (approximately 3 g), nonantimicrobial soap reduced Serratia marcescens by 1.08 log(10) counts, compared to the 3.83-log(10) reduction caused by the antimicrobial soap (P < 0.001). The transfer of Escherichia coli to plastic balls following a 15-second hand wash with antimicrobial soap resulted in a bacterial recovery of 2.49 log(10) counts, compared to the 4.22-log(10) (P < 0.001) bacterial recovery on balls handled by hands washed with nonantimicrobial soap. This indicates that nonantimicrobial soap was less active and that the effectiveness of antimicrobial soaps can be improved with longer wash time and greater soap volume. The transfer of bacteria to objects was significantly reduced due to greater reduction in bacteria following the use of antimicrobial soap.

  4. Simulated-use validation of a sponge ATP method for determining the adequacy of manual cleaning of endoscope channels.

    PubMed

    Alfa, Michelle J; Olson, Nancy

    2016-05-04

    The objective of this study was to validate the relative light unit (RLU) cut-off of adequate cleaning of flexible colonoscopes for an ATP (adenosine tri-phosphate) test kit that used a sponge channel collection method. This was a simulated-use study. The instrument channel segment of a flexible colonoscope was soiled with ATS (artificial test soil) containing approximately 8 Log10 Enterococcus faecalis and Pseudomonas aeruginosa/mL. Full cleaning, partial cleaning and no cleaning were evaluated for ATP, protein and bacterial residuals. Channel samples were collected using a sponge device to assess residual RLUs. Parallel colonoscopes inoculated and cleaned in the same manner were sampled using the flush method to quantitatively assess protein and bacterial residuals. The protein and viable count benchmarks for adequate cleaning were <6.4 ug/cm(2) and <4 Log10 cfu/cm(2). The negative controls for the instrument channel, over the course of the study remained low with on average 14 RLUs, 0.04 ug/cm(2) protein and 0.025 Log10 cfu/cm(2). Partial cleaning resulted in an average of 6601 RLUs, 3.99 ug/cm(2), 5.25 Log10 cfu/cm(2) E. faecalis and 4.48 Log10 cfu/cm(2) P. aeruginosa. After full cleaning, the average RLU was 29 (range 7-71 RLUs) and the average protein, E. faecalis and P. aeruginosa residuals were 0.23 ug/cm(2), 0.79 and 1.61 Log10 cfu/cm(2), respectively. The validated cut-off for acceptable manual cleaning was set at ≤100 RLUs for the sponge collected channel ATP test kit.

  5. Diversity of Medicinal Plants among Different Forest-use Types of the Pakistani Himalaya.

    PubMed

    Adnan, Muhammad; Hölscher, Dirk

    2012-12-01

    Medicinal plants collected in Himalayan forests play a vital role in the livelihoods of regional rural societies and are also increasingly recognized at the international level. However, these forests are being heavily transformed by logging. Here we ask how forest transformation influences the diversity and composition of medicinal plants in northwestern Pakistan, where we studied old-growth forests, forests degraded by logging, and regrowth forests. First, an approximate map indicating these forest types was established and then 15 study plots per forest type were randomly selected. We found a total of 59 medicinal plant species consisting of herbs and ferns, most of which occurred in the old-growth forest. Species number was lowest in forest degraded by logging and intermediate in regrowth forest. The most valuable economic species, including six Himalayan endemics, occurred almost exclusively in old-growth forest. Species composition and abundance of forest degraded by logging differed markedly from that of old-growth forest, while regrowth forest was more similar to old-growth forest. The density of medicinal plants positively correlated with tree canopy cover in old-growth forest and negatively in degraded forest, which indicates that species adapted to open conditions dominate in logged forest. Thus, old-growth forests are important as refuge for vulnerable endemics. Forest degraded by logging has the lowest diversity of relatively common medicinal plants. Forest regrowth may foster the reappearance of certain medicinal species valuable to local livelihoods and as such promote acceptance of forest expansion and medicinal plants conservation in the region. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s12231-012-9213-4) contains supplementary material, which is available to authorized users.

  6. A unified assessment of hydrological and biogeochemical responses in research watersheds in Eastern Puerto Rico using runoff-concentration relations

    USGS Publications Warehouse

    Stallard, Robert F.; Murphy, Sheila F.

    2014-01-01

    An examination of the relation between runoff rate, R, and concentration, C, of twelve major constituents in four small watersheds in eastern Puerto Rico demonstrates a consistent pattern of responses. For solutes that are not substantially bioactive (alkalinity, silica, calcium, magnesium, sodium, and chloride), the log(R)–log(C) relation is almost linear and can be described as a weighted average of two sources, bedrock weathering and atmospheric deposition. The slope of the relation for each solute depends on the respective source contributions to the total river load. If a solute were strictly derived from bedrock weathering, the slope would be −0.3 to −0.4, whereas if strictly derived from atmospheric deposition, the slope would be approximately −0.1. The bioactive constituents (dissolved organic carbon, nitrate, sulfate, and potassium), which are recycled by plants and concentrated in shallow soil, demonstrate nearly flat or downward-arched log(R)–log(C) relations. The peak of the arch represents a transition from dominantly soil-matrix flow to near-surface macropore flow, and finally to overland flow. At highest observed R (80 to >90 mm/h), essentially all reactive surfaces have become wetted, and the input rate of C becomes independent of R (log(R)–log(C) slope of –1). The highest R are tenfold greater than any previous study. Slight clockwise hysteresis for many solutes in the rivers with riparian zones or substantial hyporheic flows indicates that these settings may act as mixing end-members. Particulate constituents (suspended sediment and particulate organic carbon) show slight clockwise hysteresis, indicating mobilization of stored sediment during rising stage.

  7. Desorption kinetics of hydrophobic organic chemicals from sediment to water: a review of data and models.

    PubMed

    Birdwell, Justin; Cook, Robert L; Thibodeaux, Louis J

    2007-03-01

    Resuspension of contaminated sediment can lead to the release of toxic compounds to surface waters where they are more bioavailable and mobile. Because the timeframe of particle resettling during such events is shorter than that needed to reach equilibrium, a kinetic approach is required for modeling the release process. Due to the current inability of common theoretical approaches to predict site-specific release rates, empirical algorithms incorporating the phenomenological assumption of biphasic, or fast and slow, release dominate the descriptions of nonpolar organic chemical release in the literature. Two first-order rate constants and one fraction are sufficient to characterize practically all of the data sets studied. These rate constants were compared to theoretical model parameters and functionalities, including chemical properties of the contaminants and physical properties of the sorbents, to determine if the trends incorporated into the hindered diffusion model are consistent with the parameters used in curve fitting. The results did not correspond to the parameter dependence of the hindered diffusion model. No trend in desorption rate constants, for either fast or slow release, was observed to be dependent on K(OC) or aqueous solubility for six and seven orders of magnitude, respectively. The same was observed for aqueous diffusivity and sediment fraction organic carbon. The distribution of kinetic rate constant values was approximately log-normal, ranging from 0.1 to 50 d(-1) for the fast release (average approximately 5 d(-1)) and 0.0001 to 0.1 d(-1) for the slow release (average approximately 0.03 d(-1)). The implications of these findings with regard to laboratory studies, theoretical desorption process mechanisms, and water quality modeling needs are presented and discussed.
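
    The biphasic description reduces to a two-exponential release model with the two rate constants and one fraction mentioned above. A sketch using the review's average rate constants; the fast-release fraction is an assumed illustration:

    ```python
    import numpy as np

    def biphasic_release(t, f_fast, k_fast, k_slow):
        """Fraction desorbed at time t (days) under biphasic first-order kinetics:
        a fast-release fraction f_fast with rate k_fast (1/d) and a slow
        fraction (1 - f_fast) with rate k_slow (1/d)."""
        return (f_fast * (1 - np.exp(-k_fast * t))
                + (1 - f_fast) * (1 - np.exp(-k_slow * t)))

    t = np.array([0.1, 1.0, 10.0, 100.0])
    # Average rate constants reported in the review: ~5 1/d (fast), ~0.03 1/d (slow).
    print(biphasic_release(t, f_fast=0.4, k_fast=5.0, k_slow=0.03))
    ```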

  8. Sequential disinfection of E. coli O157:H7 on shredded lettuce leaves by aqueous chlorine dioxide, ozonated water, and thyme essential oil

    NASA Astrophysics Data System (ADS)

    Singh, Nepal; Singh, Rakesh K.; Bhunia, Arun K.; Stroshine, Richard L.; Simon, James E.

    2001-03-01

    There have been numerous studies on the effectiveness of different sanitizers for microbial inactivation. However, results obtained from different studies indicate that microorganisms cannot be easily removed from fresh-cut vegetables because of puncture and cut surfaces with varying surface topographies. In this study, a three-step disinfection approach was evaluated for inactivation of E. coli O157:H7 on shredded lettuce leaves. Sequential application of thyme oil, ozonated water, and aqueous chlorine dioxide was evaluated, in which thyme oil was applied first, followed by ozonated water and aqueous chlorine dioxide. Shredded lettuce leaves inoculated with a cocktail culture of E. coli O157:H7 (C7927, EDL 933, and 204 P) were washed with ozonated water (15 mg/l for 10 min), aqueous chlorine dioxide (10 mg/l for 10 min), and thyme oil suspension (0.1%, v/v, for 5 min). Washing of lettuce leaves with ozonated water, chlorine dioxide, and thyme oil suspension resulted in 0.44, 1.20, and 1.46 log reductions (log10 cfu/g), respectively. However, the sequential treatment achieved a reduction of approximately 3.13 log (log10 cfu/g). These results demonstrate the efficacy of sequential treatments in decontaminating shredded lettuce leaves containing E. coli O157:H7.

  9. Femtosecond Laser-Assisted Descemetorhexis: A Novel Technique in Descemet Membrane Endothelial Keratoplasty.

    PubMed

    Pilger, Daniel; von Sonnleithner, Christoph; Bertelmann, Eckart; Joussen, Antonia M; Torun, Necip

    2016-10-01

    To explore the feasibility of femtosecond laser-assisted descemetorhexis (DR) to facilitate Descemet membrane endothelial keratoplasty (DMEK) surgery. Six pseudophakic patients suffering from Fuchs' endothelial dystrophy underwent femtosecond laser-assisted DMEK surgery. DR was performed using the LenSx femtosecond laser, followed by manual removal of the Descemet membrane. Optical coherence tomography images were used to measure DR parameters. Patients were followed up for 1 month to examine best corrected visual acuity, endothelial cell loss, flap detachment, and structure of the anterior chamber of the eye. The diameter of the DR approximated the intended diameter closely [mean error of 34 μm (0.45%) and 54 μm (0.67%) in the x- and y-diameter, respectively] and did not require manual correction. The median visual acuity increased from 0.4 logMAR (range 0.6-0.4 logMAR) preoperative to 0.2 logMAR (range 0-0.4 logMAR) postoperative. The median endothelial cell loss was 22% (range 7%-34%). No clinically significant flap detachments were noted. All patients had clear corneas after surgery, and no side effects or damage to structures of the anterior chamber were noted. Femtosecond laser-assisted DR is a safe and precise method for facilitating DMEK surgery.

  10. Inactivation of avirulent Yersinia pestis on food and food contact surfaces by ultraviolet light and freezing.

    PubMed

    Sommers, Christopher H; Sheen, Shiowshuh

    2015-09-01

    Yersinia pestis, the causative agent of plague, can occasionally be contracted as a naso-pharyngeal or gastrointestinal illness through consumption of contaminated meat. In this study, the use of 254 nm ultraviolet light (UV-C) to inactivate a multi-isolate cocktail of avirulent Y. pestis on food and food contact surfaces was investigated. When a commercial UV-C conveyor was used (5 mW/cm(2)/s), 0.5 J/cm(2) inactivated >7 log of the Y. pestis cocktail on agar plates. At 0.5 J/cm(2), UV-C inactivated ca. 4 log of Y. pestis in beef, chicken, and catfish exudates inoculated onto high-density polypropylene or polyethylene, and stainless steel coupons, and >6 log was eliminated at 1 J/cm(2). Approximately 1 log was inactivated on chicken breast, beef steak, and catfish fillet surfaces at a UV-C dose of 1 J/cm(2). UV-C treatment prior to freezing of the foods did not increase the inactivation of Y. pestis over freezing alone. These results indicate that routine use of UV-C during food processing would provide workers and consumers some protection against Y. pestis. Published by Elsevier Ltd.

  11. UV-C light inactivation and modeling kinetics of Alicyclobacillus acidoterrestris spores in white grape and apple juices.

    PubMed

    Baysal, Ayse Handan; Molva, Celenk; Unluturk, Sevcan

    2013-09-16

    In the present study, the effect of short wave ultraviolet light (UV-C) on the inactivation of Alicyclobacillus acidoterrestris DSM 3922 spores in commercial pasteurized white grape and apple juices was investigated. The inactivation of A. acidoterrestris spores in juices was examined by evaluating the effects of UV light intensity (1.31, 0.71 and 0.38 mW/cm²) and exposure time (0, 3, 5, 7, 10, 12 and 15 min) at constant depth (0.15 cm). The best reduction (5.5-log) was achieved in grape juice when the UV intensity was 1.31 mW/cm². The maximum inactivation was approximately 2-log CFU/mL in apple juice under the same conditions. The results showed that first-order kinetics were not suitable for the estimation of spore inactivation in grape juice treated with UV-light. Since tailing was observed in the survival curves, the log-linear plus tail and Weibull models were compared. The results showed that the log-linear plus tail model was satisfactorily fitted to estimate the reductions. As a non-thermal technology, UV-C treatment could be an alternative to thermal treatment for grape juices or combined with other preservation methods for the pasteurization of apple juice. © 2013 Elsevier B.V. All rights reserved.
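
    Of the models compared, the Weibull survival curve is easy to fit by linearization. The sketch below uses made-up survivor data, not the study's measurements:

    ```python
    import numpy as np

    # Illustrative survivor data (minutes, log10 N/N0); not the study's numbers.
    t = np.array([3.0, 5.0, 7.0, 10.0, 12.0, 15.0])
    logs = np.array([-1.8, -2.9, -3.8, -4.7, -5.2, -5.4])

    # Weibull model: log10(N/N0) = -(t/delta)^p. Taking log10 of both sides
    # linearizes it: log10(-log10(N/N0)) = p*log10(t) - p*log10(delta).
    x, y = np.log10(t), np.log10(-logs)
    p, c = np.polyfit(x, y, 1)
    delta = 10 ** (-c / p)
    print(f"Weibull fit: p = {p:.2f}, delta = {delta:.2f} min")

    # A shape parameter p < 1 produces the concave 'tailing' the study reports;
    # the log-linear model is the special case p = 1.
    ```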

  12. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values.

    PubMed

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-01-30

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and to assess the effects of therapy on the optimal transformations. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions at both pre and post treatment. This study included 57 patients that underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients that underwent 18F-fluorothymidine (18F-FLT) PET scans at our institution. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Optimization of the Box-Cox transformation offers a solution for identifying normalizing SUV transformations when the log transformation is insufficient. The log transformation is not always the appropriate transformation for producing normally distributed PET SUVs.
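
    The selection loop described here is straightforward to reproduce; the following is a minimal sketch (simulated SUVs, not the study's code) that scans the Box-Cox parameter and keeps the value maximizing the Shapiro-Wilk P-value:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        suv = rng.lognormal(mean=1.0, sigma=0.6, size=57)   # simulated skewed SUVmax values

        def boxcox(x, lam):
            # Box-Cox family; lam = 0 reduces to the ordinary log transformation
            return np.log(x) if np.isclose(lam, 0.0) else (x ** lam - 1.0) / lam

        lams = np.linspace(-2.0, 2.0, 401)
        pvals = [stats.shapiro(boxcox(suv, lam)).pvalue for lam in lams]
        best = lams[int(np.argmax(pvals))]
        print("optimal lambda = %.2f (Shapiro-Wilk P = %.3f)" % (best, max(pvals)))
        print("plain log transform: P = %.3f" % stats.shapiro(np.log(suv)).pvalue)

    For truly log-normal data the scan recovers λ ≈ 0; distributions with different tail behavior, such as the post-treatment SUVs described above, pull the optimum away from zero.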

  13. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    NASA Astrophysics Data System (ADS)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and to assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions at both pre and post treatment. This study included 57 patients that underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients that underwent 18F-fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normalizing SUV transformations when the log transformation is insufficient. The log transformation is not always the appropriate transformation for producing normally distributed PET SUVs.

  14. Solar abundance of silicon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holweger, H.

    1973-07-01

    An analysis of 19 photospheric Si I lines whose oscillator strengths have recently been determined by Garz (1973) leads to a solar abundance of silicon, log ε_Si = 7.65 ± 0.07, on the scale where log ε_H = 12. Together with the sodium abundance determined earlier by the same method, a solar abundance ratio ε_Na/ε_Si = 0.045 (±10%) results. Within the error limits this agrees with the meteoritic ratio found in carbonaceous chondrites. Results concerning line broadening by hydrogen are discussed. (auth)

  15. Fast computation of molecular random phase approximation correlation energies using resolution of the identity and imaginary frequency integration

    NASA Astrophysics Data System (ADS)

    Eshuis, Henk; Yarkony, Julian; Furche, Filipp

    2010-06-01

    The random phase approximation (RPA) is an increasingly popular post-Kohn-Sham correlation method, but its high computational cost has limited molecular applications to systems with few atoms. Here we present an efficient implementation of RPA correlation energies based on a combination of resolution of the identity (RI) and imaginary frequency integration techniques. We show that the RI approximation to four-index electron repulsion integrals leads to a variational upper bound to the exact RPA correlation energy if the Coulomb metric is used. Auxiliary basis sets optimized for second-order Møller-Plesset (MP2) calculations are well suited for RPA, as is demonstrated for the HEAT [A. Tajti et al., J. Chem. Phys. 121, 11599 (2004)] and MOLEKEL [F. Weigend et al., Chem. Phys. Lett. 294, 143 (1998)] benchmark sets. Using imaginary frequency integration rather than diagonalization to compute the matrix square root necessary for RPA, evaluation of the RPA correlation energy requires O(N⁴ log N) operations and O(N³) storage only; the price for this dramatic improvement over existing algorithms is a numerical quadrature. We propose a numerical integration scheme that is exact in the two-orbital case and converges exponentially with the number of grid points. For most systems, 30-40 grid points yield μH accuracy in triple zeta basis sets, but much larger grids are necessary for small gap systems. The lowest-order approximation to the present method is a post-Kohn-Sham frequency-domain version of opposite-spin Laplace-transform RI-MP2 [J. Jung et al., Phys. Rev. B 70, 205107 (2004)]. Timings for polyacenes with up to 30 atoms show speed-ups of two orders of magnitude over previous implementations. The present approach makes it possible to routinely compute RPA correlation energies of systems well beyond 100 atoms, as is demonstrated for the octapeptide angiotensin II.
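
    For orientation, the quantity evaluated by the imaginary frequency quadrature is the standard adiabatic-connection (plasmon-formula) RPA correlation energy; the expression below is the common textbook form rather than a quotation from the paper, with χ₀(iω) the noninteracting response function and v the Coulomb interaction:

        E_c^{RPA} = \frac{1}{2\pi} \int_0^{\infty} \mathrm{Tr}\left[ \ln\left( 1 - \chi_0(i\omega)\, v \right) + \chi_0(i\omega)\, v \right] \mathrm{d}\omega

    The integrand is smooth and decays along the imaginary axis, which is why a modest quadrature grid suffices except for small-gap systems.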

  16. Fast computation of molecular random phase approximation correlation energies using resolution of the identity and imaginary frequency integration.

    PubMed

    Eshuis, Henk; Yarkony, Julian; Furche, Filipp

    2010-06-21

    The random phase approximation (RPA) is an increasingly popular post-Kohn-Sham correlation method, but its high computational cost has limited molecular applications to systems with few atoms. Here we present an efficient implementation of RPA correlation energies based on a combination of resolution of the identity (RI) and imaginary frequency integration techniques. We show that the RI approximation to four-index electron repulsion integrals leads to a variational upper bound to the exact RPA correlation energy if the Coulomb metric is used. Auxiliary basis sets optimized for second-order Møller-Plesset (MP2) calculations are well suited for RPA, as is demonstrated for the HEAT [A. Tajti et al., J. Chem. Phys. 121, 11599 (2004)] and MOLEKEL [F. Weigend et al., Chem. Phys. Lett. 294, 143 (1998)] benchmark sets. Using imaginary frequency integration rather than diagonalization to compute the matrix square root necessary for RPA, evaluation of the RPA correlation energy requires O(N⁴ log N) operations and O(N³) storage only; the price for this dramatic improvement over existing algorithms is a numerical quadrature. We propose a numerical integration scheme that is exact in the two-orbital case and converges exponentially with the number of grid points. For most systems, 30-40 grid points yield μH accuracy in triple zeta basis sets, but much larger grids are necessary for small gap systems. The lowest-order approximation to the present method is a post-Kohn-Sham frequency-domain version of opposite-spin Laplace-transform RI-MP2 [J. Jung et al., Phys. Rev. B 70, 205107 (2004)]. Timings for polyacenes with up to 30 atoms show speed-ups of two orders of magnitude over previous implementations. The present approach makes it possible to routinely compute RPA correlation energies of systems well beyond 100 atoms, as is demonstrated for the octapeptide angiotensin II.

  17. Reducing Uncertainties in Hydrocarbon Prediction through Application of Elastic Domain

    NASA Astrophysics Data System (ADS)

    Shamsuddin, S. Z.; Hermana, M.; Ghosh, D. P.; Salim, A. M. A.

    2017-10-01

    The application of lithology and fluid indicators has helped geophysicists discriminate reservoirs from non-reservoirs in a field. This analysis was conducted to select the lithology and fluid indicators most suitable for the Malaysian basins, which could better eliminate amplitude-related pitfalls. This paper uses different rock physics analyses such as elastic impedance, Lambda-Mu-Rho (LMR), and the SQp-SQs attribute. The litho-elastic impedance log is generated by correlating the gamma-ray log with the extended elastic impedance (EEI) log; the same approach is used for the fluid-elastic impedance log by correlating the EEI log with the water saturation or resistivity log. The work is based on well-logging data collected from several fields in the Malay basin and a neighbouring basin. There is excellent separation between hydrocarbon sand and background shale for Well-1 in the different cross-plot analyses, while Well-2 shows good separation in the LMR plot. The same method applied to Well-3 shows fair separation of silty sand and gas sand using the SQp-SQs attribute, which can be correlated with the well log. Based on the point-distribution histogram plot, different lithologies and fluids can be separated clearly. Simultaneous seismic inversion yields acoustic impedance, Vp/Vs, SQp, and SQs volumes. Many attributes are used in the industry to separate lithology and fluid; however, some of them are not suitable for application to the basins in Malaysia.

  18. Improved Inactivation of Nonenveloped Enteric Viruses and Their Surrogates by a Novel Alcohol-Based Hand Sanitizer

    PubMed Central

    Macinga, David R.; Sattar, Syed A.; Jaykus, Lee-Ann; Arbogast, James W.

    2008-01-01

    Norovirus is the leading cause of food-related illness in the United States, and contamination of ready-to-eat items by food handlers poses a high risk for disease. This study reports the in vitro (suspension test) and in vivo (fingerpad protocol) assessments of a new ethanol-based hand sanitizer containing a synergistic blend of polyquaternium polymer and organic acid, which is active against viruses of public health importance, including norovirus. When tested in suspension, the test product reduced the infectivity of the nonenveloped viruses human rotavirus (HRV), poliovirus type 1 (PV-1), and the human norovirus (HNV) surrogates feline calicivirus (FCV) F-9 and murine norovirus type 1 (MNV-1) by greater than 3 log10 after a 30-s exposure. In contrast, a benchmark alcohol-based hand sanitizer reduced only HRV by greater than 3 log10 and none of the additional viruses by greater than 1.2 log10 after the same exposure. In fingerpad experiments, the test product produced a 2.48 log10 reduction of MNV-1 after a 30-s exposure, whereas a 75% ethanol control produced a 0.91 log10 reduction. Additionally, the test product reduced the infectivity titers of adenovirus type 5 (ADV-5) and HRV by ≥3.16 log10 and ≥4.32 log10, respectively, by the fingerpad assay within 15 s; and PV-1 was reduced by 2.98 log10 in 30 s by the same method. Based on these results, we conclude that this new ethanol-based hand sanitizer is a promising option for reducing the transmission of enteric viruses, including norovirus, by food handlers and care providers. PMID:18586970

  19. Improved inactivation of nonenveloped enteric viruses and their surrogates by a novel alcohol-based hand sanitizer.

    PubMed

    Macinga, David R; Sattar, Syed A; Jaykus, Lee-Ann; Arbogast, James W

    2008-08-01

    Norovirus is the leading cause of food-related illness in the United States, and contamination of ready-to-eat items by food handlers poses a high risk for disease. This study reports the in vitro (suspension test) and in vivo (fingerpad protocol) assessments of a new ethanol-based hand sanitizer containing a synergistic blend of polyquaternium polymer and organic acid, which is active against viruses of public health importance, including norovirus. When tested in suspension, the test product reduced the infectivity of the nonenveloped viruses human rotavirus (HRV), poliovirus type 1 (PV-1), and the human norovirus (HNV) surrogates feline calicivirus (FCV) F-9 and murine norovirus type 1 (MNV-1) by greater than 3 log10 after a 30-s exposure. In contrast, a benchmark alcohol-based hand sanitizer reduced only HRV by greater than 3 log10 and none of the additional viruses by greater than 1.2 log10 after the same exposure. In fingerpad experiments, the test product produced a 2.48 log10 reduction of MNV-1 after a 30-s exposure, whereas a 75% ethanol control produced a 0.91 log10 reduction. Additionally, the test product reduced the infectivity titers of adenovirus type 5 (ADV-5) and HRV by ≥3.16 log10 and ≥4.32 log10, respectively, by the fingerpad assay within 15 s; and PV-1 was reduced by 2.98 log10 in 30 s by the same method. Based on these results, we conclude that this new ethanol-based hand sanitizer is a promising option for reducing the transmission of enteric viruses, including norovirus, by food handlers and care providers.

  20. Relationship of In Vitro Susceptibility to Moxifloxacin and In Vivo Clinical Outcome in Bacterial Keratitis

    PubMed Central

    Lalitha, Prajna; Srinivasan, Muthiah; Manikandan, P.; Bharathi, M. Jayahar; Rajaraman, Revathi; Ravindran, Meenakshi; Cevallos, Vicky; Oldenburg, Catherine E.; Ray, Kathryn J.; Toutain-Kidd, Christine M.; Glidden, David V.; Zegans, Michael E.; McLeod, Stephen D.; Acharya, Nisha R.; Lietman, Thomas M.

    2012-01-01

    Background. For bacterial infections, the susceptibility to antibiotics in vitro has been associated with clinical outcomes in vivo, although the importance of minimum inhibitory concentration (MIC) has been debated. In this study, we analyzed the association of MIC with clinical outcomes in bacterial corneal ulcers, while controlling for organism and severity of disease at presentation. Methods. Data were collected as part of a National Eye Institute–funded, randomized, controlled trial (the Steroids for Corneal Ulcers Trial [SCUT]). All cases enrolled in SCUT had a culture-positive bacterial corneal ulcer and received moxifloxacin. The MIC to moxifloxacin was measured by E test. Outcomes included best spectacle-corrected visual acuity, infiltrate/scar size, time to re-epithelialization, and corneal perforation. Results. Five hundred patients with corneal ulcers were enrolled in the trial, and 480 were included in this analysis. The most commonly isolated organisms were Streptococcus pneumoniae and Pseudomonas aeruginosa. A 2-fold increase in MIC was associated with an approximately 0.02 logMAR decrease in visual acuity at 3 weeks, approximately 1 letter of vision loss on a Snellen chart (0.019 logMAR; 95% confidence interval [CI], .0040–.033; P = .01). A 2-fold increase in MIC was associated with an approximately 0.04-mm larger infiltrate/scar size at 3 weeks (0.036 mm; 95% CI, .010–.061; P = .006). After controlling for organism, a higher MIC was associated with slower time to re-epithelialization (hazard ratio, 0.92; 95% CI, .86–.97; P = .005). Conclusions. In bacterial keratitis, a higher MIC to the treating antibiotic is significantly associated with worse clinical outcomes, with approximately 1 line of vision loss per 32-fold increase in MIC. Clinical Trials Registration: NCT00324168. PMID:22447793

  1. Relationship of in vitro susceptibility to moxifloxacin and in vivo clinical outcome in bacterial keratitis.

    PubMed

    Lalitha, Prajna; Srinivasan, Muthiah; Manikandan, P; Bharathi, M Jayahar; Rajaraman, Revathi; Ravindran, Meenakshi; Cevallos, Vicky; Oldenburg, Catherine E; Ray, Kathryn J; Toutain-Kidd, Christine M; Glidden, David V; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M

    2012-05-01

    For bacterial infections, the susceptibility to antibiotics in vitro has been associated with clinical outcomes in vivo, although the importance of minimum inhibitory concentration (MIC) has been debated. In this study, we analyzed the association of MIC with clinical outcomes in bacterial corneal ulcers, while controlling for organism and severity of disease at presentation. Data were collected as part of a National Eye Institute-funded, randomized, controlled trial (the Steroids for Corneal Ulcers Trial [SCUT]). All cases enrolled in SCUT had a culture-positive bacterial corneal ulcer and received moxifloxacin. The MIC to moxifloxacin was measured by E test. Outcomes included best spectacle-corrected visual acuity, infiltrate/scar size, time to re-epithelialization, and corneal perforation. Five hundred patients with corneal ulcers were enrolled in the trial, and 480 were included in this analysis. The most commonly isolated organisms were Streptococcus pneumoniae and Pseudomonas aeruginosa. A 2-fold increase in MIC was associated with an approximately 0.02 logMAR decrease in visual acuity at 3 weeks, approximately 1 letter of vision loss on a Snellen chart (0.019 logMAR; 95% confidence interval [CI], .0040-.033; P = .01). A 2-fold increase in MIC was associated with an approximately 0.04-mm larger infiltrate/scar size at 3 weeks (0.036 mm; 95% CI, .010-.061; P = .006). After controlling for organism, a higher MIC was associated with slower time to re-epithelialization (hazard ratio, 0.92; 95% CI, .86-.97; P = .005). In bacterial keratitis, a higher MIC to the treating antibiotic is significantly associated with worse clinical outcomes, with approximately 1 line of vision loss per 32-fold increase in MIC. NCT00324168.

  2. Inferring Clinical Workflow Efficiency via Electronic Medical Record Utilization

    PubMed Central

    Chen, You; Xie, Wei; Gunter, Carl A; Liebovitz, David; Mehrotra, Sanjay; Zhang, He; Malin, Bradley

    2015-01-01

    Complexity in clinical workflows can lead to inefficiency in making diagnoses, ineffectiveness of treatment plans, and uninformed management of healthcare organizations (HCOs). Traditional strategies to manage workflow complexity are based on measuring the gaps between workflows defined by HCO administrators and the actual processes followed by staff in the clinic. However, existing methods tend to neglect the influences of EMR systems on the utilization of workflows, which could be leveraged to optimize workflows facilitated through the EMR. In this paper, we introduce a framework to infer clinical workflows through the utilization of an EMR and show how such workflows roughly partition into four types according to their efficiency. Our framework infers workflows at several levels of granularity through data mining technologies. We study four months of EMR event logs from a large medical center, including 16,569 inpatient stays, and illustrate that approximately 95% of workflows are efficient and that 80% of patients are on such workflows. At the same time, we show that the remaining 5% of workflows may be inefficient due to a variety of factors, such as complex patients. PMID:26958173

  3. Lumber recovery of Douglas-fir from the Coast and Cascade Ranges of Oregon and Washington.

    Treesearch

    Susan Willits; Thomas D. Fahey

    1988-01-01

    This report summarizes the results of lumber recovery studies at four sawmills in western Oregon and western Washington; two dimension mills, one grade mill, and one timber mill were included. Results from individual mills are reported and discussed. The four mills were also combined to approximate "average" conversion of logs to lumber for the region....

  4. Evaluating forest management effects on erosion, sediment, and runoff: Caspar Creek and northwestern California

    Treesearch

    Raymond M. Rice; Robert R. Ziemer; Jack Lewis

    2004-01-01

    The effects of multiple logging disturbances on peak flows and suspended sediment loads from second-growth redwood watersheds were approximately additive. Downstream increases were no greater than would be expected from the proportion of the area disturbed. Annual sediment load increases of from 123 to 269% were measured in tributary watersheds but were not detected at...

  5. "Forest management effects on erosion, sediment, and runoff: Lessons from Caspar Creek and northwestern California"

    Treesearch

    Raymond M. Rice; Robert R. Ziemer; Jack Lewis

    2001-01-01

    The effects of multiple logging disturbances on peak flows and suspended sediment loads from second-growth redwood watersheds were approximately additive. Downstream increases were no greater than would be expected from the proportion of the area disturbed. Annual sediment load increases of from 123 to 269% were measured in tributary watersheds but were...

  6. Use of Intelligent Tutor in Post-Secondary Mathematics Education in the United Arab Emirates

    ERIC Educational Resources Information Center

    Dani, Anita; Nasser, Ramzi

    2016-01-01

    The purpose of this paper is to determine potential identifiers of students' academic success in a foundation mathematics course from the data logs of the intelligent tutor Assessment and Learning in Knowledge Spaces (ALEKS). A cross-sectional study design was used. A sample of 152 records, which accounts for approximately 60% of the population,…

  7. Biochemical Control of Marine Fouling

    DTIC Science & Technology

    1988-01-14

    characterized directly. The receptors are specifically labeled with tritiated (−)-baclofen (β-chlorophenyl-GABA). This labeling is saturable, reversible (with...no change in the radioactive baclofen molecule), and stereochemically specific. Scatchard and log-logit analyses of binding data indicate that there...are approximately 10¹⁰ receptors per larva; the affinity of these receptors for (−)-baclofen is reflected by a K_D = 3 × 10⁻⁷. [Our original results

  8. Changes in Florida's industrial roundwood products output, 1987-1989

    Treesearch

    Edgar L. Davenport

    1991-01-01

    Nearly 483 million cubic feet of industrial roundwood products were harvested from Florida's forests during 1989, approximately 3 million cubic feet more than in 1987. Pulpwood accounted for 61 percent and saw logs 29 percent of the total roundwood production. Output of byproducts dropped from 170 million cubic feet in 1987 to 161 million cubic feet in 1989. Only...

  9. Direct seeding experiments on the 1951 Forks Burn.

    Treesearch

    Elmer W. Shaw

    1953-01-01

    Late in the summer of 1951 the Port Angeles and Western Railroad fire (commonly called the Forks fire) killed more than a half billion board feet of timber. An area approximately 20 miles long and 2-1/2 miles wide, covering 32,668 acres, was burned. It included fine virgin timber, thrifty plantations, ranch lands, reproduction areas, advanced young growth, logged-off...

  10. Preliminary observations and logs of BARB 1 and BARB 2: komatiites from the Tjakastad site

    NASA Astrophysics Data System (ADS)

    Coetzee, Grace; Arndt, Nicholas; Wilson, Allan

    2013-04-01

    The BARB 1 and BARB 2 cores intersect a suite of komatiite flows and komatiitic basalts as well as fragmental rocks of the Komati Formation of the Onverwacht Group, Barberton Greenstone Belt. The cores give important and previously unattainable information on the structures, textures and contact relationships between individual komatiite flows and different lithological units within the flows. BARB 1 was drilled at -48° on a 5° azimuth to a depth of 419.9 m. This core contains a unique volcanic tumulus succession in the stratigraphically lower 100 m; the rest of the core consists of about 59 flows of spinifex-textured komatiite (1-3 m thick), massive komatiite (0.5-10 m thick), komatiitic basalt (1-9 m thick) and a single basalt layer (10 m thick), intruded by gabbro (0.5-2 m thick) and a single dolerite dyke (18 m thick). BARB 2, approximately 50 m from BARB 1 and parallel to it, was drilled at -45° on an 8° azimuth to a depth of 431.5 m. This core contains approximately 39 flows of komatiite (0.5-10 m thick) and komatiitic basalt (2-23 m thick) which contain possible selvages of pillows. Basalt flows are more numerous (0.3-4 m thick) in BARB 2, whilst gabbro (0.6-7 m thick) is less prevalent. The dolerite dyke observed in BARB 1 does not occur in BARB 2. As the Barberton strata young towards the east, the cores intersected the stratigraphy in reverse sequence. The cores were drilled such that there is a 141 m overlap in stratigraphy between them: the section 141 m from the base of BARB 1 should theoretically correlate with the top 141 m of BARB 2. However, this overlap is not evident in the core or in the core logs. A single gabbro layer appears to be lithologically correlatable between the two holes. There is no apparent correlation between the patterns of the komatiite flows, leading to an initial conclusion that the komatiite flows were not laterally extensive or changed in form over short lateral distances. In both cores the proportion of komatiitic basalt appears to increase with depth; however, chemical analyses indicate that some of the units originally logged as komatiitic basalt are actually komatiite. The rocks have all undergone alteration to serpentine and, in extreme cases, are carbonated and cut by carbonate veins. Despite the alteration, the original spinifex and olivine cumulate textures, as well as primary volcanic structures, including spectacular hyaloclastite in the cumulus unit, are well preserved. To date, 140 samples have been analysed for major and trace elements, and controls by olivine and possibly orthopyroxene have been demonstrated.

  11. Amplitude and Frequency Experimental Field Measurements of a Rotating-Imbalance Seismic Source Associated with Changes in Lithology Surrounding a Borehole

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephen R. Novascone; Michael J. Anderson; David M. Weinberg

    2003-10-01

    Field measurements of the vibration amplitude of a rotating-imbalance seismic source in a liquid-filled borehole are described. The borehole was a cased oil well that had been characterized by gamma-ray cement bond and compensated neutron litho-density/gamma-ray logs. The well logs indicated an abrupt transition from shale to limestone at a depth of 2638 ft. The vibration amplitude and frequency of a rotating-imbalance seismic source were measured versus applied voltage as the source was raised from 2654 to 2618 ft through the shale–limestone transition. It was observed that the vibration amplitude changed by approximately 10% in magnitude and the frequency changed by approximately 15% as the source passed the shale–limestone transition. The measurements were compared to predictions provided by a two-dimensional analytical model of a rotating-imbalance source located in a liquid-filled borehole. It was observed that the sensitivity of the experimentally measured vibration amplitude of the seismic source to the properties of the surrounding geologic media was an order of magnitude greater than that predicted by the two-dimensional analytical model.

  12. Transfer rates of 19 typical pesticides and the relationship with their physicochemical property.

    PubMed

    Chen, Hongping; Pan, Meiling; Pan, Rong; Zhang, Minglu; Liu, Xin; Lu, Chengyin

    2015-01-21

    Determining the transfer rate of pesticides during tea brewing is important for identifying the potential exposure risks from pesticide residues in tea. In this study, the transfer rates of 19 typical pesticides from tea leaves into the brew were investigated using gas chromatography-tandem mass spectrometry and ultraperformance liquid chromatography-tandem mass spectrometry. The leaching rates of five pesticides (isocarbophos, triazophos, fenvalerate, buprofezin, and pyridaben) during tea brewing are reported for the first time. The pesticides exhibited different transfer rates; these differences were not related to residual concentrations or tea types. Pesticides with low octanol-water partition coefficients (LogKow) and high water solubility demonstrated high transfer rates. The transfer rates of pesticides with water solubility > 29 mg L⁻¹ (or < 15 mg L⁻¹) were > 25% (or < 10%), and those of pesticides with LogKow < 1.52 (or > 2.48) were > 65% (or < 35%). This result indicates that a water solubility of approximately 20 mg L⁻¹ and a LogKow of approximately 2.0 could be the demarcation lines of transfer rate. The results of this study can be used as a guide in the application of pesticides to tea trees and in establishing maximum residue limits of pesticides in tea to reduce pesticide exposure in humans.

  13. Improved Discrete Approximation of Laplacian of Gaussian

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L., Jr.

    2004-01-01

    An improved method of computing a discrete approximation of the Laplacian of a Gaussian convolution of an image has been devised. The primary advantage of the method is that, without substantially degrading the accuracy of the end result, it reduces the amount of information that must be processed and thus reduces the amount of circuitry needed to perform the Laplacian-of-Gaussian (LOG) operation. Some background information is necessary to place the method in context. The method is intended for application to the LOG part of a process of real-time digital filtering of digitized video data that represent brightnesses in pixels in a square array. The particular filtering process of interest is one that converts pixel brightnesses to binary form, thereby reducing the amount of information that must be processed in subsequent correlation operations (e.g., correlations between images in a stereoscopic pair for determining distances, or correlations between successive frames of the same image for detecting motions). The Laplacian is often included in the filtering process because it emphasizes edges and textures, while the Gaussian is often included because it smooths out noise that might not be consistent between left and right images or between successive frames of the same image.
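
    The following is a minimal sketch (a standard sampled kernel, not NASA's circuit-oriented method) of the operation the record describes: convolve an image with a discrete LOG kernel, then binarize by sign so each pixel carries one bit into the subsequent correlation stage:

        import numpy as np
        from scipy import ndimage

        def log_kernel(size=9, sigma=1.4):
            # Sampled Laplacian of Gaussian:
            # ((r^2 - 2*sigma^2) / sigma^4) * exp(-r^2 / (2*sigma^2))
            ax = np.arange(size) - size // 2
            x, y = np.meshgrid(ax, ax)
            r2 = x**2 + y**2
            k = (r2 - 2 * sigma**2) / sigma**4 * np.exp(-r2 / (2 * sigma**2))
            return k - k.mean()                  # zero mean: flat regions give 0 response

        image = np.random.default_rng(1).random((64, 64))    # stand-in for a video frame
        response = ndimage.convolve(image, log_kernel(), mode="nearest")
        binary = (response > 0).astype(np.uint8)             # 1 bit/pixel instead of 8+

    (scipy.ndimage.gaussian_laplace performs the same filtering in one call; the explicit kernel is spelled out here to make the discrete approximation visible.)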

  14. Robust Bayesian clustering.

    PubMed

    Archambeau, Cédric; Verleysen, Michel

    2007-01-01

    A new variational Bayesian learning algorithm for Student-t mixture models is introduced. This algorithm leads to (i) robust density estimation, (ii) robust clustering and (iii) robust automatic model selection. Gaussian mixture models are learning machines which are based on a divide-and-conquer approach. They are commonly used for density estimation and clustering tasks, but are sensitive to outliers. The Student-t distribution has heavier tails than the Gaussian distribution and is therefore less sensitive to any departure of the empirical distribution from Gaussianity. As a consequence, the Student-t distribution is suitable for constructing robust mixture models. In this work, we formalize the Bayesian Student-t mixture model as a latent variable model in a different way from Svensén and Bishop [Svensén, M., & Bishop, C. M. (2005). Robust Bayesian mixture modelling. Neurocomputing, 64, 235-252]. The main difference resides in the fact that it is not necessary to assume a factorized approximation of the posterior distribution on the latent indicator variables and the latent scale variables in order to obtain a tractable solution. Not neglecting the correlations between these unobserved random variables leads to a Bayesian model having an increased robustness. Furthermore, it is expected that the lower bound on the log-evidence is tighter. Based on this bound, the model complexity, i.e. the number of components in the mixture, can be inferred with a higher confidence.
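
    For context, the latent-variable construction rests on the standard scale-mixture representation of the Student-t density (textbook form, not quoted from the paper): a Gaussian whose precision is rescaled by a Gamma-distributed latent variable u,

        \mathcal{S}(x \mid \mu, \Lambda, \nu) = \int_0^{\infty} \mathcal{N}\big(x \mid \mu, (u\Lambda)^{-1}\big)\, \mathrm{Gam}\big(u \mid \tfrac{\nu}{2}, \tfrac{\nu}{2}\big)\, \mathrm{d}u

    Retaining the posterior coupling between the scale variables u and the mixture indicator variables, instead of factorizing it away, is the modification credited here with the increased robustness and the tighter bound on the log-evidence.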

  15. Two fast approximate wavelet algorithms for image processing, classification, and recognition

    NASA Astrophysics Data System (ADS)

    Wickerhauser, Mladen V.

    1994-07-01

    We use large libraries of template waveforms with remarkable orthogonality properties to recast the relatively complex principal orthogonal decomposition (POD) into an optimization problem with a fast solution algorithm. Then it becomes practical to use POD to solve two related problems: recognizing or classifying images, and inverting a complicated map from a low-dimensional configuration space to a high-dimensional measurement space. In the case where the number N of pixels or measurements is more than 1000 or so, the classical O(N³) POD algorithm becomes very costly, but it can be replaced with an approximate best-basis method that has complexity O(N² log N). A variation of POD can also be used to compute an approximate Jacobian for the complicated map.

  16. Effect of packaging and storage time on survival of Listeria monocytogenes on kippered beef steak and turkey tenders.

    PubMed

    Uppal, Kamaldeep K; Getty, Kelly J K; Boyle, Elizabeth A E; Harper, Nigel M; Lobaton-Sulabo, April Shayne S; Barry, Bruce

    2012-01-01

    The objective of our study was to determine the effect of packaging method and storage time on reducing Listeria monocytogenes in shelf-stable meat snacks. Commercially available kippered beef steak strips and turkey tenders were dipped into a 5-strain L. monocytogenes cocktail and dried at 23 °C until a water activity of 0.80 was achieved. Inoculated samples were packaged with 4 treatments: (1) vacuum, (2) nitrogen flushed with oxygen scavenger, (3) heat sealed with oxygen scavenger, and (4) heat sealed without oxygen scavenger. Samples were stored at 23 °C and evaluated for L. monocytogenes levels at 0, 24, 48, and 72 h. Initial levels (time 0) of L. monocytogenes were approximately 5.7 log CFU/cm² for steak and tenders. After 24 h of storage, a 1 log CFU/cm² reduction of L. monocytogenes was observed for turkey tenders for all packaging treatments. After 48 h, turkey tenders showed a >1 log CFU/cm² reduction of L. monocytogenes for all packaging treatments except vacuum, where only a 0.9 log CFU/cm² reduction was observed. After 72 h, reductions for all packaging treatments for turkey tenders ranged from 1.5 to 2.4 log CFU/cm². For kippered beef steak, there was no interaction between packaging treatment and storage time (P > 0.05), whereas the effect of storage time was significant (P < 0.05): a 1 log reduction of L. monocytogenes at 24 and 48 h of storage at 23 °C for all packaging treatments and a 2.1 log CFU/cm² reduction at 72 h. Processors of kippered beef steak and turkey tenders could use vacuum, nitrogen-flushed, or heat-sealed-with-oxygen-scavenger packaging and a holding time of 24 h prior to shipping to reduce potential L. monocytogenes numbers by ≥1 log. However, processors should be encouraged to hold packaged product a minimum of 72 h to enhance the margin of safety for L. monocytogenes control. © 2011 Institute of Food Technologists®

  17. Mud Gas Logging In A Deep Borehole: IODP Site C0002, Nankai Trough Accretionary Prism

    NASA Astrophysics Data System (ADS)

    Toczko, S.; Hammerschmidt, S.; Maeda, L.

    2014-12-01

    Mud logging, a tool in riser drilling, makes use of the essentially "closed-circuit" circulation of drilling mud from the platform down the hole to the bit and back to the platform, allowing analysis of formation gas in the returning mud and of cuttings from downhole, along with a range of safety and operational parameters for monitoring downhole drilling conditions. Scientific riser drilling, with coincident control over drilling mud, downhole pressure, and returning drilling mud analyses, has been in use aboard the scientific riser drilling vessel Chikyu since 2009. International Ocean Discovery Program (IODP) Expedition 348, as part of the goal of reaching the plate boundary fault system near ~5000 mbsf, has extended the deep riser hole (Holes C0002N and C0002P) to 3058.5 mbsf. The mud gas data discussed here are from two approximately parallel boreholes, one a kick-off from the other: 860-2329 mbsf (Hole C0002N) and 2163-3058 mbsf (Hole C0002P). An approximate overlap of 166 m between the holes allows for some depth comparison between the two. An additional 55 m overlap at the top of Hole C0002P exists where a 10-5/8-inch hole was cored and then opened to 12-1/4-inch with logging-while-drilling (LWD) tools (Fig. 1). There are several fault zones revealed by LWD data, confirmed in one instance by coring. One of the defining formation characteristics of Holes C0002N/P is the strongly dipping bedding planes, typically exceeding 60°. These fault zones and bedding planes can influence the methane/ethane concentrations found in the returning drilling mud. A focused comparison of free gas in drilling mud is shown for one interval in Hole C0002P, drilled first with a 10-5/8-inch coring bit and again with a 12-1/4-inch LWD bit. Hole C0002N above this was cased all the way from the sea floor to the kick-off section. A fault interval (in pink) was identified from the recovered core section and from LWD resistivity and gamma. The plot of methane and ethane free gas (C1 and C2; ppmv) shows that the yield of free gas (primarily methane) was greater when the LWD bit returned to open the cored hole to a greater diameter. One possible explanation for this is the time delay between coring and LWD operations; approximately 3 days passed between the end of coring and the beginning of LWD (25-28 December 2013).

  18. Effect of disease management on prescription drug treatment: what is the right quality measure?

    PubMed

    Mattke, Soeren; Jain, Arvind K; Sloss, Elizabeth M; Hirscher, Randy; Bergamo, Giacomo; O'Leary, June F

    2007-04-01

    Measures of medication adherence have become common parameters by which disease management (DM) programs are evaluated, leading to the question of how this concept should be measured in the particular context of a DM intervention. We hypothesize that DM improves adherence to prescribed regimens more than it improves the rate at which prescriptions are filled. We used health plan claims data to construct 13 common measures of medication adherence for five chronic conditions. The measures were operationalized in three different ways: the Prescription Fill Rate (PFR), which requires only one prescription; the Medication Possession Ratio (MPR), which requires a supply that covers at least 80% of the year; and the Length of Gap (LOG), which requires no gap greater than 30 days between prescriptions. We compared results from a baseline year to results during the first year of a DM program. Changes in adherence were quite small in the first year of the intervention, with no changes greater than six percentage points. In the intervention year, three measures showed a significant increase based on all three operational definitions, but two measures paradoxically decreased based on the PFR. For both, the MPR and the LOG suggested either no change or significant improvement. None of the MPR and LOG measures pointed toward significantly lower compliance in the intervention year. Different ways of operationalizing the concept of medication adherence can lead to fundamentally different conclusions. While more complex, MPR- and LOG-based measures could be more appropriate for DM evaluation. Our initial results, however, need to be confirmed by data covering longer-term follow-up.
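
    Since the three operational definitions drive the conclusions, a minimal sketch may help (hypothetical single-patient claims, not the study's code); each function takes a list of (fill_day, days_supplied) pairs over a 365-day year:

        YEAR = 365

        def prescription_fill_rate(fills):
            # PFR: met if at least one prescription was filled during the year
            return len(fills) >= 1

        def medication_possession_ratio(fills):
            # MPR: share of the year covered by dispensed supply (adherent if >= 0.80)
            covered = sum(days for _, days in fills)
            return min(covered / YEAR, 1.0)

        def longest_gap(fills):
            # LOG: longest stretch between exhausting one fill and the next fill
            # (adherent if no gap exceeds 30 days)
            gaps, run_out = [], None
            for day, supply in sorted(fills):
                if run_out is not None:
                    gaps.append(max(0, day - run_out))
                run_out = day + supply
            return max(gaps, default=0)

        fills = [(0, 30), (35, 30), (110, 90)]   # hypothetical (fill day, days supplied)
        print(prescription_fill_rate(fills),
              medication_possession_ratio(fills), longest_gap(fills))

    The same fill history can satisfy the PFR yet fail the MPR (150/365 ≈ 0.41) and the LOG (a 45-day gap), which is exactly why the three definitions can disagree about a program's effect.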

  19. Polysaccharide-Based Edible Coatings Containing Cellulase for Improved Preservation of Meat Quality during Storage.

    PubMed

    Zimoch-Korzycka, Anna; Jarmoluk, Andrzej

    2017-03-02

    The objectives of this study were to optimize the composition of edible food coatings and to extend the shelf-life of pork meat. Initially, nine meat samples were coated with solutions containing chitosan and hydroxypropyl methylcellulose at various cellulase concentrations (0%, 0.05%, and 0.1%) and stored for 0, 7, and 14 days. Uncoated meat served as the controls. The samples were tested for pH, water activity (a_w), total number of microorganisms (TNM), psychrotrophs (P), number of yeasts and molds (NYM), colour, and thiobarbituric acid-reactive substances (TBARS). The pH and a_w values varied from 5.42 to 5.54 and 0.919 to 0.926, respectively. The reductions in the TNM, P, and NYM after 14 days of storage were approximately 2.71 log cycles, 1.46 log cycles, and 0.78 log cycles, respectively. The enzyme addition improved the stability of the red colour. Significant reduction in TBARS was noted with the inclusion of cellulase in the coating material. Overall, this study provides a promising alternative method for the preservation of pork meat in industry.

  20. Effect of benzalkonium chloride on viability and energy metabolism in exponential- and stationary-growth-phase cells of Listeria monocytogenes.

    PubMed

    Luppens, S B; Abee, T; Oosterom, J

    2001-04-01

    The difference in killing exponential- and stationary-phase cells of Listeria monocytogenes by benzalkonium chloride (BAC) was investigated by plate counting and linked to relevant bioenergetic parameters. At a low concentration of BAC (8 mg liter⁻¹), a similar reduction in viable cell numbers was observed for stationary-phase cells and exponential-phase cells (an approximately 0.22-log unit reduction), although their membrane potential and pH gradient were dissipated. However, at higher concentrations of BAC, exponential-phase cells were more susceptible than stationary-phase cells. At 25 mg liter⁻¹, the difference in survival on plates was more than 3 log units. For both types of cells, killing, i.e., more than 1-log unit reduction in survival on plates, coincided with complete inhibition of acidification and respiration and total depletion of ATP pools. Killing efficiency was not influenced by the presence of glucose, brain heart infusion medium, or oxygen. Our results suggest that growth phase is one of the major factors that determine the susceptibility of L. monocytogenes to BAC.

  1. Epicormic branching on eight species of Appalachian hardwoods

    Treesearch

    H. Clay Smith

    1966-01-01

    Epicormic branches and associated defects are leading causes of degrade and value loss in lumber sawed from hardwood logs. The degrade may be in the form of small knots, ingrown bark, wood blemishes, and/or rot.

  2. ELASTIC NET FOR COX'S PROPORTIONAL HAZARDS MODEL WITH A SOLUTION PATH ALGORITHM.

    PubMed

    Wu, Yichao

    2012-01-01

    For least squares regression, Efron et al. (2004) proposed an efficient solution path algorithm, the least angle regression (LAR). They showed that a slight modification of the LAR leads to the whole LASSO solution path. Both the LAR and LASSO solution paths are piecewise linear. Recently Wu (2011) extended the LAR to generalized linear models and the quasi-likelihood method. In this work we extend the LAR further to handle Cox's proportional hazards model. The goal is to develop a solution path algorithm for the elastic net penalty (Zou and Hastie (2005)) in Cox's proportional hazards model. This goal is achieved in two steps. First we extend the LAR to optimizing the log partial likelihood plus a fixed small ridge term. Then we define a path modification, which leads to the solution path of the elastic net regularized log partial likelihood. Our solution path is exact and piecewise determined by ordinary differential equation systems.
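
    A minimal sketch of the objective whose solution path is traced (synthetic data and a generic formulation, not the paper's algorithm): the negative log partial likelihood of Cox's model, here for untied event times, plus the elastic net penalty:

        import numpy as np

        rng = np.random.default_rng(0)
        n, p = 100, 5
        X = rng.standard_normal((n, p))
        beta_true = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
        time = rng.exponential(scale=np.exp(-X @ beta_true))
        event = rng.random(n) < 0.8          # True = failure observed, False = censored

        def neg_penalized_logpl(beta, lam=0.1, alpha=0.5):
            eta = X @ beta
            order = np.argsort(-time)        # descending: risk set of i = all j with t_j >= t_i
            log_risk = np.logaddexp.accumulate(eta[order])    # log-sum-exp over each risk set
            logpl = np.sum((eta[order] - log_risk)[event[order]])
            enet = lam * (alpha * np.abs(beta).sum() + 0.5 * (1 - alpha) * beta @ beta)
            return -logpl + enet

        print(neg_penalized_logpl(np.zeros(p)), neg_penalized_logpl(beta_true))

    The path algorithm can then be pictured as tracing the minimizer of this objective continuously in lam, piecewise determined by ordinary differential equation systems as stated above.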

  3. Cyclic injection, storage, and withdrawal of heated water in a sandstone aquifer at St. Paul, Minnesota: Field observations, preliminary model analysis, and aquifer thermal efficiency

    USGS Publications Warehouse

    Miller, Robert T.

    1989-01-01

    The Franconia-Ironton-Galesville aquifer is a consolidated sandstone, approximately 60 m thick, the top of which is approximately 180 m below the land surface. It is confined above by the St. Lawrence Formation--a dolomitic sandstone 8-m thick--and below by the Eau Claire Formation--a shale 30-m thick. Initial hydraulic testing with inflatable packers indicated that the aquifer has four hydraulic zones with distinctly different values of relative horizontal hydraulic conductivity. The thickness of each zone was determined by correlating data from geophysical logs, core samples, and the inflatable-packer tests.

  4. Stepwise shockwave velocity determinator

    NASA Technical Reports Server (NTRS)

    Roth, Timothy E.; Beeson, Harold

    1992-01-01

    To provide an uncomplicated and inexpensive method for measuring the far-field velocity of a surface shockwave produced by an explosion, a stepwise shockwave velocity determinator (SSVD) was developed. The velocity determinator is constructed of readily available materials and works on the principle of breaking discrete sensors composed of aluminum foil contacts. The discrete sensors have an average breaking threshold of approximately 7 kPa. An incremental output step of 250 mV is created with each foil contact breakage and is logged by analog-to-digital instrumentation. Velocity data obtained from the SSVD is within approximately 11 percent of the calculated surface shockwave velocity of a muzzle blast from a 30.06 rifle.

  5. Effects of irradiation and fumaric acid treatment on the inactivation of Listeria monocytogenes and Salmonella typhimurium inoculated on sliced ham

    NASA Astrophysics Data System (ADS)

    Song, Hyeon-Jeong; Lee, Ji-Hye; Song, Kyung Bin

    2011-11-01

    To examine the effects of fumaric acid and electron beam irradiation on the inactivation of foodborne pathogens in ready-to-eat meat products, sliced ham was inoculated with Listeria monocytogenes and Salmonella typhimurium. The inoculated ham slices were treated with 0.5% fumaric acid or electron beam irradiation at 2 kGy. Fumaric acid treatment reduced the populations of L. monocytogenes and S. typhimurium by approximately 1 log CFU/g compared to control populations. In contrast, electron beam irradiation decreased the populations of S. typhimurium and L. monocytogenes by 3.78 and 2.42 log CFU/g, respectively. These results suggest that electron beam irradiation is a better and appropriate technique for improving the microbial safety of sliced ham.

  6. Measurement and correlation of jet fuel viscosities at low temperatures

    NASA Technical Reports Server (NTRS)

    Schruben, D. L.

    1985-01-01

    Apparatus and procedures were developed to measure jet fuel viscosity for eight current and future jet fuels at temperatures from ambient to near -60 C by shear viscometry. Viscosity data showed good reproducibility even at temperatures a few degrees below the measured freezing point. The viscosity-temperature relationship could be correlated by two linear segments when plotted as a standard log-log type representation (ASTM D 341). At high temperatures, the viscosity-temperature slope is low. At low temperatures, where wax precipitation is significant, the slope is higher. The breakpoint between temperature regions is the filter flow temperature, a fuel characteristic approximated by the freezing point. A generalization of the representation for the eight experimental fuels provided a predictive correlation for low-temperature viscosity, considered sufficiently accurate for many design or performance calculations.
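
    For reference, the ASTM D 341 chart mentioned here linearizes kinematic viscosity ν (in cSt) against absolute temperature T with the double-logarithm form (the standard correlation; the constants A and B are fuel-specific):

        \log_{10} \log_{10} (\nu + 0.7) = A - B \log_{10} T

    The two linear segments reported for each fuel then correspond to two (A, B) pairs, one above and one below the filter flow temperature.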

  7. Possible Lack of Low-Mass Meteoroids in the Earth's Meteoroid Flux Due to Space Erosion?

    NASA Technical Reports Server (NTRS)

    Rubincam, David Parry

    2017-01-01

    The Earth's cumulative meteoroid flux, as found by Halliday et al. (1996), may have a shallower slope for meteoroid masses in the range 0.1-2.5 kg compared to those with masses greater than 2.5 kg when plotted on a log flux vs. log mass graph. This would indicate a lack of low-mass objects. While others such as Ceplecha (1992) find no shallow slope, there may be a reason for a lack of 0.1-2.5 kg meteoroids which supports Halliday et al.'s finding. Simple models show that a few centimeters of space erosion in stony meteoroids can reproduce the bend in Halliday et al.'s curve at approximately 2.5 kg and give the shallower slope.

  8. Evaluation of a new monochloramine generation system for controlling Legionella in building hot water systems.

    PubMed

    Duda, Scott; Kandiah, Sheena; Stout, Janet E; Baron, Julianne L; Yassin, Mohamed; Fabrizio, Marie; Ferrelli, Juliet; Hariri, Rahman; Wagener, Marilyn M; Goepfert, John; Bond, James; Hannigan, Joseph; Rogers, Denzil

    2014-11-01

    To evaluate the efficacy of a new monochloramine generation system for control of Legionella in a hospital hot water distribution system. A 495-bed tertiary care hospital in Pittsburgh, Pennsylvania. The hospital has 12 floors covering approximately 78,000 m². The hospital hot water system was monitored for a total of 29 months, including a 5-month baseline sampling period prior to installation of the monochloramine system and 24 months of surveillance after system installation (postdisinfection period). Water samples were collected for microbiological analysis (Legionella species, Pseudomonas aeruginosa, Stenotrophomonas maltophilia, Acinetobacter species, nitrifying bacteria, heterotrophic plate count [HPC] bacteria, and nontuberculous mycobacteria). Chemical parameters monitored during the investigation included monochloramine, chlorine (free and total), nitrate, nitrite, total ammonia, copper, silver, lead, and pH. A significant reduction in Legionella distal site positivity was observed between the pre- and postdisinfection periods, with positivity decreasing from an average of 53% (baseline) to an average of 9% after monochloramine application (P < 0.05). Although geometric mean HPC concentrations decreased by approximately 2 log colony-forming units per milliliter during monochloramine treatment, we did not observe significant changes in other microbial populations. This is the first evaluation in the United States of a commercially available monochloramine system installed on a hospital hot water system for Legionella disinfection, and it demonstrated a significant reduction in Legionella colonization. Significant increases in microbial populations or other negative effects previously associated with monochloramine use in large municipal cold water systems were not observed.

  9. Behavioral evaluation of visual function of rats using a visual discrimination apparatus.

    PubMed

    Thomas, Biju B; Samant, Deedar M; Seiler, Magdalene J; Aramant, Robert B; Sheikholeslami, Sharzad; Zhang, Kevin; Chen, Zhenhai; Sadda, SriniVas R

    2007-05-15

    A visual discrimination apparatus was developed to evaluate the visual sensitivity of normal pigmented rats (n=13) and S334ter-line-3 retinal degenerate (RD) rats (n=15). The apparatus is a modified Y maze consisting of two chambers leading to the rats' home cage. Rats were trained to find a one-way exit door leading into their home cage, based on distinguishing between two different visual alternatives (either a dark background or black and white stripes at varying luminance levels) which were randomly displayed on the back of each chamber. Within 2 weeks of training, all rats were able to distinguish between these two visual patterns. The discrimination threshold of normal pigmented rats was a luminance level of -5.37 ± 0.05 log cd/m², whereas the threshold level of 100-day-old RD rats was -1.14 ± 0.09 log cd/m², with considerable variability in performance. When tested at a later age (about 150 days), the threshold level of RD rats was significantly increased (-0.82 ± 0.09 log cd/m², p < 0.03, paired t-test). This apparatus could be useful to train rats at a very early age to distinguish between two different visual stimuli and may be effective for visual functional evaluations following therapeutic interventions.

  10. Approximate l-fold cross-validation with Least Squares SVM and Kernel Ridge Regression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Richard E; Zhang, Hao; Parker, Lynne Edwards

    2013-01-01

    Kernel methods have difficulties scaling to large modern data sets. The scalability issues are based on computational and memory requirements for working with a large matrix. These requirements have been addressed over the years by using low-rank kernel approximations or by improving the solvers' scalability. However, Least Squares Support Vector Machines (LS-SVM), a popular SVM variant, and Kernel Ridge Regression still have several scalability issues. In particular, the O(n³) computational complexity for solving a single model, and the overall computational complexity associated with tuning hyperparameters, are still major problems. We address these problems by introducing an O(n log n) approximate l-fold cross-validation method that uses a multi-level circulant matrix to approximate the kernel. In addition, we prove our algorithm's computational complexity and present empirical runtimes on data sets with approximately 1 million data points. We also validate our approximate method's effectiveness at selecting hyperparameters on real-world and standard benchmark data sets. Lastly, we provide experimental results on using a multi-level circulant kernel approximation to solve LS-SVM problems with hyperparameters selected using our method.
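
    The O(n log n) claim rests on the fact that circulant matrices are diagonalized by the discrete Fourier transform, so products and solves reduce to FFTs. A minimal one-dimensional sketch (illustrative only -- the paper uses multi-level circulant approximations of the kernel matrix, not this toy):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1024
        d = np.minimum(np.arange(n), n - np.arange(n))     # circular distances
        c = np.exp(-d / 50.0)                              # first column of a well-conditioned circulant C
        y = rng.standard_normal(n)

        eig = np.fft.fft(c)                                # eigenvalues of C
        x = np.fft.ifft(np.fft.fft(y) / eig).real          # solves C x = y in O(n log n)

        C = np.array([np.roll(c, k) for k in range(n)]).T  # dense O(n^2) check
        print(np.allclose(C @ x, y))                       # True

    Replacing the kernel matrix by such a structured approximation is what turns each cross-validation fold's solve from cubic cost into FFT work.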

  11. Role of hydraulic retention time and granular medium in microbial removal in tertiary treatment reed beds.

    PubMed

    García, Joan; Vivar, Joan; Aromir, Maria; Mujeriego, Rafael

    2003-06-01

    The main objective of this paper is to evaluate the role of hydraulic retention time (HRT) and granular medium in faecal coliform (FC) and somatic coliphage (SC) removal in tertiary reed beds. Experiments were carried out in a pilot plant with four parallel reed beds (horizontal subsurface flow constructed wetlands), each one containing a different type of granular medium. This pilot plant is located in a wastewater treatment plant in Montcada i Reixac, near Barcelona, in northeastern Spain. The microbial inactivation ratios obtained in the different beds are compared as a function of three selected HRTs. Secondary effluent from the wastewater treatment plant was used as the influent of the pilot system. The microbial inactivation ratio ranged between 0.1 and 2.7 log-units for FC and from 0.5 to 1.7 log-units for SC in beds with coarser granular material (5-25 mm), while it ranged between 0.7 and 3.4 log-units for FC and from 0.9 to 2.6 log-units for SC in the bed with finer material (2-13 mm). HRT and granular medium are both key factors in microbial removal in the tertiary reed beds. The microbial inactivation ratio rises as the HRT increases until it reaches a saturation value (in general at an HRT of 3 days). The value of the microbial inactivation ratio at the saturation level depends on the granular medium contained in the bed. The specific surface area necessary to reach 2-3 log-units of FC and SC removal is approximately 3 m²/person-equivalent.

  12. Capricorn-A Web-Based Automatic Case Log and Volume Analytics for Diagnostic Radiology Residents.

    PubMed

    Chen, Po-Hao; Chen, Yin Jie; Cook, Tessa S

    2015-10-01

    On-service clinical learning is a mainstay of radiology education. However, an accurate and timely case log is difficult to keep, especially in the absence of software tools tailored to resident education. Furthermore, volume-related feedback from the residency program sometimes occurs months after a rotation ends, limiting the opportunity for meaningful intervention. We surveyed the residents of a single academic institution to evaluate the current state of and the existing need for tracking interpretation volume. Using the results of the survey, we created an open-source automated case log software tool. Finally, we evaluated the effect of the software tool on the residency in a 1-month postimplementation survey. Before implementation of the system, 89% of respondents stated that volume is an important component of training, but 71% stated that volume data were inconvenient to obtain. Although the residency program provides semiannual reviews, 90% preferred reviewing interpretation volumes at least once monthly. After implementation, 95% of the respondents stated that the software is convenient to access, 75% found it useful, and 88% stated they would use the software at least once a month. The included analytics module, which benchmarks the user against historical aggregate average volumes, is the most often used feature of the software. Server logs demonstrate that, on average, residents use the system approximately twice a week. An automated case log software system may fulfill a previously unmet need in diagnostic radiology training, making accurate and timely review of volume-related performance analytics a convenient process. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  13. Boulder-Faced Log Dams and other Alternatives for Gabion Check Dams in First-Order Ephemeral Streams with Coarse Bed Load in Ethiopia

    NASA Astrophysics Data System (ADS)

    Nyssen, Jan; Gebreslassie, Seifu; Assefa, Romha; Deckers, Jozef; Guyassa, Etefa; Poesen, Jean; Frankl, Amaury

    2017-04-01

    Many thousands of gabion check dams have been installed to control gully erosion in Ethiopia, but several challenges remain, such as gabion failure in ephemeral streams whose coarse bed load abrades the wire mesh at the chute step. As an alternative to gabion check dams in torrents with coarse bed load, boulder-faced log dams were conceived, installed transversally across torrents, and tested (n = 30). For this, logs (22-35 cm across) were embedded in the banks of torrents, 0.5-1 m above the bed, and their upstream sides were faced with boulders (0.3-0.7 m across). Similar to gabion check dams, boulder-faced log dams lead to temporary ponding, spreading of peak flow over the entire channel width, and sediment deposition. Results of testing under extreme flow conditions (including two storms with return periods of 5.6 and 7 years) show that 18 dams resisted strong floods. Beyond certain flood thresholds (represented by proxies such as Strahler's stream order, catchment area, D95, or channel width), 11 log dams were completely destroyed. Smallholder farmers see much potential in this type of structure to control first-order torrents with coarse bed load, since the technique is cost-effective and can be easily installed.

  14. Maximum likelihood solution for inclination-only data in paleomagnetism

    NASA Astrophysics Data System (ADS)

    Arason, P.; Levi, S.

    2010-08-01

    We have developed a new robust maximum likelihood method for estimating the unbiased mean inclination from inclination-only data. In paleomagnetic analysis, the arithmetic mean of inclination-only data is known to introduce a shallowing bias. Several methods have been introduced to estimate the unbiased mean inclination of inclination-only data together with measures of the dispersion. Some inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all the methods require various assumptions and approximations that are often inappropriate. For some steep and dispersed data sets, these methods provide estimates that are significantly displaced from the peak of the likelihood function towards systematically shallower inclinations. The problem of locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest, because some elements of the likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study, we succeeded in analytically cancelling exponential elements from the log-likelihood function, and we are now able to calculate its value anywhere in the parameter space and for any inclination-only data set. Furthermore, we can now calculate the partial derivatives of the log-likelihood function with the desired accuracy and locate the maximum likelihood without the assumptions required by previous methods. To assess the reliability and accuracy of our method, we generated large numbers of random Fisher-distributed data sets, for which we calculated mean inclinations and precision parameters. The comparisons show that our new robust Arason-Levi maximum likelihood method is the most reliable, and the mean inclination estimates are the least biased towards shallow values.
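
    The numerical issue described, exponential terms that overflow before their logarithm is taken, is commonly handled by cancelling the exponentials analytically. A small Python illustration of the idea (a generic stable log(sinh) evaluation, not the Arason-Levi code itself):

```python
import numpy as np

def log_sinh(kappa):
    """log(sinh(kappa)) without overflow for large kappa.

    sinh(k) = (e^k - e^-k)/2 overflows in double precision near k ~ 710,
    but the cancelled form k + log(1 - e^(-2k)) - log(2) stays finite."""
    kappa = np.asarray(kappa, dtype=float)
    return kappa + np.log1p(-np.exp(-2.0 * kappa)) - np.log(2.0)

# e.g. the Fisher distribution's log normalization, log(kappa/(4*pi*sinh(kappa))),
# evaluated at a precision parameter where sinh itself would overflow:
kappa = 1000.0
log_norm = np.log(kappa) - np.log(4 * np.pi) - log_sinh(kappa)
```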

  15. CRISPR/Cas9-Mediated Gene Disruption Reveals the Importance of Zinc Metabolism for Fitness of the Dimorphic Fungal Pathogen Blastomyces dermatitidis

    PubMed Central

    Kujoth, Gregory C.; Sullivan, Thomas D.; Merkhofer, Richard; Lee, Taek-Jin; Wang, Huafeng; Brandhorst, Tristan; Wüthrich, Marcel

    2018-01-01

    ABSTRACT Blastomyces dermatitidis is a human fungal pathogen of the lung that can lead to disseminated disease in healthy and immunocompromised individuals. Genetic analysis of this fungus is hampered by the relative inefficiency of traditional recombination-based gene-targeting approaches. Here, we demonstrate the feasibility of applying CRISPR/Cas9-mediated gene editing to Blastomyces, including to simultaneously target multiple genes. We created targeting plasmid vectors expressing Cas9 and either one or two single guide RNAs and introduced these plasmids into Blastomyces via Agrobacterium gene transfer. We succeeded in disrupting several fungal genes, including PRA1 and ZRT1, which are involved in scavenging and uptake of zinc from the extracellular environment. Single-gene-targeting efficiencies varied by locus (median, 60% across four loci) but were approximately 100-fold greater than traditional methods of Blastomyces gene disruption. Simultaneous dual-gene targeting proceeded with efficiencies similar to those of single-gene-targeting frequencies for the respective targets. CRISPR/Cas9 disruption of PRA1 or ZRT1 had a variable impact on growth under zinc-limiting conditions, showing reduced growth at early time points in low-passage-number cultures and growth similar to wild-type levels by later passage. Individual impairment of PRA1 or ZRT1 resulted in a reduction of the fungal burden in a mouse model of Blastomyces infection by a factor of ~1 log (range, up to 3 logs), and combined disruption of both genes had no additional impact on the fungal burden. These results underscore the utility of CRISPR/Cas9 for efficient gene disruption in dimorphic fungi and reveal a role for zinc metabolism in Blastomyces fitness in vivo. PMID:29615501

  16. Improving gross count gamma-ray logging in uranium mining with the NGRS probe

    NASA Astrophysics Data System (ADS)

    Carasco, C.; Pérot, B.; Ma, J.-L.; Toubon, H.; Dubille-Auchère, A.

    2018-01-01

    AREVA Mines and the Nuclear Measurement Laboratory of CEA Cadarache are collaborating to improve the sensitivity and precision of uranium concentration measurement by means of gamma-ray logging. The determination of uranium concentration in boreholes is performed with the Natural Gamma Ray Sonde (NGRS), based on a NaI(Tl) scintillation detector. The total gamma count rate is converted into uranium concentration using a calibration coefficient measured in concrete blocks with known uranium concentration at the AREVA Mines calibration facility located in Bessines, France. Until now, to take into account gamma attenuation for a variety of borehole diameters, tubing materials, diameters and thicknesses, and filling-fluid densities and compositions, a semi-empirical formula was used to correct the calibration coefficient measured at the Bessines facility. In this work, we propose to use Monte Carlo simulations to improve the gamma attenuation corrections. To this purpose, the NGRS probe and the calibration measurements in the standard concrete blocks have been modeled with the MCNP computer code. The calibration coefficient determined by simulation, 5.3 s^-1·ppmU^-1 ± 10%, is in good agreement with the one measured in Bessines, 5.2 s^-1·ppmU^-1. Based on the validated MCNP model, several parametric studies have been performed. For instance, the rock density and chemical composition proved to have a limited impact on the calibration coefficient. However, gamma self-absorption in uranium leads to a nonlinear relationship between count rate and uranium concentration beyond approximately 1% uranium weight fraction, the underestimation of the uranium content reaching more than a factor of 2.5 for a 50% uranium weight fraction. Next steps will concern parametric studies with different tubing materials, diameters and thicknesses, as well as different borehole filling fluids representative of real measurement conditions.
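
    In the linear regime the conversion the sonde performs is just a division by the calibration coefficient. A hedged sketch (the function name and the example reading are illustrative, not from the paper):

```python
def uranium_ppm(count_rate_cps, calib_cps_per_ppm=5.3):
    """Convert NGRS gross count rate to uranium concentration.

    Valid only in the linear regime (below roughly 1 wt% uranium, per the
    abstract); at higher grades gamma self-absorption makes the relationship
    nonlinear and this simple division underestimates the uranium content.
    The default coefficient is the simulated 5.3 counts/s per ppm U."""
    return count_rate_cps / calib_cps_per_ppm

# Hypothetical reading: 2650 counts/s -> ~500 ppm U
print(uranium_ppm(2650.0))
```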

  17. Straightening Beta: Overdispersion of Lethal Chromosome Aberrations following Radiotherapeutic Doses Leads to Terminal Linearity in the Alpha–Beta Model

    PubMed Central

    Shuryak, Igor; Loucas, Bradford D.; Cornforth, Michael N.

    2017-01-01

    Recent technological advances allow precise radiation delivery to tumor targets. As opposed to more conventional radiotherapy—where multiple small fractions are given—in some cases, the preferred course of treatment may involve only a few (or even one) large dose(s) per fraction. Under these conditions, the choice of appropriate radiobiological model complicates the tasks of predicting radiotherapy outcomes and designing new treatment regimens. The most commonly used model for this purpose is the venerable linear-quadratic (LQ) formalism as it applies to cell survival. However, predictions based on the LQ model are frequently at odds with data following very high acute doses. In particular, although the LQ predicts a continuously bending dose–response relationship for the logarithm of cell survival, empirical evidence over the high-dose region suggests that the survival response is instead log-linear with dose. Here, we show that the distribution of lethal chromosomal lesions among individual human cells (lymphocytes and fibroblasts) exposed to gamma rays and X rays is somewhat overdispersed, compared with the Poisson distribution. Further, we show that such overdispersion affects the predicted dose response for cell survival (the fraction of cells with zero lethal lesions). This causes the dose response to approximate log-linear behavior at high doses, even when the mean number of lethal lesions per cell is well fitted by the continuously curving LQ model. Accounting for overdispersion of lethal lesions provides a novel, mechanistically based explanation for the observed shapes of cell survival dose responses that, in principle, may offer a tractable and clinically useful approach for modeling the effects of high doses per fraction. PMID:29312888
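
    A toy numerical comparison makes the mechanism concrete: the surviving fraction is the probability of zero lethal lesions, which is exp(-m) for a Poisson lesion count with LQ mean m(D) = alpha*D + beta*D^2, but declines more slowly under an overdispersed (here negative binomial) count with the same mean. The parameters below are hypothetical, not the authors' fits:

```python
import numpy as np

alpha, beta = 0.2, 0.05   # hypothetical LQ parameters (Gy^-1, Gy^-2)
r = 5.0                   # hypothetical NB dispersion (smaller = more overdispersed)

doses = np.linspace(0, 15, 7)
m = alpha * doses + beta * doses**2                 # mean lethal lesions per cell

log10_surv_poisson = -m / np.log(10)                # S = exp(-m)
log10_surv_nb = -r * np.log1p(m / r) / np.log(10)   # S = P(0) = (1 + m/r)^-r

for d, p, nb in zip(doses, log10_surv_poisson, log10_surv_nb):
    print(f"{d:5.1f} Gy  LQ/Poisson: {p:7.2f}   overdispersed: {nb:7.2f}")
```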

  18. Hydrogeology and tritium transport in Chicken Creek Canyon,Lawrence Berkeley National Laboratory, Berkeley, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Preston D.; Javandel, Iraj

    This study of the hydrogeology of Chicken Creek Canyon was conducted by the Environmental Restoration Program (ERP) at Lawrence Berkeley National Laboratory (LBNL). This canyon extends downhill from Building 31 at LBNL to Centennial Road below. The leading edge of a groundwater tritium plume at LBNL is located at the top of the canyon. Tritium activities measured in this portion of the plume during this study were approximately 3,000 picocuries/liter (pCi/L), which is significantly less than the maximum contaminant level (MCL) for drinking water of 20,000 pCi/L established by the Environmental Protection Agency. There are three main pathways for tritium migration beyond the Laboratory's boundary: air, surface water and groundwater flow. The purpose of this report is to evaluate the groundwater pathway. Hydrogeologic investigation commenced with review of historical geotechnical reports including 35 bore logs and 27 test pit/trench logs as well as existing ERP information from 9 bore logs. This was followed by field mapping of bedrock outcrops along Chicken Creek as well as bedrock exposures in road cuts on the north and east walls of the canyon. Water levels and tritium activities from 6 wells were also considered. Electrical-resistivity profiles and cone penetration test (CPT) data were collected to investigate the extent of an interpreted alluvial sand encountered in one of the wells drilled in this area. Subsequent logging of 7 additional borings indicated that this sand was actually an unusually well-sorted and typically deeply weathered sandstone of the Orinda Formation. Wells were installed in 6 of the new borings to allow water level measurement and analysis of groundwater tritium activity. A slug test and pumping tests were also performed in the wellfield.

  19. Seniors in Cyberspace. Trends and Issues Alerts.

    ERIC Educational Resources Information Center

    Imel, Susan

    Approximately 15% (7.6 million) of the estimated 50.6 million U.S. citizens who browse the World Wide Web are aged 50 or older, and 30% of adults aged 55-75 own a computer. Although many older adults initially log on to the Internet as a means of connecting with friends and family, they quickly learn that it is also a valuable source of…

  20. Potentiometric pH Measurements of Acidity Are Approximations, Some More Useful than Others

    ERIC Educational Resources Information Center

    de Levie, Robert

    2010-01-01

    A recent article by McCarty and Vitz "demonstrating that it is not true that pH = -log[H+]" is examined critically. Then, the focus shifts to underlying problems with the IUPAC definition of pH. It is shown how the potentiometric method can provide "estimates" of both the IUPAC-defined hydrogen activity "and" the hydrogen ion concentration, using…
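
    A worked example of the distinction at issue: the IUPAC definition rests on the hydrogen ion activity a_H = γ[H+], so -log[H+] only approximates pH when the activity coefficient γ is near 1. The numbers below are illustrative, not measured data:

```python
import math

conc_h = 0.01    # mol/L hydrogen ion concentration (illustrative)
gamma = 0.90     # hypothetical single-ion activity coefficient

p_conc = -math.log10(conc_h)              # "concentration pH" = 2.000
p_activity = -math.log10(gamma * conc_h)  # activity-based pH  ~ 2.046
```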

  1. Seed fall and regeneration from a group selection cut . . . first year results

    Treesearch

    Philip M. McDonald

    1966-01-01

    To approximate a group selection cut at the Challenge Experimental Forest, 48 small openings of three sizes—30, 60, and 90 feet in diameter—were logged in 1963. One aim was to create conditions of light and soil moisture that would favor establishment and growth of Douglas-fir, sugar pine, and white fir over ponderosa pine. Seed fall and first-year...

  2. Structural lumber laminated from 1/4 -inch rotary-peeled southern pine veneer

    Treesearch

    P. Koch

    1973-01-01

    By the lamination process evaluated, 60 percent of total log volume ended as kiln-dry, end-trimmed, sized, salable 2 by 4's, approximately 50 percent more than that achieved by conventional bandsawing of matched logs. Moreover, modulus of elasticity of the laminated 2 by 4's (adjusted to 12 percent moisture content) averaged 1,950,000 psi compared to 1,790,...

  3. Efficacy of Gaseous Ozone Application during Vacuum Cooling against Escherichia coli O157:H7 on Spinach Leaves as Influenced by Bacterium Population Size.

    PubMed

    Yesil, Mustafa; Kasler, David R; Huang, En; Yousef, Ahmed E

    2017-07-01

    Foodborne disease outbreaks associated with the consumption of fresh produce pose a threat to public health, decrease consumer confidence in minimally processed foods, and negatively impact the sales of these commodities. The aim of the study was to determine the influence of population size of inoculated pathogen on its inactivation by gaseous ozone treatment during vacuum cooling. Spinach leaves were spot inoculated with Escherichia coli O157:H7 at approximate initial populations of 10^8, 10^7, and 10^5 CFU/g. Inoculated leaves were vacuum cooled (28.5 inHg; 4°C) in a custom-made vessel and then were subjected to a gaseous ozone treatment under the following conditions: 1.5 g of ozone per kg of gas mixture, vessel pressure at 10 lb/in^2 gauge, 94 to 98% relative humidity, and 30 min of holding time at 9°C. Treatment of the leaves, having the aforementioned inocula, decreased E. coli populations by 0.2, 2.1, and 2.8 log CFU/g, respectively, compared with the inoculated untreated controls. Additionally, spinach leaves were inoculated at 1.4 × 10^3 CFU/g, which approximates natural contamination levels, and the small populations remaining after ozone treatment were quantified using the most-probable-number (MPN) method. Vacuum and ozone sequential treatment decreased this E. coli O157:H7 population to <3 MPN/g (i.e., greater than 3-log reduction). Resulting log reductions were greater (P < 0.05) at the lower rather than the higher inoculum levels. In conclusion, treatment of spinach leaves with gaseous ozone is effective against pathogen loads comparable to those found in naturally contaminated fresh produce, but efficacy decreases as inoculum level increases.

  4. Reduction of verotoxigenic Escherichia coli by process and recipe optimisation in dry-fermented sausages.

    PubMed

    Heir, E; Holck, A L; Omer, M K; Alvseike, O; Høy, M; Måge, I; Axelsson, L

    2010-07-15

    Outbreaks of verotoxigenic Escherichia coli (VTEC) linked to dry-fermented sausages (DFSs) have emphasized the need for DFS manufacturers to introduce measures to obtain enhanced safety and still maintain the sensory qualities of their products. To our knowledge, no data have yet been reported on non-O157:H7 VTEC survival in DFS. Here, the importance of recipe and process variables on VTEC (O157:H7 and O103:H25) reductions in two types of DFS, morr and salami, was determined through three statistically designed experiments. Linear regression and ANOVA analyses showed that no single variable had a dominant effect on VTEC reductions. High levels of NaCl, NaNO(2), glucose (low pH) and fermentation temperature gave enhanced VTEC reduction, while high fat and large casing diameter (a(w)) gave the opposite effect. Interaction effects were small. The process and recipe variables showed similar effects in morr and salami. In general, recipes combining high batter levels of salt (NaCl and NaNO(2)) and glucose along with high fermentation temperature, which gave DFS with low final pH and a(w), provided approximately 3 log(10) reductions compared to the approximately 1.5 log(10) reductions obtained for standard recipe DFS. Storage at 4 degrees C for 2 months provided an additional 0.33-0.95 log(10) VTEC reduction that was only marginally affected by recipe type. Sensory tests revealed only small differences between the various recipes of morr and salami. By optimisation of recipe and process parameters, it is possible to obtain increased microbial safety of DFS while maintaining the sensory qualities of the sausages. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Ontogenetic improvement of visual function in the medaka Oryzias latipes based on an optomotor testing system for larval and adult fish

    USGS Publications Warehouse

    Carvalho, Paulo S. M.; Noltie, Douglas B.; Tillitt, D.E.

    2002-01-01

    We developed a system for evaluation of visual function in larval and adult fish. Both optomotor (swimming) and optokinetic (eye movement) responses were monitored and recorded using a system of rotating stripes. The system allowed manipulation of factors such as width of the stripes used, rotation speed of the striped drum, and light illuminance levels within both the scotopic and photopic ranges. Precise control of these factors allowed quantitative measurements of visual acuity and motion detection. Using this apparatus, we tested the hypothesis that significant posthatch ontogenetic improvements in visual function occur in the medaka Oryzias latipes, and also that this species shows significant in ovo neuronal development. Significant improvements in the acuity angle alpha (ability to discriminate detail) were observed from approximately 5 degrees at hatch to 1 degree in the oldest adult stages. In addition, we measured a significant improvement in flicker fusion thresholds (motion detection skills) between larval and adult life stages within both the scotopic and photopic ranges of light illuminance. Ranges of flicker fusion thresholds (mean ± SD) at log I=1.96 (photopic) varied from 37.2±1.6 cycles/s in young adults to 18.6±1.6 cycles/s in young larvae 10 days posthatch. At log I=−2.54 (scotopic), flicker fusion thresholds varied from 5.8±0.7 cycles/s in young adults to 1.7±0.4 cycles/s in young larvae 10 days posthatch. Light sensitivity increased approximately 2.9 log units from early hatched larval stages to adults. The demonstrated ontogenetic improvements in visual function probably enable the fish to explore new resources, thereby enlarging their fundamental niche.

  6. Large ground surface temperature changes of the last three centuries inferred from borehole temperatures in the Southern Canadian Prairies, Saskatchewan

    NASA Astrophysics Data System (ADS)

    Majorowicz, Jacek A.; Safanda, Jan; Harris, Robert N.; Skinner, Walter R.

    1999-05-01

    New temperature logs in wells located in the grassland ecozone in the Southern Canadian Prairies in Saskatchewan, where surface disturbance is considered minor, show a large curvature in the upper 100 m. The character of this curvature is consistent with ground surface temperature (GST) warming in the 20th century. Repetition of precise temperature logs in southern Saskatchewan (years 1986 and 1997) shows the conductive nature of warming of the subsurface sediments. The magnitude of surface temperature change during that time (11 years) is high (0.3-0.4°C). To assess the conductive nature of temperature variations at the grassland surface interface, several precise air and soil temperature time series in the southern Canadian Prairies (1965-1995) were analyzed. The combined anomalies correlated at 0.85. Application of the functional space inversion (FSI) technique with the borehole temperature logs and site-specific lithology indicates a warming to date of approximately 2.5°C since a minimum in the late 18th century to mid 19th century. This warming represents an approximate increase from 4°C around 1850 to 6.5°C today. The significance of this record is that it suggests almost half of the warming occurred prior to 1900, before dramatic build up of atmospheric green house gases. This result correlates well with the proxy record of climatic change further to the north, beyond the Arctic Circle [Overpeck, J., Hughen, K., Hardy, D., Bradley, R., Case, R., Douglas, M., Finney, B., Gajewski, K., Jacoby, G., Jennings, A., Lamourex, S., Lasca, A., MacDonald, G., Moore, J., Retelle, M., Smith, S., Wolfe, A., Zielinski, G., 1997. Arctic environmental change of the last four centuries, Science 278, 1251-1256.].

  7. Chlorine Dioxide Inactivation of Cryptosporidium parvum Oocysts and Bacterial Spore Indicators

    PubMed Central

    Chauret, Christian P.; Radziminski, Chris Z.; Lepuil, Michael; Creason, Robin; Andrews, Robert C.

    2001-01-01

    Cryptosporidium parvum, which is resistant to chlorine concentrations typically used in water treatment, is recognized as a significant waterborne pathogen. Recent studies have demonstrated that chlorine dioxide is a more efficient disinfectant than free chlorine against Cryptosporidium oocysts. It is not known, however, if oocysts from different suppliers are equally sensitive to chlorine dioxide. This study used both a most-probable-number–cell culture infectivity assay and in vitro excystation to evaluate chlorine dioxide inactivation kinetics in laboratory water at pH 8 and 21°C. The two viability methods produced significantly different results (P < 0.05). Products of disinfectant concentration and contact time (Ct values) of 1,000 mg · min/liter were needed to inactivate approximately 0.5 log10 and 2.0 log10 units (99% inactivation) of C. parvum as measured by in vitro excystation and cell infectivity, respectively, suggesting that excystation is not an adequate viability assay. Purified oocysts originating from three different suppliers were evaluated and showed marked differences with respect to their resistance to inactivation when using chlorine dioxide. Ct values of 75, 550, and 1,000 mg · min/liter were required to achieve approximately 2.0 log10 units of inactivation with oocysts from different sources. Finally, the study compared the relationship between easily measured indicators, including Bacillus subtilis (aerobic) spores and Clostridium sporogenes (anaerobic) spores, and C. parvum oocysts. The bacterial spores were found to be more sensitive to chlorine dioxide than C. parvum oocysts and therefore could not be used as direct indicators of C. parvum inactivation for this disinfectant. In conclusion, it is suggested that future studies address issues such as oocyst purification protocols and the genetic diversity of C. parvum, since these factors might affect oocyst disinfection sensitivity. PMID:11425712
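
    The Ct values quoted are simply the product of disinfectant concentration and contact time. A trivial helper (the concentration/time pairing in the example is illustrative, not a condition tested in the study):

```python
def ct_value(concentration_mg_per_l, contact_time_min):
    """Disinfectant exposure as the familiar Ct product (mg*min/L)."""
    return concentration_mg_per_l * contact_time_min

# e.g. holding 1.25 mg/L chlorine dioxide for 800 min gives Ct = 1000,
# the exposure reported above for ~2.0 log10 inactivation (cell infectivity).
print(ct_value(1.25, 800))
```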

  8. The use of carboxymethylcellulose gel to increase non-viral gene transfer in mouse airways

    PubMed Central

    Griesenbach, Uta; Meng, Cuixiang; Farley, Raymond; Wasowicz, Marguerite; Munkonge, Felix M; Chan, Mario; Stoneham, Charlotte; Sumner-Jones, Stephanie; Pringle, Ian A.; Gill, Deborah R.; Hyde, Stephen C.; Stevenson, Barbara; Holder, Emma; Ban, Hiroshi; Hasegawa, Mamoru; Cheng, Seng H; Scheule, Ronald K; Sinn, Patrick L; McCray, Paul B; Alton, Eric WFW

    2014-01-01

    We have assessed whether viscoelastic gels known to inhibit mucociliary clearance can increase lipid-mediated gene transfer. Methylcellulose (MC) or carboxymethylcellulose (CMC) (0.25 to 1.5%) were mixed with complexes of the cationic lipid GL67A and plasmids encoding luciferase and perfused onto the nasal epithelium of mice. Survival after perfusion with 1% CMC or 1% MC was 90 and 100%, respectively. In contrast, 1.5% CMC was uniformly lethal, likely due to the viscous solution blocking the airways. Perfusion with 0.5% CMC containing lipid/DNA complexes reproducibly increased gene expression by approximately 3-fold (n=16, p<0.05). Given this benefit, likely related to increased duration of contact, we also assessed the effect of prolonging contact time of the liposome/DNA complexes by delivering our standard 80 μg DNA dose over either approximately 22 or 60 min of perfusion. This independently increased gene transfer by 6-fold (n=8, p<0.05) and could be further enhanced by the addition of 0.5% CMC, leading to an overall 25-fold enhancement (n=8, p<0.001) in gene expression. As a result of these interventions, CFTR transgene mRNA levels were increased several logs above background. Interestingly, this did not lead to correction of the ion transport defects in the nasal epithelium of cystic fibrosis mice, nor to detectable CFTR expression on immunohistochemical quantification. To assess whether 0.5% CMC also increased gene transfer in the mouse lung, we used whole body nebulisation chambers. CMC was nebulised for 1 hr immediately before, or simultaneously with, GL67A/pCIKLux. The former did not increase gene transfer, whereas co-administration significantly increased gene transfer by 4-fold (p<0.0001, n=18). This study suggests that contact time of non-viral gene transfer agents is a key factor for gene delivery, and suggests two methods which may be translatable for use in man. PMID:20022367

  9. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
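
    To give a flavor of rule-based trace analysis, here is a toy Python monitor in the same spirit (this is emphatically not the RuleR or LogScope API): each command event creates an obligation that a matching success event must discharge.

```python
def monitor(events):
    """Check a trace of (kind, name) events: every 'command' must be
    followed eventually by a matching 'success'."""
    pending = set()                      # commands still awaiting success
    for kind, name in events:
        if kind == "command":
            pending.add(name)
        elif kind == "success":
            pending.discard(name)
    return [f"no success for command {n}" for n in sorted(pending)]

trace = [("command", "TURN"), ("success", "TURN"), ("command", "STOP")]
print(monitor(trace))  # ['no success for command STOP']
```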

  10. Characterization of the Hydrocarbon Potential and Non-Potential Zones Using Wavelet-Based Fractal Analysis

    NASA Astrophysics Data System (ADS)

    Mukherjee, Bappa; Roy, P. N. S.

    Identifying prospective and dry zones from well log data is of major importance, and reliable identification of potential zones is a crucial issue in hydrocarbon exploration. The problem has received considerable attention, and many conventional techniques have been proposed. The purpose of this study is to recognize the hydrocarbon- and non-hydrocarbon-bearing portions of a reservoir using a non-conventional technique. This paper demonstrates how wavelet-based fractal analysis (WBFA), applied to wire-line log data, can separate pre-defined hydrocarbon (HC) and non-hydrocarbon (NHC) zones through the self-affine nature of the log signals. The feasibility of the proposed technique is tested with the most commonly used logs: self-potential, gamma ray, resistivity, and porosity responses. These logs were obtained from industry to delineate several HC and NHC zones in all wells of the study region, which belongs to the upper Assam basin. For a given log response, HC-bearing zones are found mainly in a variety of sandstone lithologies, which leads to higher Hurst exponents, whereas NHC zones correspond to lithologies with higher shale content and lower Hurst exponents. The proposed technique can reduce the chance of misinterpretation in conventional reservoir characterization.
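
    As a rough illustration of how a Hurst exponent separates signal character, the aggregated-variance method below estimates H from the scaling of block-mean variances; it is a simple NumPy-only stand-in for the paper's wavelet-based estimator:

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
    """Crude Hurst exponent estimate by the aggregated-variance method.

    For a self-affine series, the variance of block means scales as
    m^(2H - 2), so H = 1 + slope/2 where slope is fit on log-log axes."""
    x = np.asarray(x, dtype=float)
    logs_m, logs_v = [], []
    for m in block_sizes:
        k = len(x) // m
        block_means = x[:k * m].reshape(k, m).mean(axis=1)
        logs_m.append(np.log(m))
        logs_v.append(np.log(block_means.var()))
    slope = np.polyfit(logs_m, logs_v, 1)[0]
    return 1.0 + slope / 2.0

# White noise has H ~ 0.5; smoother (more persistent) log responses score higher.
print(hurst_aggvar(np.random.default_rng(1).normal(size=4096)))
```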

  11. Algorithms in Discrepancy Theory and Lattices

    NASA Astrophysics Data System (ADS)

    Ramadas, Harishchandra

    This thesis deals with algorithmic problems in discrepancy theory and lattices, and is based on two projects I worked on while at the University of Washington in Seattle. A brief overview is provided in Chapter 1 (Introduction). Chapter 2 covers joint work with Avi Levy and Thomas Rothvoss in the field of discrepancy minimization. A well-known theorem of Spencer shows that any set system with n sets over n elements admits a coloring of discrepancy O(√n). While the original proof was non-constructive, recent progress brought polynomial time algorithms by Bansal, Lovett and Meka, and Rothvoss. All those algorithms are randomized, even though Bansal's algorithm admitted a complicated derandomization. We propose an elegant deterministic polynomial time algorithm that is inspired by Lovett-Meka as well as the Multiplicative Weight Update method. The algorithm iteratively updates a fractional coloring while controlling the exponential weights that are assigned to the set constraints. A conjecture by Meka suggests that Spencer's bound can be generalized to symmetric matrices. We prove that n x n matrices that are block diagonal with block size q admit a coloring of discrepancy O(√n · √(log q)). Bansal, Dadush and Garg recently gave a randomized algorithm to find a vector x with entries in {-1,1} with ‖Ax‖_∞ ≤ O(√(log n)) in polynomial time, where A is any matrix whose columns have length at most 1. We show that our method can be used to deterministically obtain such a vector. In Chapter 3, we discuss a result in the broad area of lattices and integer optimization, in joint work with Rebecca Hoberg, Thomas Rothvoss and Xin Yang. The number balancing (NBP) problem is the following: given real numbers a_1,...,a_n in [0,1], find two disjoint subsets I_1, I_2 of [n] so that the difference |∑_{i∈I_1} a_i − ∑_{i∈I_2} a_i| of their sums is minimized. An application of the pigeonhole principle shows that there is always a solution where the difference is at most O(√n/2^n). Finding the minimum, however, is NP-hard. In polynomial time, the differencing algorithm by Karmarkar and Karp from 1982 can produce a solution with difference at most n^(−Θ(log n)), but no further improvement has been made since then. We show a relationship between NBP and Minkowski's Theorem. First we show that an approximate oracle for Minkowski's Theorem gives an approximate NBP oracle. Perhaps more surprisingly, we show that an approximate NBP oracle gives an approximate Minkowski oracle. In particular, we prove that any polynomial time algorithm that guarantees a solution of difference at most 2^(√n)/2^n would give a polynomial approximation for Minkowski as well as a polynomial factor approximation algorithm for the Shortest Vector Problem.
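
    For reference, the Karmarkar-Karp differencing algorithm mentioned above is short enough to sketch; this version returns only the achieved difference (recovering the two subsets needs extra bookkeeping):

```python
import heapq

def karmarkar_karp(numbers):
    """Largest differencing method: repeatedly replace the two largest
    values by their difference; the final value is the achieved
    |sum(I1) - sum(I2)|."""
    heap = [-x for x in numbers]          # max-heap via negation
    heapq.heapify(heap)
    while len(heap) > 1:
        a = -heapq.heappop(heap)
        b = -heapq.heappop(heap)
        heapq.heappush(heap, -(a - b))    # commit a and b to opposite sides
    return -heap[0] if heap else 0

print(karmarkar_karp([4, 5, 6, 7, 8]))  # 2; the optimum here is 0 ({7,8} vs {4,5,6})
```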

  12. How does abundance scale with body size in coupled size-structured food webs?

    PubMed

    Blanchard, Julia L; Jennings, Simon; Law, Richard; Castle, Matthew D; McCloghrie, Paul; Rochet, Marie-Joëlle; Benoît, Eric

    2009-01-01

    1. Widely observed macro-ecological patterns in log abundance vs. log body mass of organisms can be explained by simple scaling theory based on food (energy) availability across a spectrum of body sizes. The theory predicts that when food availability falls with body size (as in most aquatic food webs where larger predators eat smaller prey), the scaling between log N vs. log m is steeper than when organisms of different sizes compete for a shared unstructured resource (e.g. autotrophs, herbivores and detritivores; hereafter dubbed 'detritivores'). 2. In real communities, the mix of feeding characteristics gives rise to complex food webs. Such complexities make empirical tests of scaling predictions prone to error if: (i) the data are not disaggregated in accordance with the assumptions of the theory being tested, or (ii) the theory does not account for all of the trophic interactions within and across the communities sampled. 3. We disaggregated whole community data collected in the North Sea into predator and detritivore components and report slopes of log abundance vs. log body mass relationships. Observed slopes for fish and epifaunal predator communities (-1.2 to -2.25) were significantly steeper than those for infaunal detritivore communities (-0.56 to -0.87). 4. We present a model describing the dynamics of coupled size spectra, to explain how coupling of predator and detritivore communities affects the scaling of log N vs. log m. The model captures the trophic interactions and recycling of material that occur in many aquatic ecosystems. 5. Our simulations demonstrate that the biological processes underlying growth and mortality in the two distinct size spectra lead to patterns consistent with data. Slopes of log N vs. log m were steeper and growth rates faster for predators compared to detritivores. Size spectra were truncated when primary production was too low for predators and when detritivores experienced predation pressure. 6. The approach also allows us to assess the effects of external sources of mortality (e.g. harvesting). Removal of large predators resulted in steeper predator spectra and increases in their prey (small fish and detritivores). The model predictions are remarkably consistent with observed patterns of exploited ecosystems.
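
    In practice the scaling exponents discussed here are estimated as the slope of a least-squares fit of log N against log m. A minimal sketch on synthetic data (the true slope of -1.5 below is illustrative, chosen to resemble the predator-community range reported):

```python
import numpy as np

rng = np.random.default_rng(0)
log_m = np.linspace(0, 6, 40)                        # log body mass bins
log_n = 10.0 - 1.5 * log_m + rng.normal(0, 0.3, 40)  # synthetic abundances

slope, intercept = np.polyfit(log_m, log_n, 1)
print(f"estimated slope: {slope:.2f}")               # ~ -1.5
```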

  13. Mineral content prediction for unconventional oil and gas reservoirs based on logging data

    NASA Astrophysics Data System (ADS)

    Maojin, Tan; Youlong, Zou; Guoyue

    2012-09-01

    Coal bed methane and shale oil & gas are both important unconventional oil and gas resources. Their reservoirs are typically non-linear, with complex and varied mineral components, so logging-data interpretation models for calculating mineral contents are difficult to establish, and empirical formulas cannot be constructed for such varied mineralogy. Radial basis function (RBF) network analysis is a method developed in recent years; the technique can generate a smooth continuous function of several variables to approximate an unknown forward model. First, the basic principles of the RBF network are discussed, including the network construction and basis functions, and the training procedure based on an adjacent clustering algorithm is described in detail. The RBF interpolation method is then used to map multiple well-log measurements to mineral component contents. For coal-bed methane reservoirs, the method calculates contents such as ash, volatile matter, and carbon, achieving a mapping from various logging data to multiple minerals. For shale gas reservoirs, the RBF method can be used to predict clay, quartz, feldspar, carbonate, and pyrite contents. Various tests on coal bed and gas shale data show the method is effective and applicable for predicting mineral component contents.
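
    A hedged sketch of the basic idea, mapping several log readings to one mineral content with an off-the-shelf RBF interpolator (synthetic data throughout; the paper's network design and clustering-based training differ):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Synthetic training set: three scaled log readings per depth sample
# (e.g. gamma ray, resistivity, density) and a known ash content (%).
rng = np.random.default_rng(42)
logs = rng.uniform(size=(200, 3))
ash_content = 20 + 30 * logs[:, 0] - 10 * logs[:, 1] + rng.normal(0, 1, 200)

# Fit a smooth multivariate function from log space to mineral content.
model = RBFInterpolator(logs, ash_content, kernel="thin_plate_spline")

new_well = np.array([[0.4, 0.7, 0.2]])  # readings at a new depth sample
print(model(new_well))                  # predicted ash content (%)
```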

  14. Comparison of the disinfection efficacy of chlorine-based products for inactivation of viral indicators and pathogenic bacteria in produce wash water.

    PubMed

    Chaidez, Cristobal; Moreno, Maria; Rubio, Werner; Angulo, Miguel; Valdez, Benigno

    2003-09-01

    Outbreaks of pathogenic bacterial infections associated with the consumption of fresh produce have occurred with increased frequency in recent years. This study was undertaken to determine the efficacy of three disinfectants commonly used in packing-houses of Culiacan, Mexico (sodium hypochlorite [NaOCl], trichlor-s-triazinetrione [TST] and trichloromelamine [TCM]) for inactivation of viral indicators and pathogenic bacteria inoculated into produce wash water. Each microbial challenge consisted of 2 L of water containing approximately 8 log10 bacterial CFU ml(-1) and 8 log10 viral PFU ml(-1), treated with 100 and 300 mg l(-1) of total chlorine at modified turbidity. Water samples were taken after 2 min of contact with the chlorine-based products and assayed for the particular microorganisms. TST and NaOCl were found to effectively reduce bacterial pathogens and viral indicators by 8 log10 and 7 log10, respectively (alpha=0.05). The highest inactivation rate was observed when the turbidity was low and the disinfectant was applied at 300 mg l(-1). TCM did not show effective results when compared with TST and NaOCl (P<0.05). These findings suggest that turbidity created by the organic and inorganic material carried into the water tanks by the fresh produce may affect the efficacy of chlorine-based products.

  15. Geophysical characterization of the Lollie Levee near Conway, Arkansas, using capacitively coupled resistivity, coring, and direct push logging

    USGS Publications Warehouse

    Gillip, Jonathan A.; Payne, Jason

    2011-01-01

    A geophysical characterization of Lollie Levee near Conway, Arkansas, was conducted in February 2011. A capacitively coupled resistivity survey (using Geometric's OhmMapper) was completed along the top and toe of the 6.7-mile levee. Two-dimensional inversions were conducted on the geophysical data. As a quality-control measure, cores and direct push logs were taken at approximately 1-mile intervals along the levee. The capacitively coupled resistivity survey, the coring, and the direct push logs were used to characterize the geologic materials. Comparison of the cores and the direct push log data, along with published resistivity values, indicates that resistivity values of 200 Ohm-meters or greater represent relatively clean sand, with decreasing resistivity values occurring with increasing silt and clay content. The cores indicated that the levee is composed of a heterogeneous mixture of sand, silt, and clay. The capacitively coupled resistivity sections confirm that the levee is composed of a heterogeneous mixture of high and low resistivity materials and show that the composition of the levee varies spatially. The geologic materials underlying the levee vary spatially as a result of the geologic processes that deposited them. In general, the naturally deposited geologic materials underlying the levee contain a greater amount of low resistivity materials in the southern extent of the levee.

  16. Cold air plasma to decontaminate inanimate surfaces of the hospital environment.

    PubMed

    Cahill, Orla J; Claro, Tânia; O'Connor, Niall; Cafolla, Anthony A; Stevens, Niall T; Daniels, Stephen; Humphreys, Hilary

    2014-03-01

    The hospital environment harbors bacteria that may cause health care-associated infections. Microorganisms, such as multiresistant bacteria, can spread around the patient's inanimate environment. Some recently introduced biodecontamination approaches in hospitals have significant limitations due to the toxic nature of the gases and the length of time required for aeration. This study evaluated the in vitro use of cold air plasma as an efficient alternative to traditional methods of biodecontamination of hospital surfaces. Cultures of methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), extended-spectrum-β-lactamase (ESBL)-producing Escherichia coli, and Acinetobacter baumannii were applied to different materials similar to those found in the hospital environment. Artificially contaminated sections of marmoleum, mattress, polypropylene, powder-coated mild steel, and stainless steel were then exposed to a cold air pressure plasma single jet for 30 s, 60 s, and 90 s, operating at approximately 25 W and 12 liters/min flow rate. Direct plasma exposure successfully reduced the bacterial load by log 3 for MRSA, log 2.7 for VRE, log 2 for ESBL-producing E. coli, and log 1.7 for A. baumannii. The present report confirms the efficient antibacterial activity of a cold air plasma single-jet plume on nosocomial bacterially contaminated surfaces over a short period of time and highlights its potential for routine biodecontamination in the clinical environment.

  17. A leading edge heating array and a flat surface heating array - operation, maintenance and repair manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A general description of the leading edge/flat surface heating array is presented along with its components, assembly instructions, installation instructions, operation procedures, maintenance instructions, repair procedures, schematics, spare parts lists, engineering drawings of the array, and functional acceptance test log sheets. The proper replacement of components, correct torque values, step-by-step maintenance instructions, and pretest checkouts are described.

  18. Correlation and persistence of hunting and logging impacts on tropical rainforest mammals.

    PubMed

    Brodie, Jedediah F; Giordano, Anthony J; Zipkin, Elise F; Bernard, Henry; Mohd-Azlan, Jayasilan; Ambu, Laurentius

    2015-02-01

    Humans influence tropical rainforest animals directly via exploitation and indirectly via habitat disturbance. Bushmeat hunting and logging occur extensively in tropical forests and have large effects on particular species. But how they alter animal diversity across landscape scales and whether their impacts are correlated across species remain less known. We used spatially widespread measurements of mammal occurrence across Malaysian Borneo and recently developed multispecies hierarchical models to assess the species richness of medium- to large-bodied terrestrial mammals while accounting for imperfect detection of all species. Hunting was associated with 31% lower species richness. Moreover, hunting remained high even where richness was very low, highlighting that hunting pressure persisted even in chronically overhunted areas. Newly logged sites had 11% lower species richness than unlogged sites, but sites logged >10 years previously had richness levels similar to those in old-growth forest. Hunting was a more serious long-term threat than logging for 91% of primate and ungulate species. Hunting and logging impacts across species were not correlated across taxa. Negative impacts of hunting were the greatest for common mammalian species, but commonness versus rarity was not related to species-specific impacts of logging. Direct human impacts appeared highly persistent and led to defaunation of certain areas. These impacts were particularly severe for species of ecological importance as seed dispersers and herbivores. Indirect impacts were also strong but appeared to attenuate more rapidly than previously thought. The lack of correlation between direct and indirect impacts across species highlights that multifaceted conservation strategies may be needed for mammal conservation in tropical rainforests, Earth's most biodiverse ecosystems. © 2014 Society for Conservation Biology.

  19. Enhancement of photodynamic inactivation of Staphylococcus aureus biofilms by disruptive strategies.

    PubMed

    Gándara, Lautaro; Mamone, Leandro; Bohm, Gabriela Cervini; Buzzola, Fernanda; Casas, Adriana

    2017-11-01

    Photodynamic inactivation (PDI) has been used to inactivate microorganisms through the use of photosensitizers and visible light. Near-infrared treatment (NIRT) also has bactericidal and dispersal effects on biofilms, and dispersal tools of biological origin, such as enzymes, have been employed in antibiotic combination treatments. The aim of this work was to use alternative approaches to increase PDI efficacy, employing combination therapies aimed at the partial disruption of the biofilms, thus potentially increasing photosensitizer or oxygen penetration and interaction with bacteria. To that end, we applied toluidine blue (TB)-PDI treatment to Staphylococcus aureus biofilms previously treated with NIRT or enzymes and investigated the outcome of the combined therapies. TB employed at 0.5 mM by itself induced a 2-log drop in S. aureus RN6390 biofilm viability. NIRT (980-nm laser) and PDI (635-nm laser) treatments each induced a further reduction of 1 log in viable counts. The combination of successive 980- and 635-nm laser treatments on TB-treated biofilms induced additive effects, leading to a 4.5-log decrease in viable counts. Proteinase K treatment applied to S. aureus of the Newman strain induced an additive effect on PDI mortality, leading to an overall 4-log decrease in S. aureus viability. Confocal scanning laser microscopy after biofilm staining with a fluorescent viability test and scanning electron microscopy observations were correlated with colony counts. The NIRT dose employed (227 J/cm^2) led to an increase from 21 to 47 °C in the buffer temperature of the biofilm system, and this NIRT dose also induced 100% keratinocyte death. Further work is needed to establish conditions under which biofilm dispersal occurs at lower NIRT doses.

  20. Extending the Fellegi-Sunter probabilistic record linkage method for approximate field comparators.

    PubMed

    DuVall, Scott L; Kerber, Richard A; Thomas, Alun

    2010-02-01

    Probabilistic record linkage is a method commonly used to determine whether demographic records refer to the same person. The Fellegi-Sunter method is a probabilistic approach that uses field weights based on log likelihood ratios to determine record similarity. This paper introduces an extension of the Fellegi-Sunter method that incorporates approximate field comparators in the calculation of field weights. The data warehouse of a large academic medical center was used as a case study. The approximate comparator extension was compared with the Fellegi-Sunter method in its ability to find duplicate records previously identified in the data warehouse using different demographic fields and matching cutoffs. The approximate comparator extension misclassified 25% fewer pairs and had a larger Welch's T statistic than the Fellegi-Sunter method for all field sets and matching cutoffs. The accuracy gain provided by the approximate comparator extension grew as less information was provided and as the matching cutoff increased. Given the ubiquity of linkage in both clinical and research settings, the incremental improvement of the extension has the potential to make a considerable impact.
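
    A minimal sketch of the idea: a field's agreement weight log2(m/u) and disagreement weight log2((1-m)/(1-u)) are interpolated by an approximate string comparator. The linear interpolation and the similarity measure used here are illustrative choices, not necessarily the paper's exact scheme:

```python
import math
from difflib import SequenceMatcher

def field_weight(a, b, m=0.95, u=0.05):
    """Fellegi-Sunter-style field weight with an approximate comparator.

    Full agreement scores log2(m/u), full disagreement log2((1-m)/(1-u)),
    and partial string similarity interpolates linearly between the two.
    (Linear interpolation is an illustrative choice, not the paper's
    exact extension.)"""
    w_agree = math.log2(m / u)
    w_disagree = math.log2((1 - m) / (1 - u))
    sim = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return w_disagree + sim * (w_agree - w_disagree)

print(field_weight("Jonathan", "Jonathon"))  # close to the agreement weight
```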

  1. Maximizing Submodular Functions under Matroid Constraints by Evolutionary Algorithms.

    PubMed

    Friedrich, Tobias; Neumann, Frank

    2015-01-01

    Many combinatorial optimization problems have underlying goal functions that are submodular. The classical goal is to find a good solution for a given submodular function f under a given set of constraints. In this paper, we investigate the runtime of a simple single-objective evolutionary algorithm called (1 + 1) EA and a multiobjective evolutionary algorithm called GSEMO until they have obtained a good approximation for submodular functions. For the case of monotone submodular functions and uniform cardinality constraints, we show that the GSEMO achieves a (1 - 1/e)-approximation in expected polynomial time. For the case of monotone functions where the constraints are given by the intersection of k ≥ 2 matroids, we show that the (1 + 1) EA achieves a 1/(k + δ)-approximation in expected polynomial time for any constant δ > 0. Turning to nonmonotone symmetric submodular functions with k ≥ 1 matroid intersection constraints, we show that the GSEMO achieves a 1/((k + 2)(1 + ε))-approximation in expected time O(n^(k+6) log(n)/ε).
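
    For context, the (1 - 1/e) guarantee matched by GSEMO under a cardinality constraint is the one achieved by the classical greedy algorithm, sketched below on a toy coverage function:

```python
def greedy_submodular(ground_set, f, k):
    """Classical greedy for a monotone submodular f under |S| <= k.

    Achieves the (1 - 1/e) guarantee mentioned above; shown only as the
    standard baseline, not the evolutionary algorithms of the paper."""
    S = set()
    for _ in range(k):
        best = max((x for x in ground_set if x not in S),
                   key=lambda x: f(S | {x}) - f(S), default=None)
        if best is None:
            break
        S.add(best)  # take the element with the largest marginal gain
    return S

# Toy coverage function: f(S) = size of the union of covered elements.
sets = {"a": {1, 2, 3}, "b": {3, 4}, "c": {5}, "d": {1, 5}}
f = lambda S: len(set().union(*(sets[x] for x in S))) if S else 0
print(greedy_submodular(sets.keys(), f, 2))  # e.g. {'a', 'b'}
```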

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevast'yanov, E A; Sadekova, E Kh

    The Bulgarian mathematicians Sendov, Popov, and Boyanov have well-known results on the asymptotic behaviour of the least deviations of 2π-periodic functions in the classes H^ω from trigonometric polynomials in the Hausdorff metric. However, the asymptotics they give are not adequate to detect a difference in, for example, the rate of approximation of functions f whose moduli of continuity ω(f;δ) differ by factors of the form (log(1/δ))^β. Furthermore, a more detailed determination of the asymptotic behaviour by traditional methods becomes very difficult. This paper develops an approach based on using trigonometric snakes as approximating polynomials. The snakes of order n inscribed in the Minkowski δ-neighbourhood of the graph of the approximated function f provide, in a number of cases, the best approximation for f (for the appropriate choice of δ). The choice of δ depends on n and f and is based on constructing polynomial kernels adjusted to the Hausdorff metric and polynomials with special oscillatory properties. Bibliography: 19 titles.

  3. Description and correlation of reservoir heterogenity within the Big Injun sandstone, Granny Creek field, West Virginia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vargo, A.; McDowell, R.; Matchen, D.

    1992-01-01

    The Granny Creek field (approximately 6 sq. miles in area), located in Clay and Roane counties, West Virginia, produces oil from the Big Injun sandstone (Lower Mississippian). Analysis of 15 cores, 22 core analyses, and approximately 400 wireline logs (gamma ray and bulk density) shows that the Big Injun (approximately 12 to 55 feet thick) can be separated into an upper, coarse-grained sandstone and a lower, fine-grained sandstone. The Big Injun is truncated by an erosional unconformity of Early to Middle Mississippian age which removes the coarse-grained upper unit in the northwest portion of the field. The cores show nodules and zones (1 inch to 6 feet thick) of calcite and siderite cement. Where the cements occur as zones, porosity and permeability are reduced. Thin shales (1 inch to 1 foot thick) are found in the coarse-grained member of the Big Injun, whereas the bottom of the fine-grained, lower member contains intertongues of dark shale which cause pinchouts in porosity at the bottom of the reservoir. Calcite and siderite cement are recognized on wireline logs as high bulk density zones that form horizontal, inclined, and irregular pods of impermeable sandstone. At a 400-foot well spacing, pods may be confined to a single well or encompass as many as 30 wells, creating linear and irregular barriers to flow. These pods increase the length of the fluid flow path and may divide the reservoir into discrete compartments. The combination of sedimentologic and diagenetic features contributes to the heterogeneity observed in the field.

  4. Alternative Antimicrobial Commercial Egg Washing Procedures.

    PubMed

    Hudson, Lauren K; Harrison, Mark A; Berrang, Mark E; Jones, Deana R

    2016-07-01

    Commercial table eggs are washed prior to packaging. Standard wash procedures use an alkaline pH and warm water. If a cool water method could be developed that still provides a microbiologically safe egg, the industry could save energy costs associated with water heating. Four wash procedures were evaluated for Salmonella reduction: pH 11 at 48.9°C (industry standard), pH 11 at ambient temperature (∼20°C), pH 6 at 48.9°C, and pH 6 at ambient temperature. Alkaline washes contained a potassium hydroxide-based detergent, while pH 6 washes contained approximately 200 ppm of chlorine and a proprietary chlorine stabilizer (T-128). When eggs were inoculated by immersion in a cell suspension of Salmonella Enteritidis and Salmonella Typhimurium, all treatments resulted in a slight and similar reduction of Salmonella numbers (approximately 0.77 log CFU/ml of shell emulsion). When eggs were inoculated by droplet on the shell surface, Salmonella counts were reduced by approximately 5 log CFU when washed with chlorine plus the chlorine stabilizer at both temperatures and with the alkaline wash at the high temperature. The reductions in Salmonella by these treatments were not significantly (P > 0.05) different from each other but were significantly (P < 0.05) greater than the reduction observed for the 20°C alkaline treatment and the 20°C control water treatment. Ambient temperature acidic washes reduced Salmonella contamination to the same degree as the standard pH 11 warm water wash and may be a viable option to reduce cost, increase shelf life, and slow pathogen growth in and on shell eggs.

  5. New Insights into the Consequences of Post-Windthrow Salvage Logging Revealed by Functional Structure of Saproxylic Beetles Assemblages

    PubMed Central

    Thorn, Simon; Bässler, Claus; Gottschalk, Thomas; Hothorn, Torsten; Bussler, Heinz; Raffa, Kenneth; Müller, Jörg

    2014-01-01

    Windstorms, bark beetle outbreaks and fires are important natural disturbances in coniferous forests worldwide. Wind-thrown trees promote biodiversity and restoration within production forests, but also cause large economic losses due to bark beetle infestation and accelerated fungal decomposition. Such damaged trees are often removed by salvage logging, which leads to decreased biodiversity and thus increasingly evokes discussions between economists and ecologists about appropriate strategies. To reveal the reasons behind species loss after salvage logging, we used a functional approach based on four habitat-related ecological traits and focused on saproxylic beetles. We predicted that salvage logging would decrease functional diversity (measured as effect sizes of mean pairwise distances using null models) as well as mean values of beetle body size, wood diameter niche and canopy cover niche, but would increase decay stage niche. As expected, salvage logging caused a decrease in species richness, but led to an increase in functional diversity by altering the species composition from habitat-filtered assemblages toward random assemblages. Even though salvage logging removes tree trunks, the most negative effects were found for small and heliophilous species and for species specialized on wood of small diameter. Our results suggested that salvage logging disrupts the natural assembly process on windthrown trees and that negative ecological impacts are caused more by microclimate alteration of the dead-wood objects than by loss of resource amount. These insights underline the power of functional approaches to detect ecosystem responses to anthropogenic disturbance and form a basis for management decisions in conservation. To mitigate negative effects on saproxylic beetle diversity after windthrows, we recommend preserving single windthrown trees or at least their tops with exposed branches during salvage logging. Such an extension of the green-tree retention approach to windthrown trees will preserve natural succession and associated communities of disturbed spruce forests. PMID:25050914

  6. New insights into the consequences of post-windthrow salvage logging revealed by functional structure of saproxylic beetles assemblages.

    PubMed

    Thorn, Simon; Bässler, Claus; Gottschalk, Thomas; Hothorn, Torsten; Bussler, Heinz; Raffa, Kenneth; Müller, Jörg

    2014-01-01

    Windstorms, bark beetle outbreaks and fires are important natural disturbances in coniferous forests worldwide. Wind-thrown trees promote biodiversity and restoration within production forests, but also cause large economic losses due to bark beetle infestation and accelerated fungal decomposition. Such damaged trees are often removed by salvage logging, which leads to decreased biodiversity and thus increasingly evokes discussions between economists and ecologists about appropriate strategies. To reveal the reasons behind species loss after salvage logging, we used a functional approach based on four habitat-related ecological traits and focused on saproxylic beetles. We predicted that salvage logging would decrease functional diversity (measured as effect sizes of mean pairwise distances using null models) as well as mean values of beetle body size, wood diameter niche and canopy cover niche, but would increase decay stage niche. As expected, salvage logging caused a decrease in species richness, but led to an increase in functional diversity by altering the species composition from habitat-filtered assemblages toward random assemblages. Even though salvage logging removes tree trunks, the most negative effects were found for small and heliophilous species and for species specialized on wood of small diameter. Our results suggested that salvage logging disrupts the natural assembly process on windthrown trees and that negative ecological impacts are caused more by microclimate alteration of the dead-wood objects than by loss of resource amount. These insights underline the power of functional approaches to detect ecosystem responses to anthropogenic disturbance and form a basis for management decisions in conservation. To mitigate negative effects on saproxylic beetle diversity after windthrows, we recommend preserving single windthrown trees or at least their tops with exposed branches during salvage logging. Such an extension of the green-tree retention approach to windthrown trees will preserve natural succession and associated communities of disturbed spruce forests.

  7. Meta-analysis of the Effects of Sanitizing Treatments on Salmonella, Escherichia coli O157:H7, and Listeria monocytogenes Inactivation in Fresh Produce

    PubMed Central

    Prado-Silva, Leonardo; Cadavez, Vasco; Gonzales-Barron, Ursula; Rezende, Ana Carolina B.

    2015-01-01

    The aim of this study was to perform a meta-analysis of the effects of sanitizing treatments of fresh produce on Salmonella spp., Escherichia coli O157:H7, and Listeria monocytogenes. From 55 primary studies found to report on such effects, 40 were selected based on specific criteria, leading to more than 1,000 data points on mean log reductions of these three bacterial pathogens, which compromise the safety of fresh produce. Data were partitioned to build three meta-analytical models that could allow the assessment of differences in mean log reductions among pathogens, fresh produce, and sanitizers. Moderating variables assessed in the meta-analytical models included type of fresh produce, type of sanitizer, concentration, and treatment time and temperature. Further, a proposal was made to classify the sanitizers according to bactericidal efficacy by means of a meta-analytical dendrogram. The results indicated that both time and temperature significantly affected the mean log reductions of the sanitizing treatment (P < 0.0001). In general, sanitizer treatments led to lower mean log reductions when applied to leafy greens (for example, 0.68 log reductions [0.00 to 1.37] achieved in lettuce) compared to other, nonleafy vegetables (for example, 3.04 mean log reductions [2.32 to 3.76] obtained for carrots). Among the pathogens, E. coli O157:H7 was more resistant to ozone (1.6 mean log reductions), while L. monocytogenes and Salmonella presented high resistance to organic acids, such as citric acid, acetic acid, and lactic acid (∼3.0 mean log reductions). With regard to the sanitizers, it was found that slightly acidic electrolyzed water, acidified sodium chlorite, and gaseous chlorine dioxide clustered together, indicating that they possessed the strongest bactericidal effect. The results reported here represent an important step toward advancing the global understanding of the effectiveness of sanitizers for the microbial safety of fresh produce. PMID:26362982
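
    The central quantity pooled throughout this meta-analysis, the log reduction, follows directly from before/after counts. A minimal sketch; the counts below are invented, not data from the study:

        import math

        def log_reduction(n_before, n_after):
            """log10 reduction; counts in CFU/g (or CFU/ml)."""
            return math.log10(n_before) - math.log10(n_after)

        # 10^6 CFU/g reduced to 10^3 CFU/g -> 3.0 log reductions (cf. carrots above)
        print(log_reduction(1e6, 1e3))  # 3.0
        # 0.68 mean log reductions (lettuce) is only about a 4.8-fold drop
        print(10 ** 0.68)               # ~4.79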

  8. Vodka and Violence: Alcohol Consumption and Homicide Rates in Russia

    PubMed Central

    Pridemore, William Alex

    2002-01-01

    In Russia, rates of alcohol consumption and homicide are among the highest in the world, and already-high levels increased dramatically after the breakup of the Soviet Union. Rates of both, however, vary greatly among Russia’s 89 regions. We took advantage of newly available vital statistics and socioeconomic data to examine the regional covariation of drinking and lethal violence. Log-log models were employed to estimate the impact of alcohol consumption on regional homicide rates, controlling for structural factors thought to influence the spatial distribution of homicide rates. Results revealed a positive and significant relationship between alcohol consumption and homicide, with a 1% increase in regional consumption of alcohol associated with an approximately 0.25% increase in homicide rates. In Russia, higher regional rates of alcohol consumption are associated with higher rates of homicide. PMID:12453810
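
    The slope of a log-log model is read directly as an elasticity, which is why a 1% rise in consumption maps to an approximately 0.25% rise in homicide above. A hedged illustration on synthetic data (not the Russian regional data):

        import numpy as np

        rng = np.random.default_rng(0)
        consumption = rng.uniform(5, 20, size=89)   # hypothetical regional values
        homicide = 2.0 * consumption**0.25 * rng.lognormal(0.0, 0.1, size=89)

        slope, intercept = np.polyfit(np.log(consumption), np.log(homicide), 1)
        print(f"estimated elasticity ~ {slope:.2f}")  # ~0.25 by construction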

  9. Design of LabVIEW®-based software for the control of sequential injection analysis instrumentation for the determination of morphine

    PubMed Central

    Lenehan, Claire E.; Lewis, Simon W.

    2002-01-01

    LabVIEW®-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 × 10^-10 to 5 × 10^-6 M) with a line of best fit of y = 1.05x + 8.9164 (R^2 = 0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 × 10^-8 M). The limit of detection (3σ) was determined as 5 × 10^-11 M morphine. PMID:18924729
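
    Since the calibration is linear in log-log space, it can be inverted to estimate a concentration from a measured signal. A small sketch using the coefficients quoted above (the 100 mV signal value is hypothetical):

        import math

        def morphine_conc(signal_mv, slope=1.05, intercept=8.9164):
            """Invert y = slope*x + intercept, with y = log10(signal), x = log10(conc in M)."""
            x = (math.log10(signal_mv) - intercept) / slope
            return 10 ** x

        print(morphine_conc(100.0))  # ~2.6e-7 M for a 100 mV signal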

  10. Design of LabVIEW-based software for the control of sequential injection analysis instrumentation for the determination of morphine.

    PubMed

    Lenehan, Claire E; Barnett, Neil W; Lewis, Simon W

    2002-01-01

    LabVIEW-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 × 10^-10 to 5 × 10^-6 M) with a line of best fit of y = 1.05x + 8.9164 (R^2 = 0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 × 10^-8 M). The limit of detection (3σ) was determined as 5 × 10^-11 M morphine.

  11. Vodka and violence: alcohol consumption and homicide rates in Russia.

    PubMed

    Pridemore, William Alex

    2002-12-01

    In Russia, rates of alcohol consumption and homicide are among the highest in the world, and already-high levels increased dramatically after the breakup of the Soviet Union. Rates of both, however, vary greatly among Russia's 89 regions. We took advantage of newly available vital statistics and socioeconomic data to examine the regional covariation of drinking and lethal violence. Log-log models were employed to estimate the impact of alcohol consumption on regional homicide rates, controlling for structural factors thought to influence the spatial distribution of homicide rates. Results revealed a positive and significant relationship between alcohol consumption and homicide, with a 1% increase in regional consumption of alcohol associated with an approximately 0.25% increase in homicide rates. In Russia, higher regional rates of alcohol consumption are associated with higher rates of homicide.

  12. Titanium dioxide/UV photocatalytic disinfection in fresh carrots.

    PubMed

    Cho, Mihee; Choi, Yoonjung; Park, Hyojin; Kim, Kwansik; Woo, Gun-Jo; Park, Jiyong

    2007-01-01

    Increased occurrences of fresh produce-related outbreaks of foodborne illness have focused attention on effective washing processes for fruits and vegetables. A titanium dioxide (TiO2) photocatalytic reaction under UV radiation provides a high rate of disinfection. The photo-killing effects of TiO2 on bacteria in liquid cultures under experimental conditions have been widely studied. However, the disinfection effects of the TiO2 photocatalytic reaction on fresh vegetables during a washing process have not been evaluated. Our objectives were to design a pilot-scale TiO2/UV photocatalytic reactor for fresh carrots and to compare the bactericidal effects of the TiO2/UV reaction against bacteria in liquid media and on carrots. TiO2/UV photocatalytic reactions for 40, 60, and 30 s were required for the complete killing of Escherichia coli, Salmonella Typhimurium, and Bacillus cereus (initial counts of approximately 6.7 log CFU/ml), respectively. The counts of total aerobic bacteria in fresh carrots and foodborne pathogenic bacteria in inoculated carrots were also measured. Counts of total aerobic bacteria were reduced by 1.8 log CFU/g after TiO2/UV photocatalytic disinfection for 20 min compared with a 1.1-log CFU/g reduction by UV alone. E. coli, Salmonella Typhimurium, and B. cereus (8 log CFU/ml) were inoculated onto carrots, and the number of surviving bacteria in carrots was determined after treatment. The TiO2/UV treatment exhibited 2.1-, 2.3-, and 1.8-log CFU/g reductions in the counts of E. coli, Salmonella Typhimurium, and B. cereus, respectively, compared with 1.3-, 1.2-, and 1.2-log CFU/g reductions by UV alone. The TiO2/UV photocatalyst reaction showed significant bactericidal effects, indicating that this process is applicable to nonthermal disinfection of fresh vegetables.

  13. Three-dimensional trend mapping from wire-line logs

    USGS Publications Warehouse

    Doveton, J.H.; Ke-an, Z.

    1985-01-01

    Mapping of lithofacies and porosities of stratigraphic units is complicated because these properties vary in three dimensions. The method of moments was proposed by Krumbein and Libby (1957) as a technique to aid in resolving this problem. Moments are easily computed from wireline logs and are simple statistics which summarize vertical variation in a log trace. Combinations of moment maps have proved useful in understanding vertical and lateral changes in lithology of sedimentary rock units. Although moments have meaning both as statistical descriptors and as mechanical properties, they also define polynomial curves which approximate lithologic changes as a function of depth. These polynomials can be fitted by least-squares methods, partitioning major trends in rock properties from fine-scale fluctuations. Analysis of variance yields the degree of fit of any polynomial and measures the proportion of vertical variability expressed by any moment or combination of moments. In addition, polynomial curves can be differentiated to determine depths at which pronounced expressions of facies occur and to determine the locations of boundaries between major lithologic subdivisions. Moments can be estimated at any location in an area by interpolating from log moments at control wells. A matrix algebra operation then converts moment estimates to coefficients of a polynomial function which describes a continuous curve of lithologic variation with depth. If this procedure is applied to a grid of geographic locations, the result is a model of variability in three dimensions. Resolution of the model is determined largely by the number of moments used in its generation. The method is illustrated with an analysis of lithofacies in the Simpson Group of south-central Kansas; the three-dimensional model is shown as cross sections and slice maps. In this study, the gamma-ray log is used as a measure of shaliness of the unit. However, the method is general and can be applied, for example, to suites of neutron, density, or sonic logs to produce three-dimensional models of porosity in reservoir rocks. © 1985 Plenum Publishing Corporation.
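
    A rough sketch of the trend-polynomial idea, under assumed data: fit a low-order polynomial to a gamma-ray trace as a function of depth, then differentiate it to locate the depths of most pronounced lithologic change. The synthetic log below stands in for a real wireline trace:

        import numpy as np

        depth = np.linspace(0, 100, 200)  # m below the unit top (synthetic)
        gamma = 60 + 25 * np.sin(depth / 18) \
                + np.random.default_rng(2).normal(0, 3, 200)

        trend = np.poly1d(np.polyfit(depth, gamma, deg=4))  # trend polynomial
        slope = trend.deriv()                               # d(gamma)/d(depth)

        # Depths where the trend changes fastest ~ candidate facies boundaries
        inflections = slope.deriv().roots
        print(np.sort(inflections[np.isreal(inflections)].real))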

  14. Influence of the Natural Microbial Flora on the Acid Tolerance Response of Listeria monocytogenes in a Model System of Fresh Meat Decontamination Fluids

    PubMed Central

    Samelis, John; Sofos, John N.; Kendall, Patricia A.; Smith, Gary C.

    2001-01-01

    Depending on its composition and metabolic activity, the natural flora that may be established in a meat plant environment can affect the survival, growth, and acid tolerance response (ATR) of bacterial pathogens present in the same niche. To investigate this hypothesis, changes in populations and ATR of inoculated (10^5 CFU/ml) Listeria monocytogenes were evaluated at 35°C in water (10 or 85°C) or acidic (2% lactic or acetic acid) washings of beef with or without prior filter sterilization. The model experiments were performed at 35°C rather than lower (≤15°C) temperatures to maximize the response of inoculated L. monocytogenes in the washings with or without competitive flora. Acid solution washings were free (<1.0 log CFU/ml) of natural flora before inoculation (day 0), and no microbial growth occurred during storage (35°C, 8 days). Inoculated L. monocytogenes died off (negative enrichment) in acid washings within 24 h. In nonacid (water) washings, the pathogen increased (approximately 1.0 to 2.0 log CFU/ml), irrespective of natural flora, which, when present, predominated (>8.0 log CFU/ml) by day 1. The pH of inoculated water washings decreased or increased depending on absence or presence of natural flora, respectively. These microbial and pH changes modulated the ATR of L. monocytogenes at 35°C. In filter-sterilized water washings, inoculated L. monocytogenes increased its ATR by at least 1.0 log CFU/ml from days 1 to 8, while in unfiltered water washings the pathogen was acid tolerant at day 1 (0.3 to 1.4 log CFU/ml reduction) and became acid sensitive (3.0 to >5.0 log CFU/ml reduction) at day 8. These results suggest that the predominant gram-negative flora of an aerobic fresh meat plant environment may sensitize bacterial pathogens to acid. PMID:11375145

  15. Quantification and characterization of microbial biofilm community attached on the surface of fermentation vessels used in green table olive processing.

    PubMed

    Grounta, Athena; Doulgeraki, Agapi I; Panagou, Efstathios Z

    2015-06-16

    The aim of the present study was the quantification of biofilm formed on the surface of plastic vessels used in Spanish-style green olive fermentation and the characterization of the biofilm community by means of molecular fingerprinting. Fermentation vessels previously used in green olive processing were subjected to sampling at three different locations, two on the side and one on the bottom of the vessel. Prior to sampling, two cleaning treatments were applied to the containers, including (a) washing with hot tap water (60 °C) and household detergent (treatment A) and (b) washing with hot tap water, household detergent and bleach (treatment B). Populations (expressed as log CFU/cm^2) of total viable counts (TVC), lactic acid bacteria (LAB), and yeasts were enumerated by standard plating. Bulk cells (whole colonies) from agar plates were isolated for further characterization by PCR-DGGE. Results showed that regardless of the cleaning treatment no significant differences were observed between the different sampling locations in the vessel. The initial microbial population before cleaning ranged between 3.0-4.5 log CFU/cm^2 for LAB and 4.0-4.6 log CFU/cm^2 for yeasts. Cleaning treatments had the greatest effect on LAB, which were recovered at 1.5 log CFU/cm^2 after treatment A and 0.2 log CFU/cm^2 after treatment B, whereas yeasts were recovered at approximately 1.9 log CFU/cm^2 even after treatment B. High diversity of yeasts was observed between the different treatments and sampling spots. The most abundant species recovered belonged to the genus Candida, while Wickerhamomyces anomalus, Debaryomyces hansenii and Pichia guilliermondii were frequently detected. Among LAB, Lactobacillus pentosus was the most abundant species present on the abiotic surface of the vessels. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Thermal Inactivation of Listeria monocytogenes and Salmonella during Water and Steam Blanching of Vegetables.

    PubMed

    Ceylan, Erdogan; McMahon, Wendy; Garren, Donna M

    2017-09-01

    Thermal inactivation of Listeria monocytogenes and Salmonella was evaluated on peas, spinach, broccoli, potatoes, and carrots that were treated with hot water and steam. One gram-positive bacterium, L. monocytogenes, and one gram-negative bacterium, Salmonella, were selected as pertinent human pathogens for evaluation. Samples were inoculated with a composite of five strains each of L. monocytogenes and Salmonella to achieve approximately 10^8 to 10^9 CFU/g. Inoculated samples were treated with hot water at 85 and 87.8°C and with steam at 85 and 96.7°C for up to 3.5 min. A greater than 5-log reduction of L. monocytogenes and Salmonella was achieved on all products within 0.5 min by hot water blanching at 85 and 87.8°C. Steam blanching at 85°C reduced Salmonella populations by greater than 5 log on spinach and peas within 2 min and on carrots and broccoli within 3.5 min. Populations of Salmonella were reduced by more than 5 log within 1 min on carrot, spinach, and broccoli and within 2 min on peas by steam blanching at 96.7°C. Steam blanching at 85°C reduced L. monocytogenes populations by more than 5 log on carrots and spinach within 2 min and on broccoli and peas within 3.5 min. L. monocytogenes populations were reduced more than 5 log within 1 min on carrot, spinach, peas and broccoli by steam blanching at 96.7°C. Longer treatment times and higher temperatures were required for steam-blanched samples than for samples blanched with hot water. Results suggest that hot water and steam blanching practices commonly used by the frozen vegetable industry will achieve the desired 5-log lethality of L. monocytogenes and Salmonella and will enhance microbiological safety prior to freezing.

  17. LBA-ECO TG-07 Ground-based Biometry Data at km 83 Site, TapajosNational Forest: 1997

    Treesearch

    M.M. Keller; M.W. Palace

    2009-01-01

    A field inventory of trees was conducted in March of 1997 in a logging concession at the Tapajos National Forest, south of Santarem, Para, Brazil. The inventory was conducted by the foresters and technicians of the Tropical Forest Foundation (FFT) and included all trees with diameter at breast height greater than or equal to 35 cm. Four blocks of approximately 100 ha...

  18. Measurement and Analysis of Failures in Computer Systems

    NASA Technical Reports Server (NTRS)

    Thakur, Anshuman

    1997-01-01

    This thesis presents a study of software failures spanning several different releases of Tandem's NonStop-UX operating system running on Tandem Integrity S2(TMR) systems. NonStop-UX is based on UNIX System V and is fully compliant with industry standards, such as the X/Open Portability Guide, the IEEE POSIX standards, and the System V Interface Definition (SVID) extensions. In addition to providing a general UNIX interface to the hardware, the operating system has built-in recovery mechanisms and audit routines that check the consistency of the kernel data structures. The analysis is based on data on software failures and repairs collected from Tandem's product report (TPR) logs for a period exceeding three years. A TPR log is created when a customer or an internal developer observes a failure in a Tandem Integrity system. This study concentrates primarily on those TPRs that report a UNIX panic that subsequently crashes the system. Approximately 200 of the TPRs fall into this category. Approximately 50% of the failures reported are from field systems, and the rest are from the testing and development sites. It has been observed by Tandem developers that fewer cases are encountered from the field than from the test centers. Thus, the data selection mechanism has introduced a slight skew.

  19. Cortical circuitry implementing graphical models.

    PubMed

    Litvak, Shai; Ullman, Shimon

    2009-11-01

    In this letter, we develop and simulate a large-scale network of spiking neurons that approximates the inference computations performed by graphical models. Unlike previous related schemes, which used sum and product operations in either the log or linear domains, the current model uses an inference scheme based on the sum and maximization operations in the log domain. Simulations show that using these operations, a large-scale circuit, which combines populations of spiking neurons as basic building blocks, is capable of finding close approximations to the full mathematical computations performed by graphical models within a few hundred milliseconds. The circuit is general in the sense that it can be wired for any graph structure, it supports multistate variables, and it uses standard leaky integrate-and-fire neuronal units. Following previous work, which proposed relations between graphical models and the large-scale cortical anatomy, we focus on the cortical microcircuitry and propose how anatomical and physiological aspects of the local circuitry may map onto elements of the graphical model implementation. We discuss in particular the roles of three major types of inhibitory neurons (small fast-spiking basket cells, large layer 2/3 basket cells, and double-bouquet neurons), subpopulations of strongly interconnected neurons with their unique connectivity patterns in different cortical layers, and the possible role of minicolumns in the realization of the population-based maximum operation.
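
    The inference scheme described, sums and maximizations in the log domain, is the max-sum (Viterbi-style) recursion. A minimal sketch on a three-node chain with made-up potentials:

        import numpy as np

        log_unary = np.log(np.array([[0.6, 0.4],    # node 0
                                     [0.5, 0.5],    # node 1
                                     [0.2, 0.8]]))  # node 2
        log_pair = np.log(np.array([[0.7, 0.3],
                                    [0.3, 0.7]]))   # shared edge potential

        # Forward pass: sums replace products, maxima replace sums
        msg01 = np.max(log_unary[0][:, None] + log_pair, axis=0)
        msg12 = np.max((log_unary[1] + msg01)[:, None] + log_pair, axis=0)
        print("MAP state of node 2:", np.argmax(log_unary[2] + msg12))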

  20. Are galaxy distributions scale invariant? A perspective from dynamical systems theory

    NASA Astrophysics Data System (ADS)

    McCauley, J. L.

    2002-06-01

    Unless there is evidence for fractal scaling with a single exponent over distances 0.1 ≤ r ≤ 100 h^-1 Mpc, the widely accepted notion of scale invariance of the correlation integral for 0.1 ≤ r ≤ 10 h^-1 Mpc must be questioned. The attempt to extract a scaling exponent ν from the correlation integral n(r) by plotting log(n(r)) vs. log(r) is unreliable unless the underlying point set is approximately monofractal. The extraction of a spectrum of generalized dimensions ν_q from a plot of the correlation integral generating function G_n(q) by a similar procedure is probably an indication that G_n(q) does not scale at all. We explain these assertions after defining the term multifractal, mutually inconsistent definitions having been confused together in the cosmology literature. Part of this confusion is traced to the confusion in interpreting a measure-theoretic formula written down by Hentschel and Procaccia in the dynamical systems theory literature, while other errors follow from confusing together entirely different definitions of multifractal from two different schools of thought. Most important are serious errors in data analysis that follow from taking for granted a largest-term approximation that is inevitably advertised in the literature on both fractals and dynamical systems theory.
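
    The criticized procedure itself is easy to state: estimate the exponent as the slope of log n(r) against log r. A sketch on a synthetic, genuinely monofractal point set (uniform in 2D, so the slope should come out near 2):

        import numpy as np
        from scipy.spatial.distance import pdist

        pts = np.random.default_rng(3).uniform(size=(800, 2))
        d = pdist(pts)                                   # all pairwise distances
        r = np.logspace(-2, -0.5, 12)
        n_r = np.array([(d < ri).mean() for ri in r])    # correlation integral

        slope, _ = np.polyfit(np.log(r), np.log(n_r), 1)
        print(f"scaling exponent ~ {slope:.2f}")  # meaningful only if monofractal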

  1. Existence of ground state of an electron in the BDF approximation

    NASA Astrophysics Data System (ADS)

    Sok, Jérémy

    2014-05-01

    The Bogoliubov-Dirac-Fock (BDF) model allows us to describe relativistic electrons interacting with the Dirac sea. It can be seen as a mean-field approximation of Quantum Electrodynamics (QED) where photons are neglected. This paper treats the case of an electron together with the Dirac sea in the absence of any external field. Such a system is described by its one-body density matrix, an infinite-rank, self-adjoint operator. The parameters of the model are the coupling constant α > 0 and the ultraviolet cut-off Λ > 0: we consider the subspace of square-integrable functions whose Fourier transform vanishes outside the ball B(0, Λ). We prove the existence of minimizers of the BDF energy under the charge constraint of one electron and no external field provided that α, Λ^-1 and α log(Λ) are sufficiently small. The interpretation is the following: in this regime the electron creates a polarization in the Dirac vacuum which allows it to bind. We then study the non-relativistic limit of such a system in which the speed of light tends to infinity (or equivalently α tends to zero) with α log(Λ) fixed: after rescaling and translation the electronic solution tends to a Choquard-Pekar ground state.

  2. Spectral Energy Distribution Fitting of Hetdex Pilot Survey Ly-alpha Emitters in Cosmos and Goods-N

    NASA Technical Reports Server (NTRS)

    Hagen, Alex; Ciardullo, Robin; Cronwall, Caryl; Acquaviva, Viviana; Bridge, Joanna; Zeimann, Gregory R.; Blanc, Guillermo; Bond, Nicholas; Finkelstein, Steven L.; Song, Mimi

    2014-01-01

    We use broadband photometry extending from the rest-frame UV to the near-IR to fit the individual spectral energy distributions of 63 bright (L(Lyα) > 10^43 erg s^-1) Lyα-emitting galaxies (LAEs) in the redshift range 1.9 < z < 3.6. We find that these LAEs are quite heterogeneous, with stellar masses that span over three orders of magnitude, from 7.5 < log(M/M_sun) < 10.5. Moreover, although most LAEs have small amounts of extinction, some high-mass objects have stellar reddenings as large as E(B-V) ≈ 0.4. Interestingly, in dusty objects the optical depths for Lyα and the UV continuum are always similar, indicating that Lyα photons are not undergoing many scatters before escaping their galaxy. In contrast, the ratio of optical depths in low-reddening systems can vary widely, illustrating the diverse nature of the systems. Finally, we show that in the star-formation-rate versus log-mass diagram, our LAEs fall above the "main sequence" defined by z ≈ 3 continuum-selected star-forming galaxies. In this respect, they are similar to submillimeter-selected galaxies, although most LAEs have much lower mass.

  3. Fractured-aquifer hydrogeology from geophysical logs: Brunswick group and Lockatong Formation, Pennsylvania

    USGS Publications Warehouse

    Morin, Roger H.; Senior, Lisa A.; Decker, Edward R.

    2000-01-01

    The Brunswick Group and the underlying Lockatong Formation are composed of lithified Mesozoic sediments that constitute part of the Newark Basin in southeastern Pennsylvania. These fractured rocks form an important regional aquifer that consists of gradational sequences of shale, siltstone, and sandstone, with fluid transport occurring primarily in fractures. An extensive suite of geophysical logs was obtained in seven wells located at the borough of Lansdale, Pennsylvania, in order to better characterize the areal hydrogeologic system and provide guidelines for the refinement of numerical ground water models. Six of the seven wells are approximately 120 m deep and the seventh extends to a depth of 335 m. Temperature, fluid conductivity, and flowmeter logs are used to locate zones of fluid exchange and to quantify transmissivities. Electrical resistivity and natural gamma logs together yield detailed stratigraphic information, and digital acoustic televiewer data provide magnetically oriented images of the borehole wall from which almost 900 fractures are identified. Analyses of the geophysical data indicate that the aquifer penetrated by the deep well can be separated into two distinct structural domains, which may, in turn, reflect different mechanical responses to basin extension by different sedimentary units: (1) In the shallow zone (above 125 m), the dominant fracture population consists of gently dipping bedding plane partings that strike N46°E and dip to the northwest at about 11 degrees. Fluid flow is concentrated in the upper 80 m along these subhorizontal fractures, with transmissivities rapidly diminishing in magnitude with depth. (2) The zone below 125 m marks the appearance of numerous high-angle fractures that are orthogonal to the bedding planes, striking parallel but dipping steeply southeast at 77 degrees. This secondary set of fractures is associated with a fairly thick (approximately 60 m) high-resistivity, low-transmissivity sandstone unit that is abruptly terminated by a thin shale bed at a depth of 190 m. This lower contact effectively delineates the aquifer's vertical extent at this location because no detectable evidence of ground water movement is found below it. Thus, fluid flow is controlled by fractures, but fracture type and orientation are related to lithology. Finally, a transient thermal-conduction model is successfully applied to simulate observed temperature logs, thereby confirming the effects of ground-surface warming that occurred in the area as a result of urbanization at the turn of the century. The systematic warming of the upper 120 m has increased the transmissivity of this aquifer by almost 10%, simply due to changes in fluid viscosity and density.

  4. Variability of the Degassing Flux of 4He as an impact of 4He-Dating of Groundwaters

    NASA Astrophysics Data System (ADS)

    Torgersen, T.

    2009-12-01

    4He dating of groundwater is often confounded by an external flux of 4He resulting from crustal degassing. Estimates of this external flux have been made, but what is the impact on estimates of the 4He groundwater age? The existing measures of the 4He flux across the Earth’s solid surface have been evaluated collectively. The time-and-area-weighted arithmetic mean (standard deviation) of n=33 4He degassing fluxes is 3.32(±0.45) × 10^10 4He atoms m^-2 s^-1. The log-normal mean of 271 measures of the flux into Precambrian shield lakes of Canada is 4.57 × 10^10 atoms 4He m^-2 s^-1 with a multiplicative variance of ×/÷3.9. The log-normal mean of measurements (n=33) of the crustal flux is 3.63 × 10^10 4He m^-2 s^-1 with a best-estimate one-sigma log-normal error of ×/÷36, based on an assumption of symmetric error bars. (For comparison, the log-normal mean heat flow is 62.2 mW m^-2 with a log-normal variance of ×/÷1.8; the best-estimate mean is 65±1.6 mW m^-2, Polach et al., 1993.) The variance of the continental flux is shown to increase with decreasing time scales (×/÷ ~10^6 at 0.5 yr) and decreasing space scales (×/÷ ~10^6 at 1 km), suggesting that the mechanisms of crustal helium transport and degassing contain a high degree of spatial and temporal variability. This best estimate of the mean and variance in the flux of 4He from continents remains approximately equivalent to the radiogenic production rate of 4He in the whole crust. The small degree of variance in the Canadian lake data (n=271), from Precambrian terrain, suggests that it may represent a best approximation of “steady state” crustal degassing. Large-scale vertical mass transport in continental crust is estimated, as scaled values, to be of the order of 10^-5 cm^2 s^-1 for helium (over 2 Byr and 40 km vertically) vs. 10^-2 cm^2 s^-1 for heat. The mass transport rate requires not only release of 4He from the solid phase via fracturing or comminution but also an enhanced rate of mass transport facilitated by some degree of fluid advection (as has been suggested by metamorphic geology), and further implies a separation of heat and mass during transport.
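
    The ×/÷ ("times/divide") notation above is the multiplicative spread of a log-normal distribution, exp(sd of ln(flux)). A short sketch with invented flux values:

        import numpy as np

        fluxes = np.array([1.2e10, 3.5e10, 0.8e10, 9.1e10, 4.4e10])  # atoms m^-2 s^-1 (invented)
        ln_f = np.log(fluxes)
        geo_mean = np.exp(ln_f.mean())           # log-normal (geometric) mean
        mult_sd = np.exp(ln_f.std(ddof=1))       # one-sigma ×/÷ factor

        print(f"mean = {geo_mean:.2e}, spread = ×/÷{mult_sd:.1f}")
        # One-sigma range runs from geo_mean/mult_sd to geo_mean*mult_sd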

  5. 77 FR 10451 - Fishing Tackle Containing Lead; Disposition of Petition Filed Pursuant to TSCA Section 21

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  6. ASSESSING STREAM BED STABILITY AND EXCESS SEDIMENTATION IN MOUNTAIN STREAMS

    EPA Science Inventory

    Land use and resource exploitation in headwaters catchments?such as logging, mining, and road building?often increase sediment supply to streams, potentially causing excess sedimentation. Decreases in mean substrate size and increases in fine stream bed sediments can lead to inc...

  7. Survival and Metabolic Activity of Listeria monocytogenes on Ready-to-Eat Roast Beef Stored at 4 °C.

    PubMed

    Broady, Johnathan W; Han, Dong; Yuan, Jing; Liao, Chao; Bratcher, Christy L; Lilies, Mark R; Schwartz, Elizabeth H; Wang, Luxin

    2016-07-01

    Three brands of commercial roast beef were purchased and artificially inoculated with a 5-strain Listeria monocytogenes cocktail at 2 inoculation levels (approximately 3 and 6 Log CFU/g). Although all 3 brands contained sodium diacetate and sodium lactate, the inoculated Listeria cocktail survived for 16 d in all 3 brands; significant increases in L. monocytogenes numbers were seen on inoculated Brand B roast beef on days 12 and 16. Numbers of L. monocytogenes increased to 4.14 Log CFU/g for the 3 Log CFU/g inoculation level and to 7.99 Log CFU/g for the 6 Log CFU/g inoculation level by day 16, with pH values of 5.4 and 5.8, respectively. To measure cell viability in the potential biofilms formed, an Alamar blue assay was conducted. Brand B meat homogenate had the highest metabolic activities (P < 0.05). Comparison of its metabolic activities with those of Brands A and C and of the inoculated autoclaved meat homogenates indicated that the microflora present in Brand B may be the reason for the high metabolic activities. Based on the denaturing gradient gel electrophoresis and Shannon-Wiener diversity index analyses, the "Brand" factor significantly impacted the diversity index (P = 0.012), and Brand B had the highest microflora diversity (Shannon index 1.636 ± 0.011). These results show that antimicrobials cannot completely inhibit the growth of L. monocytogenes in ready-to-eat roast beef. Native microflora (both diversity and abundance), together with product formulation, pH, antimicrobial concentrations, and storage conditions, may all impact the survival and growth of L. monocytogenes. © 2016 Institute of Food Technologists®
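
    For reference, the Shannon-Wiener index used to compare the brands' microflora is H = -Σ p_i ln p_i over the relative abundances. A toy computation (band counts invented, not the study's DGGE data):

        import numpy as np

        counts = np.array([40, 25, 20, 10, 5])  # colonies per DGGE band (invented)
        p = counts / counts.sum()
        H = -np.sum(p * np.log(p))
        print(f"Shannon index H = {H:.3f}")      # larger H = more diverse microflora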

  8. Wide-range high-resolution transmission electron microscopy reveals morphological and distributional changes of endomembrane compartments during log to stationary transition of growth phase in tobacco BY-2 cells.

    PubMed

    Toyooka, Kiminori; Sato, Mayuko; Kutsuna, Natsumaro; Higaki, Takumi; Sawaki, Fumie; Wakazaki, Mayumi; Goto, Yumi; Hasezawa, Seiichiro; Nagata, Noriko; Matsuoka, Ken

    2014-09-01

    Rapid growth of plant cells by cell division and expansion requires an endomembrane trafficking system. The endomembrane compartments, such as the Golgi stacks, endosome and vesicles, are important in the synthesis and trafficking of cell wall materials during cell elongation. However, changes in the morphology, distribution and number of these compartments during the different stages of cell proliferation and differentiation have not yet been clarified. In this study, we examined these changes at the ultrastructural level in tobacco Bright yellow 2 (BY-2) cells during the log and stationary phases of growth. We analyzed images of the BY-2 cells prepared by the high-pressure freezing/freeze substitution technique with the aid of an auto-acquisition transmission electron microscope system. We quantified the distribution of secretory and endosomal compartments in longitudinal sections of whole cells by using wide-range gigapixel-class images obtained by merging thousands of transmission electron micrographs. During the log phase, all Golgi stacks were composed of several thick cisternae. Approximately 20 vesicle clusters (VCs), including the trans-Golgi network and secretory vesicle cluster, were observed throughout the cell. In the stationary-phase cells, Golgi stacks were thin with small cisternae, and only a few VCs were observed. Nearly the same number of multivesicular body and small high-density vesicles were observed in both the stationary and log phases. Results from electron microscopy and live fluorescence imaging indicate that the morphology and distribution of secretory-related compartments dramatically change when cells transition from log to stationary phases of growth. © The Author 2014. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  9. Al(III), Pd(II), and Zn(II) phthalocyanines for inactivation of dental pathogen Aggregatibacter actinomycetemcomitans as planktonic and biofilm-cultures

    NASA Astrophysics Data System (ADS)

    Kussovski, V.; Mantareva, V.; Angelov, I.; Avramov, L.; Popova, E.; Dimitrov, S.

    2012-06-01

    The Gram-negative, oral bacterium Aggregatibacter actinomycetemcomitans has been implicated as the causative agent of several forms of periodontal disease in humans. New periodontal disease treatments are emerging to prevent infection progression. Antimicrobial photodynamic therapy (a-PDT) can be a useful tool for this purpose. It involves the use of light of specific wavelength to activate a nontoxic photosensitizing agent in the presence of oxygen for eradication of target cells, and appears effective in photoinactivation of microorganisms. The phthalocyanine metal complexes of Pd(II)- (PdPcC) and Al(III)- (AlPc1) were evaluated as photodynamic sensitizers towards the dental pathogen A. actinomycetemcomitans in comparison to the known methylpyridyloxy-substituted Zn(II) phthalocyanine (ZnPcMe). The planktonic and biofilm-cultivated species of A. actinomycetemcomitans were treated. The photophysical results showed intensive and far-red absorbance with a high tendency of aggregation for Pd(II)-phthalocyanine. The dark toxicities of both photosensitizers were negligible at the concentrations used (<0.5 log decrease of viable cells). The photodynamic response for planktonic-cultured bacteria was full photoinactivation after a-PDT with ZnPcMe. In the case of the newly studied complexes, the effect was lower for PdPcC (4 log) as well as for AlPc1 (1.5-2 log). As is known, bacterial biofilms are more resistant to a-PDT, which was confirmed for A. actinomycetemcomitans biofilms with 3 log reductions of viable cells after treatment with ZnPcMe and approximately 1 log reduction of biofilms after PdPcC and AlPc1. The initial results suggest that a-PDT can be useful for effective inactivation of the dental pathogen A. actinomycetemcomitans.

  10. Survival and transfer of murine norovirus 1, a surrogate for human noroviruses, during the production process of deep-frozen onions and spinach.

    PubMed

    Baert, Leen; Uyttendaele, Mieke; Vermeersch, Mattias; Van Coillie, Els; Debevere, Johan

    2008-08-01

    The reduction of murine norovirus 1 (MNV-1) on onions and spinach by washing was investigated as was the risk of contamination during the washing procedure. To decontaminate wash water, the industrial sanitizer peracetic acid (PAA) was added to the water, and the survival of MNV-1 was determined. In contrast to onions, spinach undergoes a heat treatment before freezing. Therefore, the resistance of MNV-1 to blanching of spinach was examined. MNV-1 genomic copies were detected with a real-time reverse transcription PCR assay in PAA-treated water and blanched spinach, and PFUs (representing infectious MNV-1 units) were determined with a plaque assay. A ≤1-log reduction in MNV-1 PFUs was achieved by washing onion bulbs and spinach leaves. More than 3 log PFU of MNV-1 was transmitted to onion bulbs and spinach leaves when these vegetables were washed in water containing approximately 5 log PFU/ml. No decline of MNV-1 occurred in used industrial spinach wash water after 6 days at room temperature. A concentration of 20 ppm of PAA in demineralized water (pH 4.13) and in potable water (pH 7.70) resulted in reductions of 2.88 ± 0.25 and 2.41 ± 0.18 log PFU, respectively, after 5 min of exposure, but no decrease in number of genomic copies was observed. No reduction of MNV-1 PFUs was observed on frozen onions or spinach during storage for 6 months. Blanching spinach (80 degrees C for 1 min) resulted in at least 2.44-log reductions of infectious MNV-1, but many genomic copies were still present.

  11. Destruction of Listeria monocytogenes in sturgeon (Acipenser transmontanus) caviar by a combination of nisin with chemical antimicrobials or moderate heat.

    PubMed

    Al-Holy, M; Lin, M; Rasco, B

    2005-03-01

    The objective of this study was to investigate the effect of nisin in combination with heat or antimicrobial chemical treatments (such as lactic acid, chlorous acid, and sodium hypochlorite) on the inhibition of Listeria monocytogenes and total mesophiles in sturgeon (Acipenser transmontanus) caviar. The effects of nisin (250, 500, 750, and 1,000 IU/ml), lactic acid (1, 2, and 3%), chlorous acid (134 and 268 ppm), sodium hypochlorite (150 and 300 ppm), and heat at 60 degrees C for 3 min were evaluated for a five-strain mixture of L. monocytogenes and total mesophiles in sturgeon caviar containing 3.5% salt. Selected combinations of these antimicrobial treatments were also tested. Injured and viable L. monocytogenes cells were recovered using an overlay method. Treating caviar with ≥500 IU/ml nisin initially reduced L. monocytogenes by 2 to 2.5 log units. Chlorous acid (268 ppm) reduced L. monocytogenes from 7.7 log units to undetectable (<0.48 log units) after 4 days of storage at 4 degrees C. However, there were no synergistic effects observed for combinations of nisin (500 or 750 IU/ml) plus either lactic acid or chlorous acid. Lactic acid caused a slight reduction (approximately 1 log unit) in the microbial load during a 6-day period at 4 degrees C. Sodium hypochlorite was ineffective at the levels tested. Mild heating (60 degrees C for 3 min) with nisin synergistically reduced viable counts of L. monocytogenes and total mesophiles. No L. monocytogenes cells (<0.48 log units) were recovered from caviar treated with heat and nisin (750 IU/ml) after a storage period of 28 days at 4 degrees C.

  12. Erratum to: Constraining couplings of top quarks to the Z boson in $$ t\\overline{t} $$ + Z production at the LHC

    DOE PAGES

    Röntsch, Raoul; Schulze, Markus

    2015-09-21

    We study top quark pair production in association with a Z boson at the Large Hadron Collider (LHC) and investigate the prospects of measuring the couplings of top quarks to the Z boson. To date these couplings have not been constrained in direct measurements. Such a determination will be possible for the first time at the LHC. Our calculation improves previous coupling studies through the inclusion of next-to-leading order (NLO) QCD corrections in production and decays of all unstable particles. We treat top quarks in the narrow-width approximation and retain all NLO spin correlations. To determine the sensitivity of a coupling measurement we perform a binned log-likelihood ratio test based on normalization and shape information of the angle between the leptons from the Z boson decay. The obtained limits account for statistical uncertainties as well as leading theoretical systematics from residual scale dependence and parton distribution functions. We use current CMS data to place the first direct constraints on the ttbZ couplings. We also consider the upcoming high-energy LHC run and find that with 300 fb^-1 of data at an energy of 13 TeV the vector and axial ttbZ couplings can be constrained at the 95% confidence level to C_V=0.24^{+0.39}_{-0.85} and C_A=-0.60^{+0.14}_{-0.18}, where the central values are the Standard Model predictions. This is a reduction of uncertainties by 25% and 42%, respectively, compared to an analysis based on leading-order predictions. We also translate these results into limits on dimension-six operators contributing to the ttbZ interactions beyond the Standard Model.
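
    A binned log-likelihood ratio test of the kind described can be sketched with Poisson bin counts of the lepton angle under two coupling hypotheses. The templates and pseudo-data below are invented, not the paper's distributions:

        import numpy as np
        from scipy.stats import poisson

        sm_expected  = np.array([120, 100, 90, 100, 120])  # SM template per bin
        alt_expected = np.array([140,  95, 80,  95, 140])  # shifted-coupling template
        observed     = np.array([125, 102, 88,  98, 118])  # pseudo-data

        # LLR = 2[ln L(alt) - ln L(SM)]; negative values favor the SM hypothesis
        llr = 2 * (poisson.logpmf(observed, alt_expected).sum()
                   - poisson.logpmf(observed, sm_expected).sum())
        print(f"log-likelihood ratio = {llr:.2f}")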

  13. Radiation exposure risks to nuclear well loggers.

    PubMed

    Fujimoto, K; Wilson, J A; Ashmore, J P

    1985-04-01

    This report is based on statistical data from the Canadian National Dose Registry (As82) and information obtained from visits to 1 supplier and 9 oil-well service companies in the Province of Alberta. The companies are representative of most in this industry and provide services at the well head from logging, perforating and fracturing to cementing and tracer work. The information obtained indicates that typical exposures can account for an average dose of 1 to 2 mSv/y. The observations of well-logging procedures revealed a number of potentially hazardous situations which could lead to unnecessary exposure and based upon these, several recommendations are included.

  14. The theory of maximally and minimally even sets, the one- dimensional antiferromagnetic Ising model, and the continued fraction compromise of musical scales

    NASA Astrophysics Data System (ADS)

    Douthett, Elwood (Jack) Moser, Jr.

    1999-10-01

    Cyclic configurations of white and black sites, together with convex (concave) functions used to weight path length, are investigated. The weights of the white set and black set are the sums of the weights of the paths connecting the white sites and black sites, respectively, and the weight between sets is the sum of the weights of the paths that connect sites opposite in color. It is shown that when the weights of all configurations of a fixed number of white and a fixed number of black sites are compared, minimum (maximum) weight of a white set, minimum (maximum) weight of a black set, and maximum (minimum) weight between sets occur simultaneously. Such configurations are called maximally even configurations. Similarly, the configurations whose weights are the opposite extremes occur simultaneously and are called minimally even configurations. Algorithms that generate these configurations are constructed and applied to the one-dimensional antiferromagnetic spin-1/2 Ising model. Next the goodness of continued fractions as applied to musical intervals (frequency ratios and their base 2 logarithms) is explored. It is shown that, for the intermediate convergents between two consecutive principal convergents of an irrational number, the first half of the intermediate convergents are poorer approximations than the preceding principal convergent while the second half are better approximations; the goodness of a middle intermediate convergent can only be determined by calculation. These convergents are used to determine what equal-tempered systems have intervals that most closely approximate the musical fifth (p_n/q_n ≈ log2(3/2)). The goodness of exponentiated convergents (2^(p_n/q_n) ≈ 3/2) is also investigated. It is shown that, with the exception of a middle convergent, the goodness of the exponential form agrees with that of its logarithmic counterpart. As in the case of the logarithmic form, the goodness of a middle intermediate convergent in the exponential form can only be determined by calculation. A Desirability Function is constructed that simultaneously measures how well multiple intervals fit in a given equal-tempered system. These measurements are made for octave (base 2) and tritave systems (base 3). Combinatorial properties important to music modulation are considered. These considerations lead to the construction of maximally even scales as partitions of an equal-tempered system.
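
    The continued-fraction machinery can be made concrete: the principal convergents of log2(3/2) pick out the equal-tempered systems whose fifth is best, with 7/12 (the 12-tone system) among them. A short sketch:

        import math
        from fractions import Fraction

        def convergents(x, n):
            """First n principal convergents of x via the standard recurrence."""
            h0, k0, h1, k1 = 0, 1, 1, 0
            out = []
            for _ in range(n):
                a = math.floor(x)
                h0, k0, h1, k1 = h1, k1, a * h1 + h0, a * k1 + k0
                out.append(Fraction(h1, k1))
                x = 1.0 / (x - a)
            return out

        print(convergents(math.log2(1.5), 6))  # ..., 3/5, 7/12, 24/41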

  15. Formulation Studies of InhA inhibitors and Combination therapy to improve efficacy against Mycobacterium tuberculosis

    PubMed Central

    Knudson, Susan E.; Cummings, Jason E.; Bommineni, Gopal R.; Pan, Pan; Tonge, Peter J.; Slayden, Richard A.

    2016-01-01

    Previously, structure-based drug design was used to develop substituted diphenyl ethers with potency against the Mycobacterium tuberculosis (Mtb) enoyl-ACP reductase (InhA); however, the highly lipophilic centroid compound, SB-PT004, lacked sufficient efficacy in the acute murine Mtb infection model. A next-generation series of compounds was designed with improved specificity, potency against InhA, and reduced cytotoxicity in vitro, but these compounds also had limited solubility. Accordingly, solubility and pharmacokinetic studies were performed to develop formulations for this class and other experimental drug candidates with high logP values often encountered in drug discovery. Lead diphenyl ethers were formulated in co-solvent and Self-Dispersing Lipid Formulations (SDLFs) and evaluated in a rapid murine Mtb infection model that assesses dissemination to and bacterial burden in the spleen. In vitro synergy studies were performed with the lead diphenyl ether compounds, SB-PT070 and SB-PT091, and rifampin (RIF), which demonstrated an additive effect that guided the in vivo studies. In vivo combination therapy studies with these compounds, delivered in our Self-Micro Emulsifying Drug Delivery System (SMEDDS), resulted in an additional 1.4 log10 CFU reduction in the spleens of animals co-treated with SB-PT091 and RIF and an additional 1.7 log10 reduction in the spleens of animals treated with both SB-PT070 and RIF. PMID:27865404

  16. Stan: A Probabilistic Programming Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.
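
    As a concrete illustration of this workflow, a hedged sketch using the pystan interface mentioned above (pystan 2-style API; the model and data are invented):

        import pystan  # pystan 2.x interface

        # A Stan program imperatively defines a log probability over parameters
        model_code = """
        data { int<lower=0> N; vector[N] y; }
        parameters { real mu; real<lower=0> sigma; }
        model { y ~ normal(mu, sigma); }  // increments the log density
        """

        sm = pystan.StanModel(model_code=model_code)
        fit = sm.sampling(data={"N": 5, "y": [1.2, 0.8, 1.1, 0.9, 1.0]},
                          iter=1000, chains=2)  # NUTS by default
        print(fit)  # posterior summaries for mu and sigma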

  17. Is forest management a significant source of monoterpenes into the boreal atmosphere?

    NASA Astrophysics Data System (ADS)

    Haapanala, S.; Hakola, H.; Hellén, H.; Vestenius, M.; Levula, J.; Rinne, J.

    2012-04-01

    Volatile organic compounds (VOCs) including terpenoids are emitted into the atmosphere from various natural sources. Damage to plant tissue is known to strongly increase monoterpene release. We measured the terpenoid emissions caused by timber felling, i.e. those from stumps and logging residue. The emissions from stumps were studied using enclosures, and those from the whole felling area using an ecosystem-scale micrometeorological method, disjunct eddy accumulation (DEA). The compounds analyzed were isoprene, monoterpenes and sesquiterpenes. Strong emissions of monoterpenes were measured from both the stumps and the whole felling area. The emission rate decreased rapidly within a few months after the logging. In addition to fresh logging residue, the results suggest that other strong monoterpene sources may also be present in the felling area. These could include pre-existing litter, increased microbial activity and remaining undergrowth. In order to evaluate the possible importance of monoterpenes emitted annually from cut Scots pine forests in Finland, we conducted a rough upscaling calculation. The resulting monoterpene release was approximated to be on the order of 15 kilotonnes per year, which corresponds to about one tenth of the monoterpene release from intact forests in Finland.

  18. VizieR Online Data Catalog: Solar analogs and twins rotation by Kepler (do Nascimento+, 2014)

    NASA Astrophysics Data System (ADS)

    Do Nascimento, J.-D. Jr; Garcia, R. A.; Mathur, S.; Anthony, F.; Barnes, S. A.; Meibom, S.; da Costa, J. S.; Castro, M.; Salabert, D.; Ceillier, T.

    2017-03-01

    Our sample of 75 stars consists of a seismic sample of 38 from Chaplin et al. (2014, J/ApJS/210/1), 35 additional stars selected from the Kepler Input Catalog (KIC), and 16 Cyg A and B. We selected 38 well-studied stars from the asteroseismic data with fundamental properties, including ages, estimated by Chaplin et al. (2014, J/ApJS/210/1), and with Teff and log g as close as possible to the Sun's value (5200 K < Teff < 6060 K and 3.63 < log g < 4.40). This seismic sample allows a direct comparison between gyro- and seismic-ages for a subset of eight stars. These seismic samples were observed in short cadence for one month each in survey mode. Stellar properties for these stars have been estimated using two global asteroseismic parameters and complementary photometric and spectroscopic observations as described by Chaplin et al. (2014, J/ApJS/210/1). The median final quoted uncertainties for the full Chaplin et al. (2014, J/ApJS/210/1) sample were approximately 0.020 dex in log g and 150 K in Teff. (1 data file).

  19. Asymmetry of Reinforcement and Punishment in Human Choice

    PubMed Central

    Rasmussen, Erin B; Newland, M Christopher

    2008-01-01

    The hypothesis that a penny lost is valued more highly than a penny earned was tested in human choice. Five participants clicked a computer mouse under concurrent variable-interval schedules of monetary reinforcement. In the no-punishment condition, the schedules arranged monetary gain. In the punishment conditions, a schedule of monetary loss was superimposed on one response alternative. Deviations from generalized matching using the free parameters c (sensitivity to reinforcement) and log k (bias) were compared in the no-punishment and punishment conditions. The no-punishment conditions yielded values of log k that approximated zero for all participants, indicating no bias. In the punishment condition, values of log k deviated substantially from zero, revealing a 3-fold bias toward the unpunished alternative. Moreover, the c parameters were substantially smaller in punished conditions. The values for bias and sensitivity under punishment did not change significantly when the measure of net reinforcers (gains minus losses) was applied to the analysis. These results mean that punishment reduced the sensitivity of behavior to reinforcement and biased performance toward the unpunished alternative. We concluded that a single punisher subtracted more value than a single reinforcer added, indicating an asymmetry in the law of effect. PMID:18422016
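
    For readers unfamiliar with the framework, the generalized matching relation referenced above can be written in the abstract's notation (this is the standard textbook form, not an equation quoted from the paper):

        \log \frac{B_1}{B_2} = c \, \log \frac{R_1}{R_2} + \log k

    where B_i are response rates and R_i reinforcement rates. A value of log k = 0 indicates no bias; the reported 3-fold bias toward the unpunished alternative corresponds to |log k| ≈ log 3.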

  20. Bubble-chip analysis of human origin distributions demonstrates on a genomic scale significant clustering into zones and significant association with transcription

    PubMed Central

    Mesner, Larry D.; Valsakumar, Veena; Karnani, Neerja; Dutta, Anindya; Hamlin, Joyce L.; Bekiranov, Stefan

    2011-01-01

    We have used a novel bubble-trapping procedure to construct nearly pure and comprehensive human origin libraries from early S- and log-phase HeLa cells, and from log-phase GM06990, a karyotypically normal lymphoblastoid cell line. When hybridized to ENCODE tiling arrays, these libraries illuminated 15.3%, 16.4%, and 21.8% of the genome in the ENCODE regions, respectively. Approximately half of the origin fragments cluster into zones, and their signals are generally higher than those of isolated fragments. Interestingly, initiation events are distributed about equally between genic and intergenic template sequences. While only 13.2% and 14.0% of genes within the ENCODE regions are actually transcribed in HeLa and GM06990 cells, 54.5% and 25.6% of zonal origin fragments overlap transcribed genes, most with activating chromatin marks in their promoters. Our data suggest that cell synchronization activates a significant number of inchoate origins. In addition, HeLa and GM06990 cells activate remarkably different origin populations. Finally, there is only moderate concordance between the log-phase HeLa bubble map and published maps of small nascent strands for this cell line. PMID:21173031

  1. Stan : A Probabilistic Programming Language

    DOE PAGES

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.; ...

    2017-01-01

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.
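    As an illustration of the Python route mentioned above, here is a minimal sketch assuming the PyStan 2.x interface (the model, data, and sampler settings are hypothetical, not drawn from the paper):

    ```python
    import pystan  # assumes the PyStan 2.x interface

    # A minimal Stan program: normal model with unknown mean and scale.
    model_code = """
    data {
      int<lower=0> N;
      vector[N] y;
    }
    parameters {
      real mu;
      real<lower=0> sigma;
    }
    model {
      y ~ normal(mu, sigma);
    }
    """

    sm = pystan.StanModel(model_code=model_code)
    fit = sm.sampling(data={"N": 5, "y": [1.2, 0.4, -0.3, 0.9, 1.5]},
                      iter=1000, chains=2)
    print(fit)  # posterior summaries for mu and sigma
    ```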

  2. Performance of NucliSens HIV-1 EasyQ Version 2.0 compared with six commercially available quantitative nucleic acid assays for detection of HIV-1 in China.

    PubMed

    Xu, Sihong; Song, Aijing; Nie, Jianhui; Li, Xiuhua; Wang, Youchun

    2010-10-01

    Six HIV-1 viral load assays have been widely used in China. These include the Cobas Amplicor HIV-1 Monitor Version 1.5 ('Amplicor'), Cobas AmpliPrep/Cobas TaqMan HIV-1 test Version 1.0 ('CAP/CTM'), Versant HIV-1 RNA Version 3.0 (branched DNA [bDNA]-based assay; 'Versant bDNA'), Abbott RealTime HIV-1 assay ('Abbott RealTime'), NucliSens HIV-1 QT (nucleic acid sequence-based amplification assay; 'NucliSens NASBA'), and NucliSens EasyQ HIV-1 Version 1.1 ('EasyQ V1.1'). Recently, an updated version of EasyQ V1.1, NucliSens EasyQ HIV-1 Version 2.0 ('EasyQ V2.0'), was introduced into China. It is important to evaluate the impact of HIV-1 genotypes on the updated assay compared with the other commercially available assays in China. A total of 175 plasma samples with different HIV-1 clades prevalent in China were collected from treatment-naïve patients. The viral loads of those samples were determined with the seven HIV-1 viral load assays, and the quantitative differences between them were evaluated. Overall, EasyQ V2.0 exhibited a significant correlation (R = 0.769-0.850, p ≤ 0.001) and high agreement (94.77-97.13%, using the Bland-Altman model) with the other six assays. Although no significant differences between EasyQ V2.0 and the other six assays were observed when quantifying clade B' samples, there were statistically significant differences between EasyQ V2.0 and the Amplicor, Versant bDNA, and Abbott RealTime assays when quantifying clade BC samples, and between EasyQ V2.0 and the Versant bDNA and Abbott RealTime assays when quantifying clade AE samples. For clade BC samples, the quantitative differences between EasyQ V2.0 and the Amplicor, Versant bDNA, and Abbott RealTime assays exceeded 0.5 log10 IU/mL in approximately 50% of samples and exceeded 1 log10 IU/mL in approximately 15% of samples. For clade AE samples, the quantitative differences between EasyQ V2.0 and the CAP/CTM, Versant bDNA, and Abbott RealTime assays exceeded 0.5 log10 IU/mL in approximately 50% of samples, and the differences between EasyQ V2.0 and CAP/CTM exceeded 1 log10 IU/mL in approximately 15% of samples. Genotypes may affect the quantification of HIV-1 RNA, especially in clade BC samples with respect to EasyQ V2.0 and the Amplicor, Versant bDNA, or Abbott RealTime assays, and in clade AE samples with respect to EasyQ V2.0 and the Versant bDNA or Abbott RealTime assays. It is therefore strongly suggested that, where possible, the HIV-1 viral load in infected patients be quantified at follow-up by the same version of the same assay that was used initially.
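    The agreement analysis reported above is a Bland-Altman comparison on log10 viral loads. A minimal sketch, with hypothetical paired measurements rather than the study's data, computes the bias, the 95% limits of agreement, and the fraction of pairs differing by more than 0.5 log10 IU/mL:

    ```python
    import numpy as np

    # Hypothetical paired log10 viral loads (IU/mL) from two assays.
    a = np.array([3.2, 4.1, 5.0, 2.8, 4.6, 3.9])
    b = np.array([3.4, 3.9, 5.3, 2.6, 4.9, 3.6])

    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    limits = (bias - 1.96 * sd, bias + 1.96 * sd)     # 95% limits of agreement
    frac_over_half_log = np.mean(np.abs(diff) > 0.5)  # share differing by >0.5 log10
    print(bias, limits, frac_over_half_log)
    ```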

  3. Propulsion simulation test technique for V/STOL configurations

    NASA Technical Reports Server (NTRS)

    Bailey, R. O.; Smith, S. C.; Bustie, J. B.

    1983-01-01

    Ames Research Center is developing the technology for turbine-powered jet engine simulators so that airframe/propulsion system interactions on V/STOL fighter aircraft and other highly integrated configurations can be studied. This paper describes the status of the compact multimission aircraft propulsion simulator (CMAPS) technology. Three CMAPS units have accumulated a total of 340 hr during approximately 1-1/2 yr of static and wind-tunnel testing. A wind-tunnel test of a twin-engine CMAPS-equipped close-coupled canard-wing V/STOL model configuration with nonaxisymmetric nozzles was recently completed. During this test approximately 140 total hours were logged on two CMAPS units, indicating that the rotating machinery is reliable and that the CMAPS and associated control system provide a usable test tool. However, additional development is required to correct a drive manifold O-ring problem that limits the engine-pressure-ratio (EPR) to approximately 3.5.

  4. An Analysis of Polynomial Chaos Approximations for Modeling Single-Fluid-Phase Flow in Porous Medium Systems

    PubMed Central

    Rupert, C.P.; Miller, C.T.

    2008-01-01

    We examine a variety of polynomial-chaos-motivated approximations to a stochastic form of a steady state groundwater flow model. We consider approaches for truncating the infinite dimensional problem and producing decoupled systems. We discuss conditions under which such decoupling is possible and show that to generalize the known decoupling by numerical cubature, it would be necessary to find new multivariate cubature rules. Finally, we use the acceleration of Monte Carlo to compare the quality of polynomial models obtained for all approaches and find that in general the methods considered are more efficient than Monte Carlo for the relatively small domains considered in this work. A curse of dimensionality in the series expansion of the log-normal stochastic random field used to represent hydraulic conductivity provides a significant impediment to efficient approximations for large domains for all methods considered in this work, other than the Monte Carlo method. PMID:18836519
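    The study's stochastic flow model is far richer, but the role of a log-normal hydraulic conductivity field in Monte Carlo estimation can be sketched with a toy 1-D example (hypothetical parameters; for steady flow through cells in series, the effective conductivity of each realization is the harmonic mean of the cell values):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_cells, n_draws = 50, 10_000
    mu, sigma = 0.0, 1.0  # hypothetical log-conductivity mean and std dev

    # For 1-D steady flow through cells in series, the effective conductivity
    # of each realization is the harmonic mean of the cell conductivities.
    K = rng.lognormal(mu, sigma, size=(n_draws, n_cells))
    K_eff = n_cells / (1.0 / K).sum(axis=1)
    print(K_eff.mean(), K_eff.std())  # Monte Carlo estimate and spread
    ```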

  5. Approximation methods for control of structural acoustics models with piezoceramic actuators

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Fang, W.; Silcox, R. J.; Smith, R. C.

    1993-01-01

    The active control of acoustic pressure in a 2-D cavity with a flexible boundary (a beam) is considered. Specifically, this control is implemented via piezoceramic patches on the beam, which produce pure bending moments. The incorporation of the feedback control in this manner leads to a system with an unbounded input term. Approximation methods in the context of a linear quadratic regulator (LQR) state-space control formulation are discussed, and numerical results demonstrating the effectiveness of this approach in computing feedback controls for noise reduction are presented.
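    As an illustration of the LQR formulation named above (not the paper's infinite-dimensional system), a minimal sketch using SciPy's continuous-time algebraic Riccati solver on a hypothetical two-state model:

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Hypothetical finite-dimensional approximation: x' = Ax + Bu.
    A = np.array([[0.0, 1.0],
                  [-4.0, -0.5]])
    B = np.array([[0.0],
                  [1.0]])
    Q = np.eye(2)            # state penalty
    R = np.array([[0.1]])    # control penalty

    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)  # optimal feedback gain, u = -K x
    print(K)
    ```

    In the paper's setting the state space is a Galerkin-style approximation of the coupled beam/cavity dynamics, so the matrices above stand in for much larger discretized operators.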

  6. Prenatal Lead Exposure and Fetal Growth: Smaller Infants Have Heightened Susceptibility

    PubMed Central

    Rodosthenous, Rodosthenis S.; Burris, Heather H.; Svensson, Katherine; Amarasiriwardena, Chitra J.; Cantoral, Alejandra; Schnaas, Lourdes; Mercado-García, Adriana; Coull, Brent A.; Wright, Robert O.; Téllez-Rojo, Martha M.; Baccarelli, Andrea A.

    2016-01-01

    Background As population lead levels decrease, the toxic effects of lead may be distributed to more sensitive populations, such as infants with poor fetal growth. Objectives To determine the association of prenatal lead exposure and fetal growth; and to evaluate whether infants with poor fetal growth are more susceptible to lead toxicity than those with normal fetal growth. Methods We examined the association of second trimester maternal blood lead levels (BLL) with birthweight-for-gestational age (BWGA) z-score in 944 mother-infant participants of the PROGRESS cohort. We determined the association between maternal BLL and BWGA z-score by using both linear and quantile regression. We estimated odds ratios for small-for-gestational age (SGA) infants between maternal BLL quartiles using logistic regression. Maternal age, body mass index, socioeconomic status, parity, household smoking exposure, hemoglobin levels, and infant sex were included as confounders. Results While linear regression showed a negative association between maternal BLL and BWGA z-score (β=−0.06 z-score units per log2 BLL increase; 95% CI: −0.13, 0.003; P=0.06), quantile regression revealed larger magnitudes of this association in the <30th percentiles of BWGA z-score (β range [−0.08, −0.13] z-score units per log2 BLL increase; all P values <0.05). Mothers in the highest BLL quartile had an odds ratio of 1.62 (95% CI: 0.99–2.65) for having a SGA infant compared to the lowest BLL quartile. Conclusions While both linear and quantile regression showed a negative association between prenatal lead exposure and birthweight, quantile regression revealed that smaller infants may represent a more susceptible subpopulation. PMID:27923585
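    The contrast between linear and quantile regression drawn above can be sketched as follows, using statsmodels' quantile regression on simulated data (the variable names and effect size are hypothetical stand-ins, not the PROGRESS data):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    log2_bll = rng.normal(2.0, 0.8, n)                   # hypothetical exposure
    bwga_z = -0.06 * log2_bll + rng.normal(0.0, 1.0, n)  # hypothetical outcome
    df = pd.DataFrame({"bwga_z": bwga_z, "log2_bll": log2_bll})

    for q in (0.1, 0.3, 0.5, 0.7, 0.9):
        res = smf.quantreg("bwga_z ~ log2_bll", df).fit(q=q)
        print(q, res.params["log2_bll"])  # slope at each conditional quantile
    ```

    A slope that grows in magnitude at the lower quantiles, as the study found, is exactly what a mean-only linear fit would obscure.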

  7. Prenatal lead exposure and fetal growth: Smaller infants have heightened susceptibility.

    PubMed

    Rodosthenous, Rodosthenis S; Burris, Heather H; Svensson, Katherine; Amarasiriwardena, Chitra J; Cantoral, Alejandra; Schnaas, Lourdes; Mercado-García, Adriana; Coull, Brent A; Wright, Robert O; Téllez-Rojo, Martha M; Baccarelli, Andrea A

    2017-02-01

    As population lead levels decrease, the toxic effects of lead may be distributed to more sensitive populations, such as infants with poor fetal growth. To determine the association of prenatal lead exposure and fetal growth; and to evaluate whether infants with poor fetal growth are more susceptible to lead toxicity than those with normal fetal growth. We examined the association of second trimester maternal blood lead levels (BLL) with birthweight-for-gestational age (BWGA) z-score in 944 mother-infant participants of the PROGRESS cohort. We determined the association between maternal BLL and BWGA z-score by using both linear and quantile regression. We estimated odds ratios for small-for-gestational age (SGA) infants between maternal BLL quartiles using logistic regression. Maternal age, body mass index, socioeconomic status, parity, household smoking exposure, hemoglobin levels, and infant sex were included as confounders. While linear regression showed a negative association between maternal BLL and BWGA z-score (β = -0.06 z-score units per log2 BLL increase; 95% CI: -0.13, 0.003; P = 0.06), quantile regression revealed larger magnitudes of this association in the <30th percentiles of BWGA z-score (β range [-0.08, -0.13] z-score units per log2 BLL increase; all P values < 0.05). Mothers in the highest BLL quartile had an odds ratio of 1.62 (95% CI: 0.99-2.65) for having a SGA infant compared to the lowest BLL quartile. While both linear and quantile regression showed a negative association between prenatal lead exposure and birthweight, quantile regression revealed that smaller infants may represent a more susceptible subpopulation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Resumming the large-N approximation for time evolving quantum systems

    NASA Astrophysics Data System (ADS)

    Mihaila, Bogdan; Dawson, John F.; Cooper, Fred

    2001-05-01

    In this paper we discuss two methods of resumming the leading and next to leading order in 1/N diagrams for the quartic O(N) model. These two approaches have the property that they preserve both boundedness and positivity for expectation values of operators in our numerical simulations. These approximations can be understood either in terms of a truncation to the infinitely coupled Schwinger-Dyson hierarchy of equations, or by choosing a particular two-particle irreducible vacuum energy graph in the effective action of the Cornwall-Jackiw-Tomboulis formalism. We confine our discussion to the case of quantum mechanics where the Lagrangian is L(x, ẋ) = (1/2) Σᵢ ẋᵢ² − (g/8N)[Σᵢ xᵢ² − r₀²]², with the sums running over i = 1, ..., N. The key to these approximations is to treat both the x propagator and the x² propagator on a similar footing, which leads to a theory whose graphs have the same topology as QED, with the x² propagator playing the role of the photon. The bare vertex approximation is obtained by replacing the exact vertex function by the bare one in the exact Schwinger-Dyson equations for the one and two point functions. The second approximation, which we call the dynamic Debye screening approximation, makes the further approximation of replacing the exact x² propagator by its value at leading order in the 1/N expansion. These two approximations are compared with exact numerical simulations for the quantum roll problem. The bare vertex approximation captures the physics at large and modest N better than the dynamic Debye screening approximation.

  9. ELASTIC NET FOR COX’S PROPORTIONAL HAZARDS MODEL WITH A SOLUTION PATH ALGORITHM

    PubMed Central

    Wu, Yichao

    2012-01-01

    For least squares regression, Efron et al. (2004) proposed an efficient solution path algorithm, the least angle regression (LAR). They showed that a slight modification of the LAR leads to the whole LASSO solution path. Both the LAR and LASSO solution paths are piecewise linear. Recently Wu (2011) extended the LAR to generalized linear models and the quasi-likelihood method. In this work we extend the LAR further to handle Cox’s proportional hazards model. The goal is to develop a solution path algorithm for the elastic net penalty (Zou and Hastie (2005)) in Cox’s proportional hazards model. This goal is achieved in two steps. First we extend the LAR to optimizing the log partial likelihood plus a fixed small ridge term. Then we define a path modification, which leads to the solution path of the elastic net regularized log partial likelihood. Our solution path is exact and piecewise determined by ordinary differential equation systems. PMID:23226932
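    The paper's contribution is the exact, ODE-determined solution path; as a practical point of comparison only, an elastic net penalized Cox model can be fit in Python with the lifelines package (a different, iterative implementation, not the authors' algorithm; the data below are hypothetical):

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter  # assumes lifelines' penalized Cox fitter

    # Hypothetical survival data: follow-up time, event indicator, two covariates.
    df = pd.DataFrame({
        "duration": [5.0, 8.0, 3.0, 12.0, 7.0, 9.0, 4.0, 11.0],
        "event":    [1,   0,   1,   1,    0,   1,   1,   0],
        "x1":       [0.2, 1.1, -0.5, 0.7, 0.0, 1.3, -0.9, 0.4],
        "x2":       [1.0, 0.3, 0.8, -0.2, 0.5, 0.1, 0.9, -0.6],
    })

    # penalizer sets the overall strength; l1_ratio mixes lasso (1) and ridge (0).
    cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.5)
    cph.fit(df, duration_col="duration", event_col="event")
    cph.print_summary()
    ```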

  10. An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio

    DOE PAGES

    Polcari, J.

    2013-08-16

    The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
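    For the textbook case of a known signal in white Gaussian noise, the LLR reduces to a matched filter whose output SNR plays the GSNR role. A minimal sketch with hypothetical values (the signal, noise variance, and observation are illustrative only):

    ```python
    import numpy as np
    from scipy.stats import norm

    # Known signal s in white Gaussian noise of variance sigma2 (hypothetical).
    s = np.array([1.0, -0.5, 0.8])
    sigma2 = 1.0
    y = np.array([1.2, -0.1, 0.6])  # one observation exemplar

    # LLR between "signal present" and "noise only" hypotheses.
    llr = (norm.logpdf(y, loc=s, scale=np.sqrt(sigma2)).sum()
           - norm.logpdf(y, loc=0.0, scale=np.sqrt(sigma2)).sum())

    # For this model the LLR reduces to the matched filter
    # (s @ y - 0.5 * s @ s) / sigma2, whose output SNR is (s @ s) / sigma2.
    print(llr, (s @ y - 0.5 * s @ s) / sigma2)
    ```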

  11. Competency-based residency training and the web log: modeling practice-based learning and enhancing medical knowledge.

    PubMed

    Hollon, Matthew F

    2015-01-01

    By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. The frequency that residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correct when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents.

  12. Digital signal processing and interpretation of full waveform sonic log for well BP-3-USGS, Great Sand Dunes National Park and Preserve, Alamosa County, Colorado

    USGS Publications Warehouse

    Burke, Lauri

    2011-01-01

    Along the Great Sand Dunes National Park and Preserve boundary (fig. 1), 10 monitoring wells were drilled by the National Park Service in order to monitor water flow in an unconfined aquifer spanning the park boundary. Adjacent to the National Park Service monitoring well named Boundary Piezometer Well No. 3, or BP-3, the U.S. Geological Survey (USGS) drilled the BP-3-USGS well. This well was drilled from September 14 through 17, 2009, to a total depth of 99.4 meters (m) in order to acquire additional subsurface information. The BP-3-USGS well is located at lat 37°43'18.06", long -105°43'39.30", at a surface elevation of 2,301 m. Approximately 23 m of core was recovered beginning at a depth of 18 m. Drill cuttings were also recovered. The wireline geophysical logs acquired in the well include natural gamma ray, borehole caliper, temperature, full waveform sonic, density, neutron, resistivity, and induction logs. The BP-3-USGS well is now plugged and abandoned. This report details the full waveform digital signal processing methodology and the formation compressional-wave velocities determined for the BP-3-USGS well. These velocity results are compared to several velocities that are commonly encountered in the subsurface. The density log is also discussed in context of these formation velocities.

  13. Correlating Petrophysical Well Logs Using Fractal-based Analysis to Identify Changes in the Signal Complexity Across Neutron, Density, Dipole Sonic, and Gamma Ray Tool Types

    NASA Astrophysics Data System (ADS)

    Matthews, L.; Gurrola, H.

    2015-12-01

    Typical petrophysical well log correlation is accomplished by manual pattern recognition, leading to subjective correlations. The change in character in a well log is dependent upon the change in the response of the tool to lithology. The petrophysical interpreter looks for a change in one log type that would correspond to the way a different tool responds to the same lithology. To develop an objective way to pick changes in well log characteristics, we adapt a method of first-arrival picking used on seismic data to analyze changes in the character of well logs. We chose the fractal method developed by Boschetti et al. [1] (1996). This method worked better than we expected, and we found similar changes in the fractal dimension across very different tool types (sonic vs. density vs. gamma ray). We reason that the fractal response of the log does not depend on the physics of the tool response but rather on the change in the complexity of the log data. When a formation changes physical character in time or space, the recorded magnitude in the tool data changes complexity at the same time, even if the original tool responses are very different. The relative complexity of the data, regardless of the tool used, is dependent upon the complexity of the medium relative to the tool measurement. The relative complexity of the recorded magnitude data changes as a tool transitions from one character type to another. The character we are measuring is the roughness, or complexity, of the petrophysical curve. Our method provides a way to directly compare different log types based on a quantitative change in signal complexity. For example, changes in data complexity allow us to correlate gamma ray suites with sonic logs within a well and then across to an adjacent well with similar signatures. Our method allows reliable, automatic correlations to be made in data sets beyond the reasonable cognitive limits of geoscientists, in both speed and consistency of pattern recognition. [1] Fabio Boschetti, Mike D. Dentith, and Ron D. List (1996). A fractal-based algorithm for detecting first arrivals on seismic traces. Geophysics, Vol. 61, No. 4, p. 1095-1102.
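    Boschetti et al.'s algorithm itself is not reproduced here, but the underlying idea of tracking signal complexity through a fractal dimension can be sketched with Higuchi's (1988) estimator, a standard choice for 1-D series (the window parameter kmax and the test signal are illustrative):

    ```python
    import numpy as np

    def higuchi_fd(x, kmax=8):
        """Fractal dimension of a 1-D series via Higuchi's (1988) method."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        ks = np.arange(1, kmax + 1)
        curve_len = []
        for k in ks:
            lengths = []
            for m in range(k):
                idx = np.arange(m, n, k)
                # Normalized length of the sub-sampled curve at offset m.
                lm = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k**2)
                lengths.append(lm)
            curve_len.append(np.mean(lengths))
        # FD is the slope of log L(k) against log(1/k).
        fd, _ = np.polyfit(np.log(1.0 / ks), np.log(curve_len), 1)
        return fd

    rng = np.random.default_rng(2)
    print(higuchi_fd(rng.normal(size=2000)))  # white noise: FD near 2
    ```

    Sliding such an estimator along each log and looking for coincident shifts in FD across tool types is one way to realize the tool-independent correlation idea described above.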

  14. The challenges of developing a contrast-based video game for treatment of amblyopia

    PubMed Central

    Hussain, Zahra; Astle, Andrew T.; Webb, Ben S.; McGraw, Paul V.

    2014-01-01

    Perceptual learning of visual tasks is emerging as a promising treatment for amblyopia, a developmental disorder of vision characterized by poor monocular visual acuity. The tasks tested thus far span the gamut from basic psychophysical discriminations to visually complex video games. One end of the spectrum offers precise control over stimulus parameters, whilst the other delivers the benefits of motivation and reward that sustain practice over long periods. Here, we combined the advantages of both approaches by developing a video game that trains contrast sensitivity, which in psychophysical experiments, is associated with significant improvements in visual acuity in amblyopia. Target contrast was varied adaptively in the game to derive a contrast threshold for each session. We tested the game on 20 amblyopic subjects (10 children and 10 adults), who played at home using their amblyopic eye for an average of 37 sessions (approximately 11 h). Contrast thresholds from the game improved reliably for adults but not for children. However, logMAR acuity improved for both groups (mean = 1.3 lines; range = 0–3.6 lines). We present the rationale leading to the development of the game and describe the challenges of incorporating psychophysical methods into game-like settings. PMID:25404922

  15. The challenges of developing a contrast-based video game for treatment of amblyopia.

    PubMed

    Hussain, Zahra; Astle, Andrew T; Webb, Ben S; McGraw, Paul V

    2014-01-01

    Perceptual learning of visual tasks is emerging as a promising treatment for amblyopia, a developmental disorder of vision characterized by poor monocular visual acuity. The tasks tested thus far span the gamut from basic psychophysical discriminations to visually complex video games. One end of the spectrum offers precise control over stimulus parameters, whilst the other delivers the benefits of motivation and reward that sustain practice over long periods. Here, we combined the advantages of both approaches by developing a video game that trains contrast sensitivity, which in psychophysical experiments, is associated with significant improvements in visual acuity in amblyopia. Target contrast was varied adaptively in the game to derive a contrast threshold for each session. We tested the game on 20 amblyopic subjects (10 children and 10 adults), who played at home using their amblyopic eye for an average of 37 sessions (approximately 11 h). Contrast thresholds from the game improved reliably for adults but not for children. However, logMAR acuity improved for both groups (mean = 1.3 lines; range = 0-3.6 lines). We present the rationale leading to the development of the game and describe the challenges of incorporating psychophysical methods into game-like settings.

  16. Facing Stormwater Management Challenges at a Southeastern Army Installation: US Army Garrison Fort Gordon

    DTIC Science & Technology

    2012-05-24

    Medical Center • Host to Army, Navy, Air Force, Marines and multinational forces • Supporting 17,950 military and 6,710 civilians • Fort Gordon...comply with the NPDES (National Pollutant Discharge Elimination System) Phase II stormwater permit and reduce pollutants found in stormwater...Lane Avenue: • Approximately 50 ft drop • Stormwater management (constructed riffle) • Streambank stabilization (geolift, log vane)

  17. Technologically Enhanced Naturally Occurring Radioactive Materials (TENORM) in the Oil and Gas Industry: A Review.

    PubMed

    Doyi, Israel; Essumang, David Kofi; Dampare, Samuel; Glover, Eric Tetteh

    Radiation is part of the natural environment: it is estimated that approximately 80 % of all human exposure comes from naturally occurring or background radiation. Certain extractive industries such as mining and oil logging have the potential to increase the risk of radiation exposure to the environment and humans by concentrating the quantities of naturally occurring radiation beyond normal background levels (Azeri-Chirag-Gunashli 2004).

  18. Power laws in microrheology experiments on living cells: Comparative analysis and modeling.

    PubMed

    Balland, Martial; Desprat, Nicolas; Icard, Delphine; Féréol, Sophie; Asnacios, Atef; Browaeys, Julien; Hénon, Sylvie; Gallet, François

    2006-08-01

    We compare and synthesize the results of two microrheological experiments on the cytoskeleton of single cells. In the first one, the creep function J(t) of a cell stretched between two glass plates is measured after applying a constant force step. In the second one, a microbead specifically bound to transmembrane receptors is driven by an oscillating optical trap, and the viscoelastic coefficient Ge(ω) is retrieved. Both J(t) and Ge(ω) exhibit power law behaviors: J(t) = A₀(t/t₀)^α and |Ge(ω)| = G₀(ω/ω₀)^α, with the same exponent α ≈ 0.2. This power law behavior is very robust; α is distributed over a narrow range, and shows almost no dependence on the cell type, on the nature of the protein complex which transmits the mechanical stress, nor on the typical length scale of the experiment. On the contrary, the prefactors A₀ and G₀ appear very sensitive to these parameters. Whereas the exponents α are normally distributed over the cell population, the prefactors A₀ and G₀ follow a log-normal repartition. These results are compared with other data published in the literature. We propose a global interpretation, based on a semiphenomenological model, which involves a broad distribution of relaxation times in the system. The model predicts the power law behavior and the statistical repartition of the mechanical parameters, as experimentally observed for the cells. Moreover, it leads to an estimate of the largest response time in the cytoskeletal network: τ_m ≈ 1000 s.
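    The exponent α and prefactor A₀ reported above follow from a straight-line fit in log-log coordinates. A minimal sketch on synthetic creep data (the values of A₀, α, and the noise level are hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    t = np.logspace(-1, 2, 30)  # times in seconds
    # Hypothetical creep data J(t) = A0 * (t/t0)^alpha with 5% log-noise, t0 = 1.
    J = 2.0 * (t / 1.0) ** 0.2 * np.exp(rng.normal(0.0, 0.05, t.size))

    alpha, log_A0 = np.polyfit(np.log(t), np.log(J), 1)
    print(alpha, np.exp(log_A0))  # recovered exponent and prefactor
    ```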

  19. Simulation of variation of apparent resistivity in resistivity surveys using finite difference modelling with Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Aguirre, E. E.; Karchewski, B.

    2017-12-01

    DC resistivity surveying is a geophysical method that quantifies the electrical properties of the subsurface of the earth by applying a source current between two electrodes and measuring potential differences between electrodes at known distances from the source. Analytical solutions for a homogeneous half-space and simple subsurface models are well known; the former is used to define the concept of apparent resistivity. However, in situ properties are heterogeneous, meaning that simple analytical models are only an approximation, and ignoring such heterogeneity can lead to misinterpretation of survey results, costing time and money. The present study examines the extent to which random variations in electrical properties (i.e., electrical conductivity) affect potential difference readings, and therefore apparent resistivities, relative to an assumed homogeneous subsurface model. We simulate the DC resistivity survey using a finite difference (FD) approximation of an appropriate simplification of Maxwell's equations, implemented in Matlab. Electrical resistivity values at each node in the simulation were defined as random variables with a given mean and variance, assumed to follow a log-normal distribution. The Monte Carlo analysis for a given variance of electrical resistivity was performed until the mean and variance of the potential difference measured at the surface converged. Finally, we used the simulation results to examine the relationship between the variance in resistivity and the variation in surface potential difference (or apparent resistivity) relative to a homogeneous half-space model. For relatively low values of standard deviation in the material properties (<10% of the mean), we observed a linear correlation between the variance of resistivity and the variance in apparent resistivity.
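    A key step in such a simulation is drawing nodal resistivities from a log-normal distribution with a prescribed mean and variance, which requires converting those moments to the parameters of the underlying normal. A minimal sketch (hypothetical field values, independent nodes; the study's FD solver is not reproduced):

    ```python
    import numpy as np

    def lognormal_params(mean, var):
        """Underlying normal (mu, sigma) for a log-normal with given mean and variance."""
        sigma2 = np.log(1.0 + var / mean**2)
        return np.log(mean) - 0.5 * sigma2, np.sqrt(sigma2)

    rng = np.random.default_rng(4)
    mean_rho = 100.0          # hypothetical mean resistivity (ohm-m)
    std_rho = 0.1 * mean_rho  # standard deviation at 10% of the mean
    mu, sigma = lognormal_params(mean_rho, std_rho**2)
    rho = rng.lognormal(mu, sigma, size=(64, 64))  # one nodal resistivity realization
    print(rho.mean(), rho.std())  # should sit near the prescribed moments
    ```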

  20. Vanadium Partitioning and Mantle Oxidation State: New Experimental Data

    NASA Astrophysics Data System (ADS)

    Mallmann, G.; O'Neill, H. S.

    2007-12-01

    Vanadium exists in multiple valences in natural basaltic melts, namely V2+, V3+, V4+ and V5+. Because most crystalline phases prefer to incorporate V3+ rather than V4+ and V5+, the crystal/silicate-melt partitioning of vanadium (DVcry/melt) tends to decrease with increasing oxygen fugacity (fO2). Such dependence has been experimentally demonstrated and used to estimate the fO2 of mantle and mantle-derived rocks. Recent modelling of V and V/Sc systematics in basalts has led to the view that the relative fO2 of the upper mantle is constant, both through time and among the sources of different types of basaltic magmas (i.e. MORB, OIB and IAB). This is contrary to the notion given by other oxygen barometric methods on peridotites and basalts, which indicate an upper mantle heterogeneous in relative fO2. To explore further the potential of V abundances and V/Sc ratios to estimate the relative fO2 of mantle peridotites and basalts, and in particular to understand variations in mantle oxidation state better, we carried out an experimental campaign aimed at measuring DVcry/melt for all the relevant phases of the upper mantle (i.e. olivine, orthopyroxene, clinopyroxene, garnet and spinel) over a range of fO2 conditions large enough to pin down not only the behaviour of V3+ and V4+ but also V2+ and V5+. Experiments were done in 1-atm vertical tube furnaces (1300°C) and piston-cylinder apparatus (1275-1450°C and 1.5-3.2 GPa). For the high-pressure experiments, fO2 was controlled by the Re-ReOx/2 equilibrium (10^-9 to 10^-0.7 bar), whereas for the 1-atm experiments, fO2 was controlled by Ar-CO-CO2-O2 gas mixes (10^-18 to 10^-0.7 bar). Five starting compositions were used to ensure the presence of all the desired phases. Experimental products were analysed for major elements by electron microprobe and for trace elements by laser-ablation ICP-MS, which enables V to be measured precisely even at very low concentrations. Partition coefficients for all phases plot as approximately sigmoid-shaped curves in log D-log fO2 space. Details of the shape of the curve are controlled by the relative preference of each crystalline phase for a specific valence of V. For instance, orthopyroxene appears to particularly favour V4+, so that the log DVopx/melt-log fO2 and log DVcpx/melt-log fO2 curves converge in the region of the diagram dominated by V4+, diverging in the regions dominated by V3+ and V5+. Contrary to previous studies, our results do not suggest a systematic increase in DVcpx/opx with decreasing fO2. Olivine and spinel, on the other hand, strongly prefer V3+ relative to V4+ and V5+, and hence for olivine and spinel the difference in partition coefficients between reducing and oxidizing conditions is more pronounced than that for the pyroxenes. At high pressure, DVgrt/melt and DVcpx/melt are very similar to each other, but the values of DVcpx/melt are about one order of magnitude higher than those obtained at 1 atm at comparable fO2. The cause of this discrepancy is being investigated.

  1. Cumulative effects of forest management activities: how might they occur?

    Treesearch

    R. M. Rice; R. B. Thomas

    1985-01-01

    Concerns are often voiced about possible environmental damage as the result of the cumulative sedimentation effects of logging and forest road construction. In response to these concerns, National Forests are developing procedures to reduce the possibility that their activities may lead to unacceptable cumulative effects.

  2. Parametric vs. non-parametric statistics of low resolution electromagnetic tomography (LORETA).

    PubMed

    Thatcher, R W; North, D; Biver, C

    2005-01-01

    This study compared the relative statistical sensitivity of non-parametric and parametric statistics of 3-dimensional current sources as estimated by the EEG inverse solution Low Resolution Electromagnetic Tomography (LORETA). One would expect approximately 5% false positives (classification of a normal as abnormal) at the P < .025 level of probability (two-tailed test) and approximately 1% false positives at the P < .005 level. EEG digital samples (2-second intervals sampled at 128 Hz, 1 to 2 minutes, eyes closed) from 43 normal adult subjects were imported into the Key Institute's LORETA program. We then used the Key Institute's cross-spectrum and the Key Institute's LORETA output files (*.lor) as the 2,394 gray matter pixel representation of 3-dimensional currents at different frequencies. The mean and standard deviation *.lor files were computed for each of the 2,394 gray matter pixels for each of the 43 subjects. Tests of Gaussianity and different transforms were computed in order to best approximate a normal distribution for each frequency and gray matter pixel. The relative sensitivity of parametric vs. non-parametric statistics was compared using a "leave-one-out" cross-validation method in which individual normal subjects were withdrawn and then statistically classified as being either normal or abnormal based on the remaining subjects. Log10 transforms approximated a Gaussian distribution with 95% to 99% accuracy. Parametric Z score tests at P < .05 cross-validation demonstrated an average misclassification rate of approximately 4.25%, and the range over the 2,394 gray matter pixels was 27.66% to 0.11%. At P < .01, parametric Z score cross-validation false positives averaged 0.26% and ranged from 6.65% to 0%. The non-parametric Key Institute's t-max statistic at P < .05 had an average misclassification error rate of 7.64% and ranged from 43.37% to 0.04% false positives. The non-parametric t-max at P < .01 had an average misclassification rate of 6.67% and ranged from 41.34% to 0% false positives over the 2,394 gray matter pixels for any cross-validated normal subject. In conclusion, adequate approximation to a Gaussian distribution and high cross-validation accuracy can be achieved with the Key Institute's LORETA programs by using a log10 transform and parametric statistics, and parametric normative comparisons had lower false positive rates than the non-parametric tests.
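    The parametric leave-one-out procedure described above can be sketched as follows (synthetic Gaussian data stand in for the log10-transformed LORETA pixel values; with truly Gaussian input the flagged fraction should sit near the nominal 5%):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=(43, 2394))  # hypothetical log10-transformed pixel values

    z_crit = 1.96  # two-tailed P < .05
    rates = []
    for i in range(X.shape[0]):
        rest = np.delete(X, i, axis=0)  # leave one subject out
        z = (X[i] - rest.mean(axis=0)) / rest.std(axis=0, ddof=1)
        rates.append(np.mean(np.abs(z) > z_crit))  # pixels flagged abnormal
    print(np.mean(rates))  # average false-positive rate, expected near 5%
    ```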

  3. PAH bioconcentration in Mytilus sp from Sinclair Inlet, WA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frazier, J.; Young, D.; Ozretich, R.

    1995-12-31

    Approximately 20 polynuclear aromatic hydrocarbons (PAH) were measured by GC/MS in seawater and whole soft tissues of the intertidal mussel Mytilus sp. collected in July 1991 within and around Puget Sound's Sinclair Inlet. Low variability was observed in the water concentrations collected over three days at control sites, yielding reliable values for the exposure levels experienced by this bioindicator mollusk. Mean water concentrations of acenaphthene, phenanthrene, and fluoranthene in the control region were 2.7 ± 0.8, 2.8 ± 0.8, and 3.1 ± 0.7 ng/liter, respectively. Levels measured near sites of vessel activity were higher but much more variable; this reduced the reliability of the tissue/water bioconcentration factors (BCF) obtained from these samples. An empirical model relating values of Log BCF and Log Kow for the control zone samples supports the utility of this estuarine bioindicator for monitoring general levels of PAH in nearshore surface waters.

  4. Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods

    PubMed Central

    Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.

    2017-01-01

    The Log-Gaussian Cox process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
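    The doubly stochastic structure described above is easy to exhibit by simulation: draw a Gaussian field, exponentiate it to get an intensity, then draw Poisson counts. A minimal sketch on a coarse grid (the covariance parameters and mean are hypothetical; the hard inference problem the paper addresses is not attempted here):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 20  # coarse grid so the exact Gaussian draw stays cheap
    xs = np.linspace(0.0, 1.0, n)
    X, Y = np.meshgrid(xs, xs)
    pts = np.column_stack([X.ravel(), Y.ravel()])

    # Squared-exponential covariance for the latent Gaussian field.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    C = 1.0 * np.exp(-(d / 0.2) ** 2) + 1e-8 * np.eye(n * n)
    g = rng.multivariate_normal(np.zeros(n * n), C)

    lam = np.exp(3.0 + g)            # doubly stochastic intensity surface
    counts = rng.poisson(lam / n**2)  # Poisson counts per grid cell
    print(counts.sum())
    ```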

  5. The existence and abundance of ghost ancestors in biparental populations.

    PubMed

    Gravel, Simon; Steel, Mike

    2015-05-01

    In a randomly mating biparental population of size N there are, with high probability, individuals who are genealogical ancestors of every extant individual within approximately log2(N) generations into the past. We use this result of J. Chang to prove a curious corollary under standard models of recombination: there exist, with high probability, individuals within a constant multiple of log2(N) generations into the past who are simultaneously (i) genealogical ancestors of each of the individuals at the present, and (ii) genetic ancestors to none of the individuals at the present. Such ancestral individuals, ancestors of everyone today that left no genetic trace, represent 'ghost' ancestors in a strong sense. In this short note, we use a simple analytical argument and simulations to estimate how many such individuals exist in finite Wright-Fisher populations. Copyright © 2015 Elsevier Inc. All rights reserved.
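    Chang's genealogical result (the first claim above) can be checked directly by simulation; the sketch below tracks each past individual's set of present-day descendants as a bitmask and reports the generation at which some ancestor covers everyone. It covers only the genealogical half, not the genetic (recombination) part, and draws each child's two parents with replacement for simplicity:

    ```python
    import numpy as np

    def generations_to_universal_ancestor(N, rng):
        """Generations back until some individual is a genealogical ancestor of all."""
        full = (1 << N) - 1
        desc = [1 << i for i in range(N)]  # each present individual descends from itself
        t = 0
        while True:
            t += 1
            parents = [0] * N
            for child in range(N):
                # Two parents drawn uniformly at random (with replacement).
                p1, p2 = rng.integers(0, N, size=2)
                parents[p1] |= desc[child]
                parents[p2] |= desc[child]
            desc = parents
            if any(d == full for d in desc):
                return t

    rng = np.random.default_rng(7)
    N = 256
    print(generations_to_universal_ancestor(N, rng), np.log2(N))  # typically close
    ```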

  6. The missing impact craters on Venus

    NASA Technical Reports Server (NTRS)

    Speidel, D. H.

    1993-01-01

    The size-frequency pattern of the 842 impact craters on Venus measured to date can be well described (across four standard deviation units) as a single log normal distribution with a mean crater diameter of 14.5 km. This result was predicted in 1991 on examination of the initial Magellan analysis. If this observed distribution is close to the real distribution, the 'missing' 90 percent of the small craters and the 'anomalous' lack of surface splotches may thus be neither missing nor anomalous. I think that the missing craters and missing splotches can be satisfactorily explained by accepting that the observed distribution approximates the real one, that it is not craters that are missing but the impactors. What you see is what you got. The implication that Venus crossing impactors would have the same type of log normal distribution is consistent with recently described distribution for terrestrial craters and Earth crossing asteroids.

  7. Renyi entropy for local quenches in 2D CFT from numerical conformal blocks

    NASA Astrophysics Data System (ADS)

    Kusuki, Yuya; Takayanagi, Tadashi

    2018-01-01

    We study the time evolution of Renyi entanglement entropy for locally excited states in two-dimensional large central charge CFTs. It generically shows logarithmic growth, and we compute the coefficient of the log t term. Our analysis covers the entire parameter region with respect to the replica number n and the conformal dimension h_O of the primary operator which creates the excitation. We numerically analyse the relevant vacuum conformal blocks by using Zamolodchikov's recursion relation. We find that the behavior of the conformal blocks in two-dimensional CFTs with a central charge c drastically changes when the dimensions of external primary states reach the value c/32. In particular, when h_O ≥ c/32 and n ≥ 2, we find a new universal formula ΔS_A^(n) ≃ (nc/(24(n−1))) log t. Our numerical results also confirm existing analytical results using the HHLL approximation.

  8. Disinfection of contaminated water by using solar irradiation.

    PubMed

    Caslake, Laurie F; Connolly, Daniel J; Menon, Vilas; Duncanson, Catriona M; Rojas, Ricardo; Tavakoli, Javad

    2004-02-01

    Contaminated water causes an estimated 6 to 60 billion cases of gastrointestinal illness annually. The majority of these cases occur in rural areas of developing nations where the water supply remains polluted and adequate sanitation is unavailable. A portable, low-cost, and low-maintenance solar unit to disinfect unpotable water has been designed and tested. The solar disinfection unit was tested with both river water and partially processed water from two wastewater treatment plants. In less than 30 min in midday sunlight, the unit eradicated more than 4 log10 U (99.99%) of bacteria contained in highly contaminated water samples. The solar disinfection unit has been field tested by Centro Panamericano de Ingenieria Sanitaria y Ciencias del Ambiente in Lima, Peru. At moderate light intensity, the solar disinfection unit was capable of reducing the bacterial load in a controlled contaminated water sample by 4 log10 U and disinfected approximately 1 liter of water in 30 min.

  9. Effects of solvent composition in the normal-phase liquid chromatography of alkylphenols and naphthols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurtubise, R.J.; Hussain, A.; Silver, H.F.

    1981-11-01

    The normal-phase liquid chromatographic models of Scott, Snyder, and Soczewinski were considered for a μ-Bondapak NH₂ stationary phase. n-Heptane:2-propanol and n-heptane:ethyl acetate mobile phases of different compositions were used. Linear relationships were obtained from graphs of log K' vs. log mole fraction of the strong solvent for both n-heptane:2-propanol and n-heptane:ethyl acetate mobile phases. A linear relationship was obtained between the reciprocal of corrected retention volume and % wt/v of 2-propanol, but not between the reciprocal of corrected retention volume and % wt/v of ethyl acetate. The slopes and intercept terms from the Snyder and Soczewinski models were found to approximately describe interactions with μ-Bondapak NH₂. Capacity factors can be predicted for the compounds by using the equations obtained from mobile phase composition variation experiments.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dechant, Lawrence J.

    We examine the role of periodic sinusoidal free-stream disturbances on the inner law-of-the-wall (log-law) for turbulent boundary layers. This model serves as a surrogate for the interaction of flight vehicles with atmospheric disturbances. The approximate skin friction expression that is derived suggests that free-stream disturbances can enhance the mean skin friction. Considering the influence of grid-generated free-stream turbulence in the laminar sublayer/log-law region (small scale/high frequency), the model recovers the well-known shear layer enhancement, suggesting an overall validity for the approach. The effect on the wall shear associated with the lower frequencies due to the passage of the vehicle through large (vehicle-scale) atmospheric disturbances is likely small, i.e., on the order of a 1% increase for turbulence intensities on the order of 2%. The increase in wall pressure fluctuation, which is directly proportional to the wall shear stress, is correspondingly small.

  11. Low-dimensional Representation of Error Covariance

    NASA Technical Reports Server (NTRS)

    Tippett, Michael K.; Cohn, Stephen E.; Todling, Ricardo; Marchesin, Dan

    2000-01-01

    Ensemble and reduced-rank approaches to prediction and assimilation rely on low-dimensional approximations of the estimation error covariances. Here stability properties of the forecast/analysis cycle for linear, time-independent systems are used to identify factors that cause the steady-state analysis error covariance to admit a low-dimensional representation. A useful measure of forecast/analysis cycle stability is the bound matrix, a function of the dynamics, observation operator and assimilation method. Upper and lower estimates for the steady-state analysis error covariance matrix eigenvalues are derived from the bound matrix. The estimates generalize to time-dependent systems. If much of the steady-state analysis error variance is due to a few dominant modes, the leading eigenvectors of the bound matrix approximate those of the steady-state analysis error covariance matrix. The analytical results are illustrated in two numerical examples where the Kalman filter is carried to steady state. The first example uses the dynamics of a generalized advection equation exhibiting nonmodal transient growth. Failure to observe growing modes leads to increased steady-state analysis error variances. Leading eigenvectors of the steady-state analysis error covariance matrix are well approximated by leading eigenvectors of the bound matrix. The second example uses the dynamics of a damped baroclinic wave model. The leading eigenvectors of a lowest-order approximation of the bound matrix are shown to approximate well the leading eigenvectors of the steady-state analysis error covariance matrix.
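    For a small hypothetical linear, time-independent system, the steady-state analysis error covariance and its leading eigenvectors can be computed directly, using SciPy's discrete algebraic Riccati solver for the Kalman filter fixed point (a sketch, not the paper's advection or baroclinic wave models):

    ```python
    import numpy as np
    from scipy.linalg import solve_discrete_are

    # Hypothetical linear dynamics and observation operator.
    A = np.array([[0.9, 0.2],
                  [0.0, 0.95]])
    H = np.array([[1.0, 0.0]])
    Q = 0.01 * np.eye(2)   # model error covariance
    R = np.array([[0.1]])  # observation error covariance

    # Steady-state forecast error covariance solves the discrete Riccati equation.
    Pf = solve_discrete_are(A.T, H.T, Q, R)
    S = H @ Pf @ H.T + R
    Pa = Pf - Pf @ H.T @ np.linalg.solve(S, H @ Pf)  # analysis error covariance

    w, V = np.linalg.eigh(Pa)
    print(w[::-1])     # eigenvalues, largest first
    print(V[:, ::-1])  # corresponding (leading) eigenvectors
    ```

    If a few eigenvalues dominate, the leading eigenvectors give the low-dimensional representation discussed above.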

  12. Assessment of the body burden of chelatable lead: a model and its application to lead workers.

    PubMed Central

    Araki, S; Ushio, K

    1982-01-01

    A hypothetical model was introduced to estimate the body burden of chelatable lead from the mobilisation yield of lead by calcium disodium ethylenediamine tetra-acetate (CaEDTA). It was estimated that, on average, 14 and 19% of the body burden was mobilised into the urine during the 24 hours after an injection of 53.4 μmol (20 mg) and 107 μmol (40 mg) CaEDTA per kg bodyweight, respectively. The body burden of chelatable lead ranged from 4 μmol (0.8 mg) to 120 μmol (24.9 mg) (mean 37 μmol (7.7 mg)) in lead workers with blood lead concentrations of 0.3-2.9 μmol/kg (6-60 μg/100 g) (mean 1.4 μmol/kg (29 μg/100 g)). There were linear relationships between blood lead concentrations and the body burden of chelatable lead on a log scale. PMID:6802167

  13. A 300 year history of lead contamination in northern French Alps reconstructed from distant lake sediment records.

    PubMed

    Arnaud, F; Revel-Rolland, M; Bosch, D; Winiarski, T; Desmet, M; Tribovillard, N; Givelet, N

    2004-05-01

    Lead concentrations and isotopic ratios were measured along two well-dated sediment cores from two distant lakes: Anterne (2100 m a.s.l.) and Le Bourget (270 m a.s.l.), subject to low and high direct human impact and covering the last 250 and 600 years, respectively. The measurement of lead in old sediment samples (>3000 BP) permits, using mixing models, the determination of the lead concentration, flux, and isotopic composition of purely anthropogenic origin. We thus show that since ca. 1800 AD the regional increase in lead contamination was mostly driven by coal consumption (²⁰⁶Pb/²⁰⁷Pb ≈ 1.17-1.19; ²⁰⁶Pb/²⁰⁴Pb ≈ 18.3-18.6), which peaked around 1915 AD. The increasing usage of leaded gasoline, introduced in the 1920s, was recorded in both lakes by increasing Pb concentrations and decreasing Pb isotope ratios. A peak around 1970 (²⁰⁶Pb/²⁰⁷Pb ≈ 1.13-1.16; ²⁰⁶Pb/²⁰⁴Pb ≈ 17.6-18.0) corresponds to the worldwide recorded maximum of leaded gasoline consumption. The 1973 oil crisis is characterised by a drastic drop of lead fluxes in both lakes (from approximately 35 to <20 mg cm⁻² yr⁻¹). In the late 1980s, environmental policies made the Lake Anterne flux drop to pre-1900 values (<10 mg cm⁻² yr⁻¹), while Lake Le Bourget remains subject to a substantial flux (approximately 25 mg cm⁻² yr⁻¹). The good match of our distant records, together with a previously established series from an ice core from Mont Blanc, provides confidence in the use of sediments as archives of lead contamination. The integration of the Mont Blanc ice core results from Rosman et al. with our data highlights, from 1990 onward, a decoupling in lead sources between the high elevation sites (Lake Anterne and the Mont Blanc ice core), subject to a mixture of long-distance and regional contamination, and the low elevation site (Lake Le Bourget), where regional contamination is predominant.

  14. Gravity, aeromagnetic and rock-property data of the central California Coast Ranges

    USGS Publications Warehouse

    Langenheim, V.E.

    2014-01-01

    Gravity, aeromagnetic, and rock-property data were collected to support geologic-mapping, water-resource, and seismic-hazard studies for the central California Coast Ranges. These data are combined with existing data to provide gravity, aeromagnetic, and physical-property datasets for this region. The gravity dataset consists of approximately 18,000 measurements. The aeromagnetic dataset consists of total-field anomaly values from several detailed surveys that have been merged and gridded at an interval of 200 m. The physical property dataset consists of approximately 800 density measurements and 1,100 magnetic-susceptibility measurements from rock samples, in addition to previously published borehole gravity surveys from Santa Maria Basin, density logs from Salinas Valley, and intensities of natural remanent magnetization.

  15. Pairing in a dry Fermi sea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maier, Thomas A.; Staar, Peter; Mishra, V.

    In the traditional Bardeen–Cooper–Schrieffer theory of superconductivity, the amplitude for the propagation of a pair of electrons with momentum k and -k has a log singularity as the temperature decreases. This so-called Cooper instability arises from the presence of an electron Fermi sea. It means that an attractive interaction, no matter how weak, will eventually lead to a pairing instability. However, in the pseudogap regime of the cuprate superconductors, where parts of the Fermi surface are destroyed, this log singularity is suppressed, raising the question of how pairing occurs in the absence of a Fermi sea. In this paper, we report Hubbard model numerical results and the analysis of angular-resolved photoemission experiments on a cuprate superconductor. Finally, in contrast to the traditional theory, we find that in the pseudogap regime the pairing instability arises from an increase in the strength of the spin–fluctuation pairing interaction as the temperature decreases rather than the Cooper log instability.

  16. Pairing in a dry Fermi sea

    DOE PAGES

    Maier, Thomas A.; Staar, Peter; Mishra, V.; ...

    2016-06-17

    In the traditional Bardeen–Cooper–Schrieffer theory of superconductivity, the amplitude for the propagation of a pair of electrons with momentum k and -k has a log singularity as the temperature decreases. This so-called Cooper instability arises from the presence of an electron Fermi sea. It means that an attractive interaction, no matter how weak, will eventually lead to a pairing instability. However, in the pseudogap regime of the cuprate superconductors, where parts of the Fermi surface are destroyed, this log singularity is suppressed, raising the question of how pairing occurs in the absence of a Fermi sea. In this paper, we report Hubbard model numerical results and the analysis of angular-resolved photoemission experiments on a cuprate superconductor. Finally, in contrast to the traditional theory, we find that in the pseudogap regime the pairing instability arises from an increase in the strength of the spin–fluctuation pairing interaction as the temperature decreases rather than the Cooper log instability.

  17. Apparent Transition in the Human Height Distribution Caused by Age-Dependent Variation during Puberty Period

    NASA Astrophysics Data System (ADS)

    Iwata, Takaki; Yamazaki, Yoshihiro; Kuninaka, Hiroto

    2013-08-01

    In this study, we examine the validity of the transition of the human height distribution from the log-normal distribution to the normal distribution during puberty, as suggested in an earlier study [Kuninaka et al.: J. Phys. Soc. Jpn. 78 (2009) 125001]. Our data analysis reveals that, in late puberty, the variation in height decreases as children grow. Thus, the classification of a height dataset by age at this stage leads us to analyze a mixture of distributions with larger means and smaller variations. This mixture distribution has a negative skewness and is consequently closer to the normal distribution than to the log-normal distribution. The opposite case occurs in early puberty and the mixture distribution is positively skewed, which resembles the log-normal distribution rather than the normal distribution. Thus, this scenario mimics the transition during puberty. Additionally, our scenario is realized through a numerical simulation based on a statistical model. The present study does not support the transition suggested by the earlier study.
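    The skewness mechanism described above is easy to reproduce numerically: pooling age groups whose larger means come with smaller spreads yields negative skewness, and the opposite pairing yields positive skewness. A minimal sketch with hypothetical heights in cm (not the study's dataset):

    ```python
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(8)
    # Late puberty: pooled age groups with larger means and smaller spreads.
    late = np.concatenate([rng.normal(m, s, 10_000)
                           for m, s in [(150, 8), (160, 6), (168, 4)]])
    # Early puberty: larger means paired with larger spreads.
    early = np.concatenate([rng.normal(m, s, 10_000)
                            for m, s in [(135, 4), (145, 6), (155, 8)]])
    print(skew(late), skew(early))  # negative vs. positive skewness
    ```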

  18. Contributing factors to VEP grating acuity deficit and inter-ocular acuity difference in children with cerebral visual impairment.

    PubMed

    Cavascan, Nívea Nunes; Salomão, Solange Rios; Sacai, Paula Yuri; Pereira, Josenilson Martins; Rocha, Daniel Martins; Berezovsky, Adriana

    2014-04-01

    To investigate contributing factors to visual evoked potential (VEP) grating acuity deficit (GAD) and inter-ocular acuity difference (IAD) measured by sweep VEPs in children with cerebral visual impairment (CVI). VEP GAD was calculated for the better-acuity eye by subtracting acuity thresholds from the mean normal VEP grating acuity according to norms from our own laboratory. Deficits were categorized as mild (0.17 ≤ deficit < 0.40 log units), moderate (0.40 ≤ deficit < 0.70 log units) or severe (deficit ≥ 0.70 log units). The maximum acceptable IAD was 0.10 log units. A group of 115 children (66 males, 57%) with ages ranging from 1.2 to 166.5 months (median = 17.7) was examined. VEP GAD ranged from 0.17 to 1.28 log units (mean = 0.68 ± 0.27; median = 0.71); it was mild in 23 (20%) children, moderate in 32 (28%) and severe in 60 (52%). Severe deficit was significantly associated with older age and anti-seizure drug therapy. IAD ranged from 0 to 0.49 log units (mean = 0.06 ± 0.08; median = 0.04) and was acceptable in 96 (83%) children. Children with strabismus and nystagmus had significantly larger IADs compared to children with orthoposition. In a large cohort of children with CVI, variable severity of VEP GAD was found, with more than half of the children showing severe deficits. Older children and those under anti-seizure therapy were at higher risk for larger deficits. Strabismus and nystagmus were associated with larger IADs. These results should be taken into account in the clinical management of children with this leading cause of bilateral visual impairment.

  19. Using Remote Sensed Imagery to Determine the Impacts from Salvage Logging after the 2015 Tower Fire, Washington (USA)

    NASA Astrophysics Data System (ADS)

    Broers, Anna; Robichaud, Peter; Lewis, Sarah

    2017-04-01

    Wildfires are part of the natural process in most forested landscapes, and during subsequent precipitation, runoff and consequently erosion of the soil increase. Several factors contribute to the increased runoff: loss of runoff storage in the forest floor, the water-repellent soil layer, and reduced interception by the canopy. Due to climate change, the number of wildfires and their severity are likely to increase, which will lead to increased erosion; this has been investigated by others. Often, land management protocol is to remove the standing dead trees before they decay. In recent years salvage logging has received more attention in research, yet results on its effects on erosion have been mixed. The goal of the current research is to determine the change in surface conditions due to salvage logging operations by comparing the pre-fire, post-fire, and post-salvage surface conditions. To determine this change, high resolution WorldView remote sensing imagery was used after the 9000-ha 2015 Tower Fire, located on the border of Idaho and Washington (USA). Ground validation measurements were taken using the forest soil disturbance protocol, as well as GPS coordinates and measurements of highly disturbed areas such as skid trails, skyline drag lines and other machinery impacts. Some correlations were found between disturbance classes, bare soil, exposed wheel tracks (rutting) and soil compaction. High resolution WorldView remote sensing images detected changes in the pre- and post-fire environmental conditions and the change due to salvage logging operations. Classifying disturbances using remote sensing imagery is complicated by natural revegetation processes and by the timing of salvage logging operations. Initial results suggest that high resolution imagery can be used to determine onsite impacts of salvage logging operations.

  20. ACGME case logs: Surgery resident experience in operative trauma for two decades

    PubMed Central

    Drake, Frederick Thurston; Van Eaton, Erik G.; Huntington, Ciara R.; Jurkovich, Gregory J.; Aarabi, Shahram; Gow, Kenneth W.

    2014-01-01

    BACKGROUND Surgery resident education is based on experiential training, which is influenced by changes in clinical management strategies, technical and technologic advances, and administrative regulations. Trauma care has been exposed to each of these factors, prompting concerns about resident experience in operative trauma. The current study analyzed the reported volume of operative trauma for the last two decades; to our knowledge, this is the first evaluation of nationwide trends during such an extended time line. METHODS The Accreditation Council for Graduate Medical Education (ACGME) database of operative logs was queried from academic year (AY) 1989–1990 to 2009–2010 to identify shifts in trauma operative experience. Annual case log data for each cohort of graduating surgery residents were combined into approximately 5-year blocks, designated Period I (AY1989–1990 to AY1993–1994), Period II (AY1994–1995 to AY1998–1999), Period III (AY1999–2000 to AY2002–2003), and Period IV (AY2003–2004 to AY2009–2010). The latter two periods were delineated by the year in which duty hour restrictions were implemented. RESULTS Overall general surgery caseload increased from Period I to Period II (p < 0.001), remained stable from Period II to Period III, and decreased from Period III to Period IV (p < 0.001). However, for ACGME-designated trauma cases, there were significant declines from Period I to Period II (75.5 vs. 54.5 cases, p < 0.001) and Period II to Period III (54.5 vs. 39.3 cases, p < 0.001) but no difference between Period III and Period IV (39.3 vs. 39.4 cases). Graduating residents in Period I performed, on average, 31 intra-abdominal trauma operations, including approximately five spleen and four liver operations. Residents in Period IV performed 17 intra-abdominal trauma operations, including three spleen and approximately two liver operations. CONCLUSION Recent general surgery trainees perform fewer trauma operations than previous trainees. The majority of this decline occurred before implementation of work-hour restrictions. Although these changes reflect concurrent changes in management of trauma, surgical educators must meet the challenge of training residents in procedures less frequently performed. LEVEL OF EVIDENCE Epidemiologic study, level III; therapeutic study, level IV. PMID:23188243

  1. Relationships between Watershed Alterations and Sediment Accretion Rates in Willapa Bay Washington and Yaquina Bay, Oregon

    EPA Science Inventory

    The Pacific Northwest (PNW) is one of the leading regions of timber production in the United States. It also undergoes aperiodic episodes of catastrophic forest fires, and systematic slash burns following logging activities. Such conditions raise concerns regarding increased re...

  2. REGIONAL ASSESSMENT OF LAND USE IMPACTS ON STREAM CHANNEL HABITAT IN THE MIDDLE COLUMBIA RIVER BASIN

    EPA Science Inventory

    Many human land uses and land cover modifications (e.g., logging, grazing, roads) tend to increase erosion, leading to an increase in fine sediment supplied to streams and potentially degrading aquatic habitat for benthic organisms. This study evaluated potential human impacts o...

  3. The Plant Ionome Revisited by the Nutrient Balance Concept

    PubMed Central

    Parent, Serge-Étienne; Parent, Léon Etienne; Egozcue, Juan José; Rozane, Danilo-Eduardo; Hernandes, Amanda; Lapointe, Line; Hébert-Gentile, Valérie; Naess, Kristine; Marchand, Sébastien; Lafond, Jean; Mattos, Dirceu; Barlow, Philip; Natale, William

    2013-01-01

    Tissue analysis is commonly used in ecology and agronomy to portray plant nutrient signatures. Nutrient concentration data, or ionomes, belong to the compositional data class, i.e., multivariate data that are proportions of some whole, hence carrying important numerical properties. Statistics computed across raw or ordinary log-transformed nutrient data are intrinsically biased, hence possibly leading to wrong inferences. Our objective was to present a sound and robust approach based on a novel nutrient balance concept to classify plant ionomes. We analyzed leaf N, P, K, Ca, and Mg of two wild and six domesticated fruit species from Canada, Brazil, and New Zealand sampled during reproductive stages. Nutrient concentrations were (1) analyzed without transformation, (2) ordinary log-transformed as commonly but incorrectly applied in practice, (3) additive log-ratio (alr) transformed as surrogate to stoichiometric rules, and (4) converted to isometric log-ratios (ilr) arranged as sound nutrient balance variables. Raw concentration and ordinary log transformation both led to biased multivariate analysis due to redundancy between interacting nutrients. The alr- and ilr-transformed data provided unbiased discriminant analyses of plant ionomes, where wild and domesticated species formed distinct groups and the ionomes of species and cultivars were differentiated without numerical bias. The ilr nutrient balance concept is preferable to alr, because the ilr technique projects the most important interactions between nutrients into a convenient Euclidean space. This novel numerical approach allows rectifying historical biases and supervising phenotypic plasticity in plant nutrition studies. PMID:23526060
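
    For readers unfamiliar with the transforms being compared, a small numpy sketch of alr, clr, and ilr is given below. The Helmert-type orthonormal basis used for ilr is one standard choice, not necessarily the nutrient balances used in the paper, and the ionome values are hypothetical:

```python
import numpy as np

def alr(x):
    """Additive log-ratio: log of each part over the last part."""
    x = np.asarray(x, dtype=float)
    return np.log(x[..., :-1] / x[..., -1:])

def clr(x):
    """Centered log-ratio: log parts minus their mean log (geometric centring)."""
    lx = np.log(np.asarray(x, dtype=float))
    return lx - lx.mean(axis=-1, keepdims=True)

def ilr(x):
    """Isometric log-ratio via an orthonormal (Helmert-type) basis."""
    x = np.asarray(x, dtype=float)
    D = x.shape[-1]
    # Rows of H are orthonormal and orthogonal to (1, ..., 1),
    # so they span the clr hyperplane.
    H = np.zeros((D - 1, D))
    for i in range(1, D):
        H[i - 1, :i] = 1.0 / np.sqrt(i * (i + 1))
        H[i - 1, i] = -i / np.sqrt(i * (i + 1))
    return clr(x) @ H.T

# Leaf N, P, K, Ca, Mg as proportions of their total (hypothetical values).
ionome = np.array([0.55, 0.06, 0.25, 0.10, 0.04])
print(ilr(ionome))   # D-1 = 4 orthonormal balance coordinates
```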

  4. Interactive effects of historical logging and fire exclusion on ponderosa pine forest structure in the northern Rockies.

    PubMed

    Naficy, Cameron; Sala, Anna; Keeling, Eric G; Graham, Jon; DeLuca, Thomas H

    2010-10-01

    Increased forest density resulting from decades of fire exclusion is often perceived as the leading cause of historically aberrant, severe, contemporary wildfires and insect outbreaks documented in some fire-prone forests of the western United States. Based on this notion, current U.S. forest policy directs managers to reduce stand density and restore historical conditions in fire-excluded forests to help minimize high-severity disturbances. Historical logging, however, has also caused widespread change in forest vegetation conditions, but its long-term effects on vegetation structure and composition have never been adequately quantified. We document that fire-excluded ponderosa pine forests of the northern Rocky Mountains logged prior to 1960 have much higher average stand density, greater homogeneity of stand structure, more standing dead trees and increased abundance of fire-intolerant trees than paired fire-excluded, unlogged counterparts. Notably, the magnitude of the interactive effect of fire exclusion and historical logging substantially exceeds the effects of fire exclusion alone. These differences suggest that historically logged sites are more prone to severe wildfires and insect outbreaks than unlogged, fire-excluded forests and should be considered a high priority for fuels reduction treatments. Furthermore, we propose that ponderosa pine forests with these distinct management histories likely require distinct restoration approaches. We also highlight potential long-term risks of mechanical stand manipulation in unlogged forests and emphasize the need for a long-term view of fuels management.

  5. Chemical analysis of water samples and geophysical logs from cored test holes drilled in the central Oklahoma Aquifer, Oklahoma

    USGS Publications Warehouse

    Schlottmann, Jamie L.; Funkhouser, Ron A.

    1991-01-01

    Chemical analyses of water from eight test holes and geophysical logs for nine test holes drilled in the Central Oklahoma aquifer are presented. The test holes were drilled to investigate local occurrences of potentially toxic, naturally occurring trace substances in ground water. These trace substances include arsenic, chromium, selenium, residual alpha-particle activities, and uranium. Eight of the nine test holes were drilled near wells known to contain large concentrations of one or more of the naturally occurring trace substances. One test hole was drilled in an area known to have only small concentrations of any of the naturally occurring trace substances. Water samples were collected from one to eight individual sandstone layers within each test hole. A total of 28 water samples, including four duplicate samples, were collected. The temperature, pH, specific conductance, alkalinity, and dissolved-oxygen concentrations were measured at the sample site. Laboratory determinations included major ions, nutrients, dissolved organic carbon, and trace elements (aluminum, arsenic, barium, beryllium, boron, cadmium, chromium, hexavalent chromium, cobalt, copper, iron, lead, lithium, manganese, mercury, molybdenum, nickel, selenium, silver, strontium, vanadium and zinc). Radionuclide activities and stable isotope (δ) values also were determined, including: gross-alpha-particle activity, gross-beta-particle activity, radium-226, radium-228, radon-222, uranium-234, uranium-235, uranium-238, total uranium, carbon-13/carbon-12, deuterium/hydrogen-1, oxygen-18/oxygen-16, and sulfur-34/sulfur-32. Additional analyses of arsenic and selenium species are presented for selected samples, as well as analyses of density and iodine for two samples, tritium for three samples, and carbon-14 for one sample. Geophysical logs for most test holes include caliper, neutron, gamma-gamma, natural-gamma, spontaneous-potential, long- and short-normal resistivity, and single-point resistance logs. Logs for test-hole NOTS 7 do not include long- and short-normal resistivity, spontaneous-potential, or single-point resistance. Logs for test-hole NOTS 7A include only caliper and natural-gamma logs.

  6. Vortex boundary-layer interactions

    NASA Technical Reports Server (NTRS)

    Bradshaw, P.

    1986-01-01

    Parametric studies to identify a vortex generator were completed. Data acquisition in the first chosen configuration, in which a longitudinal vortex pair generated by an isolated delta wing starts to merge with a turbulent boundary layer on a flat plate fairly close to the leading edge, is nearly complete. Work on a delta-wing/flat-plate combination, consisting of flow visualization and hot-wire measurements taken with a computer-controlled traverse gear and data-logging system, was completed. Data taking and analysis have continued, and sample results for another cross-stream plane are presented. Available data include all mean velocity components, second-order mean products of turbulent fluctuations, and third-order mean products. Implementation of a faster data-logging system was accomplished.

  7. A photoionization instability in the early intergalactic medium

    NASA Technical Reports Server (NTRS)

    Hogan, Craig J.

    1992-01-01

    It is argued that any fairly uniform source of ionizing photons can be the cause of an instability in the pregalactic medium on scales larger than a photon path length. Underdense regions receive more ionizing energy per atom and reach higher temperature and entropy, driving the density down still further. Fluctuations created by this instability can lead to the formation of structures resembling protogalaxies and intergalactic clouds, obviating the need for gas clouds or density perturbations of earlier cosmological provenance, as is usually assumed in theories of galaxy and structure formation. Characteristic masses for clouds produced by the instability, with log mass in solar units plotted against log radius in kpc, are illustrated.

  8. A Tidal Disruption Event in a Nearby Galaxy Hosting an Intermediate Mass Black Hole

    NASA Technical Reports Server (NTRS)

    Donato, D; Cenko, S. B.; Covino, S.; Troja, E.; Pursimo, T.; Cheung, C. C.; Fox, O.; Kutyrev, A.; Campana, S.; Fugazza, D.; et al.

    2014-01-01

    We report the serendipitous discovery of a bright point source flare in the Abell cluster A1795 with archival EUVE and Chandra observations. Assuming the EUVE emission is associated with the Chandra source, the X-ray 0.5-7 kiloelectronvolt flux declined by a factor of approximately 2300 over a time span of 6 years, following a power-law decay with index approximately equal to 2.44 plus or minus 0.40. The Chandra data alone vary by a factor of approximately 20. The spectrum is well fit by a blackbody with a constant temperature of kT approximately equal to 0.09 kiloelectronvolts (approximately equal to 10 (sup 6) Kelvin). The flare is spatially coincident with the nuclear region of a faint, inactive galaxy with a photometric redshift consistent at the 1 sigma level with the cluster (redshift = 0.062476). We argue that these properties are indicative of a tidal disruption of a star by a black hole (BH) with log(M (sub BH) / M (sub solar)) approximately equal to 5.5 plus or minus 0.5. If so, such a discovery indicates that tidal disruption flares may be used to probe BHs in the intermediate mass range, which are very difficult to study by other means.

  9. Polynomial probability distribution estimation using the method of moments

    PubMed Central

    Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram–Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation. PMID:28394949
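
    A minimal sketch of the general method-of-moments idea on [0, 1] (not necessarily the paper's exact algorithm): requiring a polynomial density to match the first N raw moments gives a linear system whose matrix is the Hilbert matrix, so the degree should be kept small for numerical stability:

```python
import numpy as np

def polynomial_pdf_coeffs(moments):
    """Coefficients a_j of p(x) = sum_j a_j x^j on [0, 1] whose first
    len(moments) raw moments match `moments` (moments[0] = 1 normalises).
    Since int_0^1 x^(k+j) dx = 1/(k+j+1), the system matrix is the
    Hilbert matrix, which is ill-conditioned for large degree."""
    m = np.asarray(moments, dtype=float)
    n = len(m)
    A = np.array([[1.0 / (k + j + 1) for j in range(n)] for k in range(n)])
    return np.linalg.solve(A, m)

# Example: match the first four raw moments of Uniform(0, 1), m_k = 1/(k+1);
# the recovered density should be the constant p(x) = 1.
coeffs = polynomial_pdf_coeffs([1, 1/2, 1/3, 1/4])
print(np.round(coeffs, 10))   # [1. 0. 0. 0.]
```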

  11. Combinatorial approximation algorithms for MAXCUT using random walks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seshadhri, Comandur; Kale, Satyen

    We give the first combinatorial approximation algorithm for MaxCut that beats the trivial 0.5 factor by a constant. The main partitioning procedure is very intuitive, natural, and easily described. It essentially performs a number of random walks and aggregates the information to provide the partition. We can control the running time to get an approximation-factor versus running-time tradeoff. We show that for any constant b > 1.5, there is an Õ(n^b) algorithm that outputs a (0.5 + δ)-approximation for MaxCut, where δ = δ(b) is some positive constant. One of the components of our algorithm is a weak local graph partitioning procedure that may be of independent interest. Given a starting vertex i and a conductance parameter φ, unless a random walk of length ℓ = O(log n) starting from i mixes rapidly (in terms of φ and ℓ), we can find a cut of conductance at most φ close to the vertex. The work done per vertex found in the cut is sublinear in n.
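
    The paper's random-walk partitioning procedure is not reproduced here. For orientation only, the sketch below shows the trivial randomized baseline it improves on (each vertex picks a side uniformly at random, cutting each edge with probability 1/2, hence a 0.5-approximation in expectation), followed by a greedy local-flip pass; the latter is a classic heuristic unrelated to the paper's analysis:

```python
import random

def cut_value(edges, side):
    """Number of edges crossing the partition."""
    return sum(side[a] != side[b] for a, b in edges)

def max_cut_baseline(n, edges, seed=0):
    """Random assignment (expected 0.5-approximation), then greedy flips."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n)]
    neighbours = [[] for _ in range(n)]
    for a, b in edges:
        neighbours[a].append(b)
        neighbours[b].append(a)
    improved = True
    while improved:                             # terminates: each flip raises the cut
        improved = False
        for v in range(n):
            same = sum(side[v] == side[u] for u in neighbours[v])
            if 2 * same > len(neighbours[v]):   # flipping v gains (same - diff) > 0
                side[v] ^= 1
                improved = True
    return side

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]    # toy graph: C4 plus a chord
side = max_cut_baseline(4, edges)
print(cut_value(edges, side), side)                 # e.g. 4 [1, 0, 1, 0]
```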

  12. Speech Data Analysis for Semantic Indexing of Video of Simulated Medical Crises

    DTIC Science & Technology

    2015-05-01

    scheduled approximately twice per week and are recorded as video data. During each session, the physician/instructor must manually review and annotate...spectrum, y, using regression line: y = ln(1 + Jx), (2.3) where x is the auditory power spectral amplitude, J is a signal-dependent positive constant...The amplitude-warping transform is linear-like for J ≪ 1 and logarithmic-like for J ≫ 1. 3. RASTA filtering: reintegrate the log critical-band
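
    A quick numeric check of the quoted amplitude-warping transform y = ln(1 + Jx), assuming the garbled comparisons in the snippet read J ≪ 1 (linear-like) and J ≫ 1 (logarithmic-like):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 5)

for J in (0.01, 100.0):
    y = np.log1p(J * x)            # y = ln(1 + J x)
    # For J << 1, ln(1 + Jx) ~ Jx, so y/(Jx) stays near 1 (linear-like);
    # for J >> 1 the ratio falls off (compressive, log-like).
    ratio = y[1:] / (J * x[1:])
    print(f"J={J:>6}: y/(Jx) = {np.round(ratio, 3)}")
```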

  13. Carbon Pool Dynamics in the Lower Fraser Basin from 1827 to 1990

    PubMed

    Boyle; Lavkulich

    1997-05-01

    To understand the total impact of humans on the carbon cycle, the modeling and quantifying of the transfer of carbon from terrestrial pools to the atmosphere is becoming more critical. Using previously published data, this research sought to assess the change in carbon pools caused by humans in the Lower Fraser Basin (LFB) in British Columbia, Canada, since 1827 and define the long-term, regional contribution of carbon to the atmosphere. The results indicate that there has been a transfer of 270 Mt of carbon from biomass pools in the LFB to other pools, primarily the atmosphere. The major losses of biomass carbon have been from logged forests (42%), wetlands (14%), and soils (43%). Approximately 48% of the forest biomass, almost 20% of the carbon of the LFB, lies within old-growth forest, which covers only 19% of the study area. Landfills are now becoming a major sink of carbon, containing 5% of the biomass carbon in the LFB, while biomass carbon in buildings, urban vegetation, mammals, and agriculture is negligible. Approximately 26% of logged forest biomass would still be in a terrestrial biomass pool, leaving 238 Mt of carbon that has been released to the atmosphere. On an area basis, this is 29 times the average global emissions of carbon, providing an indication of the past contributions of developed countries such as Canada to global warming and possible contributions from further clearing of rainforest in both tropical and temperate regions. KEY WORDS: Carbon pools; Global warming; Carbon release to atmosphere; Greenhouse effect

  14. Detecting Microbially Induced Calcite Precipitation in a Model Well-Bore Using Downhole Low-Field NMR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirkland, Catherine M.; Zanetti, Sam; Grunewald, Elliot

    Microbially induced calcite precipitation (MICP) has been widely researched recently due to its relevance for subsurface engineering applications including sealing leakage pathways and permeability modification. These applications of MICP are inherently difficult to monitor nondestructively in time and space. Nuclear magnetic resonance (NMR) can characterize the pore size distributions, porosity, and permeability of subsurface formations. This investigation used a low-field NMR well-logging probe to monitor MICP in a sand-filled bioreactor, measuring NMR signal amplitude and T2 relaxation over an 8 day experimental period. Following inoculation with the ureolytic bacteria, Sporosarcina pasteurii, and pulsed injections of urea and calcium substrate, the NMR-measured water content in the reactor decreased to 76% of its initial value. T2 relaxation distributions bifurcated from a single mode centered about approximately 650 ms into a fast decaying population (T2 less than 10 ms) and a larger population with T2 greater than 1000 ms. The combination of changes in pore volume and surface mineralogy accounts for the changes in the T2 distributions. Destructive sampling confirmed final porosity was approximately 88% of the original value. These results indicate the low-field NMR well-logging probe is sensitive to the physical and chemical changes caused by MICP in a laboratory bioreactor.
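
    As a hedged illustration of the bimodal T2 behavior described (not the probe's actual processing, which recovers a full T2 distribution rather than two discrete components), a two-population exponential decay can be fitted to a synthetic echo train; the amplitudes, time grid, and noise level below are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a_fast, t2_fast, a_slow, t2_slow):
    """Two-population transverse relaxation decay (t and T2 in ms)."""
    return a_fast * np.exp(-t / t2_fast) + a_slow * np.exp(-t / t2_slow)

# Synthetic decay: a fast pore population near 5 ms and a bulk-like
# population near 1200 ms, mirroring the bifurcation described above.
t = np.logspace(-1, 3.5, 200)                     # 0.1 ms .. ~3160 ms
rng = np.random.default_rng(1)
signal = biexp(t, 0.25, 5.0, 0.75, 1200.0) + rng.normal(0.0, 0.005, t.size)

p0 = (0.5, 10.0, 0.5, 800.0)                      # rough initial guess
popt, _ = curve_fit(biexp, t, signal, p0=p0, bounds=(0.0, np.inf))
print(np.round(popt, 2))                          # ~[0.25 5. 0.75 1200.]
```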

  16. A field study of virus removal in septic tank drainfields.

    PubMed

    Nicosia, L A; Rose, J B; Stark, L; Stewart, M T

    2001-01-01

    Two field studies were conducted at a research station in Tampa, Florida to assess the removal of bacteriophage PRD1 from wastewater in septic tank drainfields. Infiltration cells were seeded with PRD1 and bromide and the effects of effluent hydraulic loading rate and rainfall on virus removal were monitored. Septic tank effluent samples were collected after passage through 0.6 m of unsaturated fine sand and PRD1 was detected over an average of 67 d. Bacteriophage PRD1 breakthrough was detected at approximately the same time as bromide in all three cells except for the low-load cell (Study 1), where bromide was never detected. Log10 removals of PRD1 were 1.43 and 1.91 for the high-load cells (hydraulic loading rate = 0.063 m/d) and 2.21 for the low-load cell (hydraulic loading rate = 0.032 m/d). Virus attenuation is attributed to dispersion, dilution, and inactivation. Significant increases in PRD1 elution with rainfall were observed in the first 10 d of the study. Approximately 125 mm of rainfall caused a 1.2 log10 increase of PRD1 detected at the 0.6-m depth. Current Florida onsite wastewater disposal standards, which specify a 0.6-m distance from the drainfield to the water table, may not provide sufficient removal of viruses, particularly during the wet season.
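
    The log10-removal bookkeeping used above is simple arithmetic; a minimal helper with hypothetical concentrations (the function name is illustrative):

```python
import math

def log10_removal(c_influent, c_effluent):
    """Log removal value: log10 of influent over effluent concentration."""
    return math.log10(c_influent / c_effluent)

# E.g., a 2.21-log removal corresponds to a ~162-fold concentration drop.
print(round(log10_removal(1.0e6, 1.0e6 / 10**2.21), 2))   # 2.21
print(round(10**2.21))                                    # 162
```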

  17. Formulation studies of InhA inhibitors and combination therapy to improve efficacy against Mycobacterium tuberculosis.

    PubMed

    Knudson, Susan E; Cummings, Jason E; Bommineni, Gopal R; Pan, Pan; Tonge, Peter J; Slayden, Richard A

    2016-12-01

    Previously, structure-based drug design was used to develop substituted diphenyl ethers with potency against the Mycobacterium tuberculosis (Mtb) enoyl-ACP reductase (InhA); however, the highly lipophilic centroid compound, SB-PT004, lacked sufficient efficacy in the acute murine Mtb infection model. A next-generation series of compounds was designed with improved specificity, potency against InhA, and reduced cytotoxicity in vitro, but these compounds also had limited solubility. Accordingly, solubility and pharmacokinetics studies were performed to develop formulations for this class and other experimental drug candidates with high logP values often encountered in drug discovery. Lead diphenyl ethers were formulated in co-solvent and Self-Dispersing Lipid Formulations (SDLFs) and evaluated in a rapid murine Mtb infection model that assesses dissemination to and bacterial burden in the spleen. In vitro synergy studies were performed with the lead diphenyl ether compounds, SB-PT070 and SB-PT091, and rifampin (RIF), which demonstrated an additive effect and guided the in vivo studies. Combinatorial therapy in vivo studies with these compounds delivered in our Self-Micro Emulsifying Drug Delivery System (SMEDDS) resulted in an additional 1.4 log10 CFU reduction in the spleen of animals co-treated with SB-PT091 and RIF, and an additional 1.7 log10 reduction in the spleen of animals treated with both SB-PT070 and RIF.

  18. Parallel O(log n) algorithms for open- and closed-chain rigid multibody systems based on a new mass matrix factorization technique

    NASA Technical Reports Server (NTRS)

    Fijany, Amir

    1993-01-01

    In this paper, parallel O(log n) algorithms for computation of rigid multibody dynamics are developed. These parallel algorithms are derived by parallelization of new O(n) algorithms for the problem. The underlying feature of these O(n) algorithms is a drastically different strategy for decomposition of interbody force which leads to a new factorization of the mass matrix (M). Specifically, it is shown that a factorization of the inverse of the mass matrix in the form of the Schur complement is derived as M^-1 = C - B* A^-1 B, wherein matrices C, A, and B are block tridiagonal matrices. The new O(n) algorithm is then derived as a recursive implementation of this factorization of M^-1. For closed-chain systems, similar factorizations and O(n) algorithms for computation of the operational space mass matrix Lambda and its inverse Lambda^-1 are also derived. It is shown that these O(n) algorithms are strictly parallel, that is, they are less efficient than other algorithms for serial computation of the problem. But, to our knowledge, they are the only known algorithms that can be parallelized and that lead to both time- and processor-optimal parallel algorithms for the problem, i.e., parallel O(log n) algorithms with O(n) processors. The developed parallel algorithms, in addition to their theoretical significance, are also practical from an implementation point of view due to their simple architectural requirements.
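
    The factorization quoted above is an instance of the standard Schur-complement identity for block matrices; a textbook statement is sketched below for context only (in the paper, the specific A, B, and C are block tridiagonal, which is what enables the O(n) recursion):

```latex
% Standard Schur-complement identity (textbook linear algebra, for context):
% for an invertible block matrix with invertible leading block A,
\begin{pmatrix} A & B \\ B^{*} & D \end{pmatrix}^{-1}
=
\begin{pmatrix}
A^{-1} + A^{-1} B S^{-1} B^{*} A^{-1} & -A^{-1} B S^{-1} \\
-S^{-1} B^{*} A^{-1} & S^{-1}
\end{pmatrix},
\qquad S = D - B^{*} A^{-1} B .
% An expression of the form M^{-1} = C - B^{*} A^{-1} B thus exhibits
% M^{-1} as the Schur complement of A in an augmented block system.
```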

  19. Sustained antibacterial effect of a hand rub gel incorporating chlorhexidine-loaded nanocapsules (Nanochlorex).

    PubMed

    Nhung, Dang Thi Tuyet; Freydiere, Anne-Marie; Constant, Hélène; Falson, Françoise; Pirot, Fabrice

    2007-04-04

    In the present study, an original chlorhexidine-loaded nanocapsule-based gel (Nanochlorex) was tested as a hand rub gel against the resident skin flora in comparison with 2-propanol 60% (v/v) and a 62% (v/v) ethanol-based gel (Purell). After a 30-s hand rub, the immediate bactericidal effect of Nanochlorex was comparable to 2-propanol 60% (v/v) (reduction factor, RF: 0.30+/-0.35 versus 0.38+/-0.55, P>0.05) against aerobic bacteria, whereas the post-values of surviving anaerobes were significantly lower for Nanochlorex (P<0.001) and not significantly changed for 2-propanol 60% (v/v) (P>0.05). The sustained antibacterial effect of Nanochlorex was confirmed against the resident and transient hand flora in two sets of experiments. In the first, results obtained with the glove-juice technique showed that the bactericidal effect induced by the Nanochlorex hand rub persisted throughout a 3-h period, while Purell failed to significantly reduce the post-values of surviving bacteria. In the second, repeated artificial contaminations with Staphylococcus epidermidis were carried out on ex vivo human skin pre-treated with either Nanochlorex or Purell for 5 min, then maintained in a cell diffusion apparatus for 4 h. The log(10) reduction of surviving bacteria was significantly higher with Nanochlorex than with Purell after three successive contaminations (from approximately 5.5 to 1.5 log(10) reduction for Nanochlorex between the first and the third contamination; approximately 1 log(10) reduction for Purell throughout the experiment), confirming the sustained antibacterial effect of the chlorhexidine-loaded nanocapsule-based gel. The immediate and sustained antibacterial effect of Nanochlorex was attributed to the chlorhexidine carrier system, which improved drug targeting to bacteria, and to the osmotic gel, which reduced further bacterial growth on the skin. Nanochlorex might constitute a promising approach for hygienic hand disinfection in care practices involving multiple procedures.

  20. Association between placentome size, measured using transrectal ultrasonography, and gestational age in cattle.

    PubMed

    Adeyinka, F D; Laven, R A; Lawrence, K E; van Den Bosch, M; Blankenvoorde, G; Parkinson, T J

    2014-03-01

    The aim of this study was to determine whether fetal age could be accurately estimated from placentome size. Fifty-eight cows with confirmed conception dates in two herds were used for the study. The length of the long axis and cross-sectional area of placentomes close to the cervix were measured once every 10 days between approximately 60-130 days of gestation and once every 15 days between 130-160 days of gestation. Four to six placentomes were measured using transrectal ultrasonography in each uterine horn. A linear mixed model was used to establish the factors that were significantly associated with log mean placentome length and to create an equation to predict gestational age from mean placentome length. Limits of agreement analysis was then used to evaluate whether the predictions were sufficiently accurate for mean placentome length to be used, in practice, as a method of determining gestational age. Only age of gestation (p<0.001) and uterine horn (p=0.048) were found to have a significant effect on log mean placentome length. Of the three models used to predict gestational age, the one that used log mean placentome length of all placentomes, adjusting for the effect of horn, had the smallest 95% limits of agreement: ±33 days. That is, predicted gestational age had a 95% chance of being between 33 days greater and 33.7 days less than actual age. This is approximately twice the error reported in studies using measurement of fetal size. Measurement of placentomes near to the cervix using transrectal ultrasonography was easily achieved. There was a significant association between placentome size and gestational age, but between-cow variation in placentome size and growth resulted in poor agreement between placentome size and gestational age. Although placentomes can be easily visualised during diagnosis of pregnancy using transrectal ultrasonography, mean placentome size should not be used to estimate gestational age.
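
    The limits-of-agreement analysis referred to above is the standard Bland-Altman computation (mean difference ± 1.96 SD of the differences); a minimal sketch with synthetic numbers, not the study's data:

```python
import numpy as np

def limits_of_agreement(predicted, actual):
    """Bland-Altman 95% limits: mean difference +/- 1.96 * SD of differences."""
    d = np.asarray(predicted, float) - np.asarray(actual, float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias - half_width, bias + half_width

# Synthetic example: 58 cows, predictions scattered ~ +/-17 d around truth.
rng = np.random.default_rng(2)
actual = rng.uniform(60, 160, 58)              # days of gestation
predicted = actual + rng.normal(0, 17, 58)     # hypothetical prediction error
print(np.round(limits_of_agreement(predicted, actual), 1))
```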
