Science.gov

Sample records for non-randomly distributed locations-exemplified

  1. Regulatory Considerations Of Waste Emplacement Within The WIPP Repository: Random Versus Non-Random Distribution

    SciTech Connect

    Casey, S. C.; Patterson, R. L.; Gross, M.; Lickliter, K.; Stein, J. S.

    2003-02-25

    The U.S. Department of Energy (DOE) is responsible for disposing of transuranic waste in the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. As part of that responsibility, DOE must comply with the U.S. Environmental Protection Agency's (EPA) radiation protection standards in Title 40 Code of Federal Regulations (CFR), Parts 191 and 194. This paper addresses compliance with the criteria of 40 CFR Sections 194.24(d) and 194.24(f), which require DOE either to provide a waste loading scheme for the WIPP repository or to assume random emplacement in the mandated performance and compliance assessments. The DOE established a position on waste loading schemes during the process of obtaining the EPA's initial Certification in 1998. The justification for utilizing a random waste emplacement distribution within the WIPP repository was provided to the EPA. During the EPA rulemaking process for the initial certification, the EPA questioned DOE on whether waste would be loaded randomly as modeled in the long-term performance assessment (PA) and on the impact, if any, of non-random loading. In response, DOE conducted an impact assessment for non-random waste loading. The results of this assessment supported the contention that it does not matter whether random or non-random waste loading is assumed for the PA. The EPA determined that a waste loading plan was unnecessary because DOE had assumed random waste loading and evaluated the potential consequences of non-random loading for a very high activity waste stream. In other words, the EPA determined that DOE was not required to provide a waste loading scheme because compliance is not affected by the actual distribution of waste containers in the WIPP.

  2. Non-random distribution of DNA double-strand breaks induced by particle irradiation

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Cooper, P. K.; Rydberg, B.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    Induction of DNA double-strand breaks (dsbs) in mammalian cells is dependent on the spatial distribution of energy deposition from the ionizing radiation. For high-LET particle radiations the primary ionization sites occur in a correlated manner along the track of the particles, while for X-rays these sites are much more randomly distributed throughout the volume of the cell. It can therefore be expected that the distribution of dsbs linearly along the DNA molecule also varies with the type of radiation and the ionization density. Using pulsed-field gel and conventional gel techniques, we measured the size distribution of DNA molecules from irradiated human fibroblasts in the total range of 0.1 kbp-10 Mbp for X-rays and high-LET particles (N ions, 97 keV/µm, and Fe ions, 150 keV/µm). On a megabase-pair scale we applied conventional pulsed-field gel electrophoresis techniques such as measurement of the fraction of DNA released from the well (FAR) and measurement of breakage within a specific NotI restriction fragment (hybridization assay). The induction rate for widely spaced breaks was found to decrease with LET. However, when the entire distribution of radiation-induced fragments was analysed, we detected an excess of fragments with sizes below about 200 kbp for the particles compared with X-irradiation. X-rays are thus more effective than high-LET radiations in producing large DNA fragments but less effective in the production of smaller fragments. We determined the total induction rate of dsbs for the three radiations based on a quantitative analysis of all the measured radiation-induced fragments and found that the high-LET particles were more efficient than X-rays at inducing dsbs, indicating an increasing total efficiency with LET. Conventional assays that are based only on the measurement of large fragments are therefore misleading when determining total dsb induction rates of high-LET particles. The possible biological significance of this non-randomness
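
    To make the contrast concrete, the following minimal Python sketch (not part of the study) simulates the same number of double-strand breaks placed either uniformly at random along a DNA stretch (X-ray-like) or grouped into clusters (high-LET-like) and compares the resulting fragment-size spectra; the genome length, break count, and cluster geometry are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
GENOME_BP = 100_000_000         # 100 Mbp stretch of DNA (illustrative)
N_BREAKS = 100                  # total double-strand breaks (illustrative)

def fragment_sizes(break_positions, genome_bp=GENOME_BP):
    """Fragment lengths produced by cutting a linear molecule at the given positions."""
    cuts = np.sort(break_positions)
    edges = np.concatenate(([0], cuts, [genome_bp]))
    return np.diff(edges)

# X-ray-like: breaks scattered uniformly along the molecule
random_breaks = rng.integers(0, GENOME_BP, size=N_BREAKS)

# High-LET-like: the same number of breaks, grouped in clusters along particle tracks
n_clusters = 10
centers = rng.integers(0, GENOME_BP, size=n_clusters)
clustered_breaks = np.concatenate([
    c + rng.integers(-50_000, 50_000, size=N_BREAKS // n_clusters) for c in centers
]).clip(0, GENOME_BP - 1)

for label, breaks in [("random (X-ray-like)", random_breaks),
                      ("clustered (high-LET-like)", clustered_breaks)]:
    sizes = fragment_sizes(breaks)
    small = int(np.sum(sizes < 200_000))      # excess of fragments below ~200 kbp
    large = int(np.sum(sizes > 1_000_000))    # fragments from widely spaced breaks
    print(f"{label}: {small} fragments < 200 kbp, {large} fragments > 1 Mbp")
```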

  3. Discovery of Non-random Spatial Distribution of Impacts in the Stardust Cometary Collector

    NASA Technical Reports Server (NTRS)

    Horz, Friedrich; Westphal, Andrew J.; Gainsforth, Zack; Borg, Janet; Djouadi, Zahia; Bridges, John; Franchi, Ian; Brownlee, Donald E.; Cheng, Andrew F.; Clark, Benton C.; Floss, Christine

    2007-01-01

    We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector.

  4. Non-random distribution of macromolecules as driving forces for phenotypic variation.

    PubMed

    Jahn, Michael; Günther, Susanne; Müller, Susann

    2015-06-01

    Clonal populations employ many strategies of diversification to deal with constraints. All these strategies result in the generation of different phenotypes with diverse functions. Events like cell division are major sources of phenotypic variability due to the unequal partitioning of cellular components. In this review we concentrate on passive and active mechanisms cells employ to distribute macromolecules between their offspring. Different types of segregation are described, addressing both metabolically pertinent molecules such as PHA/PHB or polyphosphates, and components that adversely affect cells by promoting aging, such as damaged protein complexes or extrachromosomal rDNA circles. We also refer to mechanisms generating plasmid copy number (PCN) variation between cells in a population, and how elaborate partitioning systems counteract partitioning errors and ensure equal distribution. Finally, we demonstrate how simple differences in chromosomal copy number determine the fate of a cell, in this case the effect of gene dosage on the onset of sporulation in Bacillus subtilis or on a functional trait in Sinorhizobium meliloti.

  5. Discovery of Non-random Spatial Distribution of Impacts in the Stardust Cometary Collector

    SciTech Connect

    Westphal, A J; Bastien, R K; Borg, J; Bridges, J; Brownlee, D E; Burchell, M J; Cheng, A F; Clark, B C; Djouadi, Z; Floss, C; Franchi, I; Gainsforth, Z; Graham, G; Green, S F; Heck, P R; Horanyi, M; Hoppe, P; Horz, F P; Huth, J; Kearsley, A; Leroux, H; Marhas, K; Nakamura-Messenger, K; Sandford, S A; See, T H; Stadermann, F J; Teslich, N E; Tsitrin, S; Warren, J L; Wozniakiewicz, P J; Zolensky, M E

    2007-04-06

    We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than approximately 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector.

  6. Non-Random Spatial Distribution of Impacts in the Stardust Cometary Collector

    NASA Technical Reports Server (NTRS)

    Westphal, Andrew J.; Bastien, Ronald K.; Borg, Janet; Bridges, John; Brownlee, Donald E.; Burchell, Mark J.; Cheng, Andrew F.; Clark, Benton C.; Djouadi, Zahia; Floss, Christine

    2007-01-01

    In January 2004, the Stardust spacecraft flew through the coma of comet 81P/Wild 2 at a relative speed of 6.1 km/sec. Cometary dust was collected in a 0.1 sq m collector consisting of aerogel tiles and aluminum foils. Two years later, the samples successfully returned to Earth and were recovered. We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than approximately 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector. Here we summarize the observations and review the evidence for and against three scenarios that we have considered for explaining the impact clustering found on the Stardust aerogel and foil collectors.

  7. Non-random distribution of individual genetic diversity along an environmental gradient

    PubMed Central

    Porlier, Mélody; Bélisle, Marc; Garant, Dany

    2009-01-01

    Improving our knowledge of the links between ecology and evolution is especially critical in the current context of rapid global environmental change. A critical step in that direction is to quantify how variation in ecological factors linked to habitat modifications might shape observed levels of genetic variability in wild populations. Still, little is known about the factors affecting levels and distribution of genetic diversity at the individual level, despite its vital underlying role in evolutionary processes. In this study, we assessed the effects of habitat quality on population structure and individual genetic diversity of tree swallows (Tachycineta bicolor) breeding along a gradient of agricultural intensification in southern Québec, Canada. Using a landscape genetics approach, we found that individual genetic diversity was greater in poorer-quality habitats. This counter-intuitive result was partly explained by the settlement patterns of tree swallows across the landscape. Individuals of higher genetic diversity arrived earlier on their breeding grounds and settled in the first available habitats, which correspond to areas of intensive agriculture. Our results highlight the importance of investigating the effects of environmental variability on individual genetic diversity, and of integrating information on landscape structure when conducting such studies. PMID:19414469

  8. Non-random cation distribution in hexagonal Al0.5Ga0.5PO4

    NASA Astrophysics Data System (ADS)

    Kulshreshtha, S. K.; Jayakumar, O. D.; Sudarsan, V.

    2010-05-01

    Based on powder X-ray diffraction and 31P Magic Angle Spinning Nuclear Magnetic Resonance (MAS NMR) investigations of the mixed phosphate Al0.5Ga0.5PO4, prepared by the co-precipitation method followed by annealing at 900 °C for 24 h, it is shown that the Al0.5Ga0.5PO4 phase crystallizes in hexagonal form with lattice parameters a=0.491(2) nm and c=1.106(4) nm. This hexagonal phase of Al0.5Ga0.5PO4 is similar to that of pure GaPO4. The 31P MAS NMR spectrum of the mixed phosphate sample consists of five peaks with a systematic variation of their chemical shift values, arising from P structural units having varying numbers of Al3+/Ga3+ cations as next-nearest neighbors in the solid solution. Based on the intensity analysis of the component NMR spectra of Al0.5Ga0.5PO4, it is inferred that the distribution of Al3+ and Ga3+ cations is non-random in the hexagonal Al0.5Ga0.5PO4 sample, although the XRD patterns showed well-defined solid solution formation.
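
    The inference about cation ordering typically rests on comparing the measured 31P peak intensities with the binomial intensities expected when Al and Ga occupy the four next-nearest-neighbor cation sites at random. The short Python sketch below shows that comparison; the "observed" peak areas are placeholders, not values from this paper.

```python
from math import comb

def random_mixing_intensities(x_al=0.5, n_neighbors=4):
    """Expected relative intensities of the five P(n Al, (4-n) Ga) peaks for random mixing."""
    return {n: comb(n_neighbors, n) * x_al**n * (1 - x_al)**(n_neighbors - n)
            for n in range(n_neighbors + 1)}

expected = random_mixing_intensities(x_al=0.5)
print("expected (random mixing):", {n: round(v, 3) for n, v in expected.items()})
# -> {0: 0.062, 1: 0.25, 2: 0.375, 3: 0.25, 4: 0.062}

# Hypothetical deconvoluted peak areas (placeholders, not the paper's data)
observed = {0: 0.10, 1: 0.20, 2: 0.30, 3: 0.25, 4: 0.15}
deviation = {n: round(observed[n] - expected[n], 3) for n in observed}
print("observed - expected:", deviation)  # sizeable deviations point to non-random cation ordering
```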

  9. Links between fear of humans, stress and survival support a non-random distribution of birds among urban and rural habitats.

    PubMed

    Rebolo-Ifrán, Natalia; Carrete, Martina; Sanz-Aguilar, Ana; Rodríguez-Martínez, Sol; Cabezas, Sonia; Marchant, Tracy A; Bortolotti, Gary R; Tella, José L

    2015-01-01

    Urban endocrine ecology aims to understand how organisms cope with new sources of stress and maintain allostatic load to thrive in an increasingly urbanized world. Recent research efforts have yielded controversial results based on short-term measures of stress, without exploring its fitness effects. We measured feather corticosterone (CORTf, reflecting the duration and amplitude of glucocorticoid secretion over several weeks) and subsequent annual survival in urban and rural burrowing owls. This species shows high individual consistency in fear of humans (i.e., flight initiation distance, FID), allowing us to hypothesize that individuals distribute among habitats according to their tolerance to human disturbance. FIDs were shorter in urban than in rural birds, but CORTf levels did not differ, nor were they correlated with FIDs. Survival was twice as high in urban as in rural birds, and links with CORTf varied between habitats: while a quadratic relationship supports stabilizing selection in urban birds, high predation rates may have masked the CORTf-survival relationship in rural ones. These results indicate that urban life does not constitute an additional source of stress for urban individuals, as shown by their near-identical CORTf values compared with rural conspecifics, supporting the non-random distribution of individuals among habitats according to their behavioural phenotypes.

  10. Links between fear of humans, stress and survival support a non-random distribution of birds among urban and rural habitats

    PubMed Central

    Rebolo-Ifrán, Natalia; Carrete, Martina; Sanz-Aguilar, Ana; Rodríguez-Martínez, Sol; Cabezas, Sonia; Marchant, Tracy A.; Bortolotti, Gary R.; Tella, José L.

    2015-01-01

    Urban endocrine ecology aims to understand how organisms cope with new sources of stress and maintain allostatic load to thrive in an increasingly urbanized world. Recent research efforts have yielded controversial results based on short-term measures of stress, without exploring its fitness effects. We measured feather corticosterone (CORTf, reflecting the duration and amplitude of glucocorticoid secretion over several weeks) and subsequent annual survival in urban and rural burrowing owls. This species shows high individual consistency in fear of humans (i.e., flight initiation distance, FID), allowing us to hypothesize that individuals distribute among habitats according to their tolerance to human disturbance. FIDs were shorter in urban than in rural birds, but CORTf levels did not differ, nor were they correlated with FIDs. Survival was twice as high in urban as in rural birds, and links with CORTf varied between habitats: while a quadratic relationship supports stabilizing selection in urban birds, high predation rates may have masked the CORTf-survival relationship in rural ones. These results indicate that urban life does not constitute an additional source of stress for urban individuals, as shown by their near-identical CORTf values compared with rural conspecifics, supporting the non-random distribution of individuals among habitats according to their behavioural phenotypes. PMID:26348294

  11. Mechanisms by which the Hawksley random zero sphygmomanometer underestimates blood pressure and produces a non-random distribution of RZ values.

    PubMed

    Brown, W C; Kennedy, S; Inglis, G C; Murray, L S; Lever, A F

    1997-02-01

    Four faults are reported in the Hawksley Random Zero Sphygmomanometer (RZS). Our study examines their mechanisms. (i) Compared with a mercury sphygmomanometer, the RZS underestimates blood pressure (BP). We confirm this: for 240 measurements by three experienced operators in 12 patients, systolic BP was 3.4 mm Hg lower with the RZS; diastolic pressure was not underestimated. A cause of under-estimation in 89% of measurements was that mercury stuck in the manometer, giving a falsely high reading of the random zero (RZ). Tilting the RZS before reading RZ reduced under-reading by 1.6 mm Hg. A rare cause is failure of the operator to completely close the reservoir tap. (ii) Values of RZ are not randomly distributed; non-randomness is most marked in measurements made by experienced operators whose speed of measurement provides insufficient time during cuff inflation for filling of the diaphragm chamber. Smaller contributions are made by the sticking of mercury in the manometer and by a leak of air through the air bleed screw. (iii) Consecutive RZ estimates often have similar values. This has two causes: short cuff inflation time and a short interval between opening the reservoir tap and spinning the thumb-wheel. (iv) An inverse relation of RZ and BP suggested by earlier work and by our own data is probably an artifact: when BP is low, measurement is quick and RZ is falsely high; when BP is high, measurement takes longer and RZ is lower. These four faults could be partly or wholly avoided by a change in the operators' technique.

  12. Non-random cation distribution in hexagonal Al0.5Ga0.5PO4

    SciTech Connect

    Kulshreshtha, S.K.; Jayakumar, O.D.; Sudarsan, V.

    2010-05-15

    Based on powder X-ray diffraction and 31P Magic Angle Spinning Nuclear Magnetic Resonance (MAS NMR) investigations of the mixed phosphate Al0.5Ga0.5PO4, prepared by the co-precipitation method followed by annealing at 900 °C for 24 h, it is shown that the Al0.5Ga0.5PO4 phase crystallizes in hexagonal form with lattice parameters a=0.491(2) nm and c=1.106(4) nm. This hexagonal phase of Al0.5Ga0.5PO4 is similar to that of pure GaPO4. The 31P MAS NMR spectrum of the mixed phosphate sample consists of five peaks with a systematic variation of their chemical shift values, arising from P structural units having varying numbers of Al3+/Ga3+ cations as next-nearest neighbors in the solid solution. Based on the intensity analysis of the component NMR spectra of Al0.5Ga0.5PO4, it is inferred that the distribution of Al3+ and Ga3+ cations is non-random in the hexagonal Al0.5Ga0.5PO4 sample, although the XRD patterns showed well-defined solid solution formation. - Graphical abstract: 31P MAS NMR pattern of the hexagonal Al0.5Ga0.5PO4 solid solution.

  13. Non-Random Distribution of 5S rDNA Sites and Its Association with 45S rDNA in Plant Chromosomes.

    PubMed

    Roa, Fernando; Guerra, Marcelo

    2015-01-01

    5S and 45S rDNA sites are the best mapped chromosome regions in eukaryotic chromosomes. In this work, a database was built gathering information about the position and number of 5S rDNA sites in 784 plant species, aiming to identify patterns of distribution along the chromosomes and its correlation with the position of 45S rDNA sites. Data revealed that in most karyotypes (54.5%, including polyploids) two 5S rDNA sites (a single pair) are present, with 58.7% of all sites occurring in the short arm, mainly in the proximal region. In karyotypes of angiosperms with only 1 pair of sites (single sites) they are mostly found in the proximal region (52.0%), whereas in karyotypes with multiple sites the location varies according to the average chromosome size. Karyotypes with multiple sites and small chromosomes (<3 µm) often display proximal sites, while medium-sized (between 3 and 6 µm) and large chromosomes (>6 µm) more commonly show terminal or interstitial sites. In species with holokinetic chromosomes, the modal value of sites per karyotype was also 2, but they were found mainly in a terminal position. Adjacent 5S and 45S rDNA sites were often found in the short arm, reflecting the preferential distribution of both sites in this arm. The high frequency of genera with at least 1 species with adjacent 5S and 45S sites reveals that this association appeared several times during angiosperm evolution, but it has been maintained only rarely as the dominant array in plant genera.

  14. Prevalence and non-random distribution of exonic mutations in Interferon Regulatory Factor 6 (IRF6) in 307 families with Van der Woude syndrome and 37 families with popliteal pterygium syndrome

    PubMed Central

    Ferreira de Lima, Renata L. L.; Hoper, Sarah A.; Ghassibe, Michella; Cooper, Margaret E.; Rorick, Nicholas K.; Kondo, Shinji; Katz, Lori; Marazita, Mary L.; Compton, John; Bale, Sherri; Hehr, Ute; Dixon, Michael J.; Daack-Hirsch, Sandra; Boute, Odile; Bayet, Bénédicte; Revencu, Nicole; Verellen-Dumoulin, Christine; Vikkula, Miikka; Richieri-Costa, Antônio; Moretti-Ferreira, Danilo; Murray, Jeffrey C.; Schutte, Brian C.

    2009-01-01

    Purpose Interferon Regulatory Factor 6 (IRF6) encodes a member of the IRF family of transcription factors. Mutations in IRF6 cause Van der Woude (VWS) and popliteal pterygium syndromes (PPS), two related orofacial clefting disorders. Here, we compared and contrasted the frequency and distribution of exonic mutations in IRF6 between two large geographically distinct collections of families with VWS and between one collection of families with PPS. Methods We performed direct sequence analysis of IRF6 exons on samples from three collections, two with VWS and one with PPS. Results We identified mutations in IRF6 exons in 68% of families in both VWS collections and in 97% of families with PPS. In sum, 106 novel disease-causing variants were found. The distribution of mutations in the IRF6 exons in each collection was not random; exons 3, 4, 7, and 9 accounted for 80%. In the VWS collections, the mutations were evenly divided between protein truncation and missense, whereas most mutations identified in the PPS collection were missense. Further, the missense mutations associated with PPS were localized significantly to exon 4, at residues that are predicted to bind directly to DNA. Conclusion The non-random distribution of mutations in the IRF6 exons suggests a two-tier approach for efficient mutation screens for IRF6. The type and distribution of mutations are consistent with the hypothesis that VWS is caused by haploinsufficiency of IRF6. On the other hand, the distribution of PPS-associated mutations suggests a different, though not mutually exclusive, effect on IRF6 function. PMID:19282774

  15. Non-random patterns in viral diversity

    PubMed Central

    Anthony, Simon J.; Islam, Ariful; Johnson, Christine; Navarrete-Macias, Isamara; Liang, Eliza; Jain, Komal; Hitchens, Peta L.; Che, Xiaoyu; Soloyvov, Alexander; Hicks, Allison L.; Ojeda-Flores, Rafael; Zambrana-Torrelio, Carlos; Ulrich, Werner; Rostal, Melinda K.; Petrosov, Alexandra; Garcia, Joel; Haider, Najmul; Wolfe, Nathan; Goldstein, Tracey; Morse, Stephen S.; Rahman, Mahmudur; Epstein, Jonathan H.; Mazet, Jonna K.; Daszak, Peter; Lipkin, W. Ian

    2015-01-01

    It is currently unclear whether changes in viral communities will ever be predictable. Here we investigate whether viral communities in wildlife are inherently structured (inferring predictability) by looking at whether communities are assembled through deterministic (often predictable) or stochastic (not predictable) processes. We sample macaque faeces across nine sites in Bangladesh and use consensus PCR and sequencing to discover 184 viruses from 14 viral families. We then use network modelling and statistical null-hypothesis testing to show the presence of non-random deterministic patterns at different scales, between sites and within individuals. We show that the effects of determinism are not absolute, however, as stochastic patterns are also observed. In showing that determinism is an important process in viral community assembly, we conclude that it should be possible to forecast changes to some portion of a viral community; however, there will always be some portion for which prediction will remain unlikely. PMID:26391192
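
    A minimal sketch of the kind of statistical null-hypothesis test described here: compare an observed co-occurrence statistic on a site-by-virus presence/absence matrix against its distribution under random shuffles that preserve each virus's prevalence. The matrix and the statistic below are illustrative assumptions, not the study's data or its exact network models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical site-by-virus presence/absence matrix (9 sites x 20 viruses)
obs = (rng.random((9, 20)) < 0.2).astype(int)
obs[:5, :6] = 1          # hypothetical "core" viruses shared by the first five sites

def mean_shared(matrix):
    """Average number of viruses shared between pairs of sites (a simple co-occurrence statistic)."""
    n_sites = matrix.shape[0]
    shared = [np.sum(matrix[i] & matrix[j])
              for i in range(n_sites) for j in range(i + 1, n_sites)]
    return np.mean(shared)

observed_stat = mean_shared(obs)

# Null model: shuffle which sites each virus occupies, preserving per-virus prevalence
null_stats = []
for _ in range(2000):
    shuffled = np.apply_along_axis(rng.permutation, 0, obs)
    null_stats.append(mean_shared(shuffled))

p_value = np.mean(np.array(null_stats) >= observed_stat)
print(f"observed = {observed_stat:.2f}, null mean = {np.mean(null_stats):.2f}, p = {p_value:.3f}")
```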

  16. Interval process model and non-random vibration analysis

    NASA Astrophysics Data System (ADS)

    Jiang, C.; Ni, B. Y.; Liu, N. Y.; Han, X.; Liu, J.

    2016-07-01

    This paper develops an interval process model for time-varying or dynamic uncertainty analysis when information of the uncertain parameter is inadequate. By using the interval process model to describe a time-varying uncertain parameter, only its upper and lower bounds are required at each time point rather than its precise probability distribution, which is quite different from the traditional stochastic process model. A correlation function is defined for quantification of correlation between the uncertain-but-bounded variables at different times, and a matrix-decomposition-based method is presented to transform the original dependent interval process into an independent one for convenience of subsequent uncertainty analysis. More importantly, based on the interval process model, a non-random vibration analysis method is proposed for response computation of structures subjected to time-varying uncertain external excitations or loads. The structural dynamic responses thus can be derived in the form of upper and lower bounds, providing an important guidance for practical safety analysis and reliability design of structures. Finally, two numerical examples and one engineering application are investigated to demonstrate the feasibility of the interval process model and corresponding non-random vibration analysis method.
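
    As a rough illustration of the idea (not the paper's matrix-decomposition method), the sketch below propagates an interval-bounded excitation through a single-degree-of-freedom oscillator by sampling excitations that stay inside the bounds and taking the envelope of the responses; the oscillator parameters and bounds are assumptions for demonstration.

```python
import numpy as np

# Minimal illustration: a single-degree-of-freedom oscillator
#   m*x'' + c*x' + k*x = f(t),  with f(t) only known to lie in [f_lower(t), f_upper(t)]
m, c, k = 1.0, 0.4, 20.0
dt, n_steps = 0.01, 1000
t = np.arange(n_steps) * dt

f_mid = np.sin(2.0 * t)                 # nominal excitation (illustrative)
half_width = 0.3                        # interval radius at each time point
f_lower, f_upper = f_mid - half_width, f_mid + half_width

def simulate(f):
    """Semi-implicit Euler time stepping for the oscillator response."""
    x, v = 0.0, 0.0
    xs = np.empty(n_steps)
    for i in range(n_steps):
        a = (f[i] - c * v - k * x) / m
        v += a * dt
        x += v * dt
        xs[i] = x
    return xs

# Crude response bounding by sampling excitations inside the interval process
rng = np.random.default_rng(2)
responses = []
for _ in range(200):
    alpha = rng.random(n_steps)                       # pointwise mix between the bounds
    responses.append(simulate(f_lower + alpha * (f_upper - f_lower)))
responses = np.array(responses)

x_lower, x_upper = responses.min(axis=0), responses.max(axis=0)
print("max upper-bound displacement:", x_upper.max())
print("min lower-bound displacement:", x_lower.min())
```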

  17. Plasticity-Driven Self-Organization under Topological Constraints Accounts for Non-random Features of Cortical Synaptic Wiring

    PubMed Central

    Miner, Daniel; Triesch, Jochen

    2016-01-01

    Understanding the structure and dynamics of cortical connectivity is vital to understanding cortical function. Experimental data strongly suggest that local recurrent connectivity in the cortex is significantly non-random, exhibiting, for example, above-chance bidirectionality and an overrepresentation of certain triangular motifs. Additional evidence suggests a significant distance dependency to connectivity over a local scale of a few hundred microns, and particular patterns of synaptic turnover dynamics, including a heavy-tailed distribution of synaptic efficacies, a power law distribution of synaptic lifetimes, and a tendency for stronger synapses to be more stable over time. Understanding how many of these non-random features simultaneously arise would provide valuable insights into the development and function of the cortex. While previous work has modeled some of the individual features of local cortical wiring, there is no model that begins to comprehensively account for all of them. We present a spiking network model of a rodent Layer 5 cortical slice which, via the interactions of a few simple biologically motivated intrinsic, synaptic, and structural plasticity mechanisms, qualitatively reproduces these non-random effects when combined with simple topological constraints. Our model suggests that mechanisms of self-organization arising from a small number of plasticity rules provide a parsimonious explanation for numerous experimentally observed non-random features of recurrent cortical wiring. Interestingly, similar mechanisms have been shown to endow recurrent networks with powerful learning abilities, suggesting that these mechanisms are central to understanding both structure and function of cortical synaptic wiring. PMID:26866369
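
    One of the simpler non-random features mentioned, above-chance bidirectional connectivity, can be checked directly on a directed adjacency matrix by comparing the number of reciprocally connected pairs against the Erdős-Rényi expectation for the same connection density. The sketch below does this on a synthetic matrix with deliberately inflated reciprocity; it is illustrative only and not the model from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200        # number of neurons (illustrative)
p = 0.1        # baseline connection probability (illustrative)

# Hypothetical directed connectivity with deliberately inflated reciprocity
A = (rng.random((n, n)) < p).astype(int)
np.fill_diagonal(A, 0)
A = np.maximum(A, (A.T * (rng.random((n, n)) < 0.3)).astype(int))  # copy some edges back

n_edges = A.sum()
p_hat = n_edges / (n * (n - 1))
reciprocal_pairs = np.sum(A * A.T) // 2                  # pairs connected in both directions
possible_pairs = n * (n - 1) // 2
expected_pairs = possible_pairs * p_hat**2               # Erdos-Renyi expectation

print(f"observed reciprocal pairs: {reciprocal_pairs}")
print(f"expected under random wiring: {expected_pairs:.1f}")
print(f"overrepresentation factor: {reciprocal_pairs / expected_pairs:.2f}")
```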

  18. 42 CFR 421.505 - Termination and extension of non-random prepayment complex medical review.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... error are no longer suspended for non-random prepayment complex medical review. (d) Periodic re... that appears to have resumed a high level of payment error on non-random prepayment complex medical... 42 Public Health 3 2011-10-01 2011-10-01 false Termination and extension of non-random...

  19. Non-random DNA fragmentation in next-generation sequencing

    NASA Astrophysics Data System (ADS)

    Poptsova, Maria S.; Il'Icheva, Irina A.; Nechipurenko, Dmitry Yu.; Panchenko, Larisa A.; Khodikov, Mingian V.; Oparina, Nina Y.; Polozov, Robert V.; Nechipurenko, Yury D.; Grokhovsky, Sergei L.

    2014-03-01

    Next Generation Sequencing (NGS) technology is based on cutting DNA into small fragments and their massively parallel sequencing. The multiple overlapping segments termed "reads" are assembled into a contiguous sequence. To reduce sequencing errors, every genome region should be sequenced several dozen times. This sequencing approach is based on the assumption that genomic DNA breaks are random and sequence-independent. However, we previously showed that for sonicated restriction DNA fragments the rates of double-stranded breaks depend on the nucleotide sequence. In this work we analyzed genomic reads from NGS data and discovered that fragmentation methods based on the action of hydrodynamic forces on DNA produce a similar bias. Consideration of this non-random DNA fragmentation may help unravel which factors influence the non-uniform coverage of various genomic regions, and to what extent.
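
    The sequence dependence of fragmentation can be explored by profiling nucleotide frequencies around inferred break points (read 5' ends) and comparing them with the background composition. The sketch below shows the bookkeeping on a toy sequence; in a real analysis the break positions would come from mapped read starts (e.g., a BAM file), so the sequence and positions used here are purely illustrative assumptions.

```python
from collections import Counter

def base_frequencies_around_breaks(genome, break_positions, flank=3):
    """Per-position nucleotide frequencies in a window around each break point.

    `genome` is a string, `break_positions` are 0-based indices of fragment 5' ends.
    Returns a dict: offset relative to the break -> {base: frequency}.
    """
    profile = {off: Counter() for off in range(-flank, flank + 1)}
    for pos in break_positions:
        for off in range(-flank, flank + 1):
            i = pos + off
            if 0 <= i < len(genome):
                profile[off][genome[i]] += 1
    return {off: {b: n / sum(cnt.values()) for b, n in cnt.items()}
            for off, cnt in profile.items() if cnt}

# Toy example (a real analysis would take read-start coordinates from alignments)
genome = "ACGTGGCCAATTGGCCGCGCATATGGCCGTTAACCGGTTAGGCCAATT" * 10
breaks = list(range(5, len(genome) - 5, 7))          # illustrative break positions
profile = base_frequencies_around_breaks(genome, breaks)
for off in sorted(profile):
    print(off, {b: round(f, 2) for b, f in sorted(profile[off].items())})
```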

  20. Non-random DNA fragmentation in next-generation sequencing

    PubMed Central

    Poptsova, Maria S.; Il'icheva, Irina A.; Nechipurenko, Dmitry Yu.; Panchenko, Larisa A.; Khodikov, Mingian V.; Oparina, Nina Y.; Polozov, Robert V.; Nechipurenko, Yury D.; Grokhovsky, Sergei L.

    2014-01-01

    Next Generation Sequencing (NGS) technology is based on cutting DNA into small fragments and their massively parallel sequencing. The multiple overlapping segments termed “reads” are assembled into a contiguous sequence. To reduce sequencing errors, every genome region should be sequenced several dozen times. This sequencing approach is based on the assumption that genomic DNA breaks are random and sequence-independent. However, we previously showed that for sonicated restriction DNA fragments the rates of double-stranded breaks depend on the nucleotide sequence. In this work we analyzed genomic reads from NGS data and discovered that fragmentation methods based on the action of hydrodynamic forces on DNA produce a similar bias. Consideration of this non-random DNA fragmentation may help unravel which factors influence the non-uniform coverage of various genomic regions, and to what extent. PMID:24681819

  1. The non-random walk of chiral magnetic charge carriers in artificial spin ice

    PubMed Central

    Zeissler, K.; Walton, S. K.; Ladak, S.; Read, D. E.; Tyliszczak, T.; Cohen, L. F.; Branford, W. R.

    2013-01-01

    The flow of magnetic charge carriers (dubbed magnetic monopoles) through frustrated spin ice lattices, governed simply by Coulombic forces, represents a new direction in electromagnetism. Artificial spin ice nanoarrays realise this effect at room temperature, where the magnetic charge is carried by domain walls. Control of domain wall path is one important element of utilizing this new medium. By imaging the transit of domain walls across different connected 2D honeycomb structures we contribute an important aspect which will enable that control to be realized. Although apparently equivalent paths are presented to a domain wall as it approaches a Y-shaped vertex from a bar parallel to the field, we observe a stark non-random path distribution, which we attribute to the chirality of the magnetic charges. These observations are supported by detailed statistical modelling and micromagnetic simulations. The identification of chiral control to magnetic charge path selectivity invites analogy with spintronics. PMID:23409243

  2. Reducing bias in survival under non-random temporary emigration

    USGS Publications Warehouse

    Peñaloza, Claudia L.; Kendall, William L.; Langtimm, Catherine Ann

    2014-01-01

    Despite intensive monitoring, temporary emigration from the sampling area can induce bias severe enough for managers to discard life-history parameter estimates toward the terminus of the time series (terminal bias). Under random temporary emigration, unbiased parameters can be estimated with CJS models. However, unmodeled Markovian temporary emigration causes bias in parameter estimates, and an unobservable state is required to model this type of emigration. The robust design is most flexible when modeling temporary emigration, and partial solutions to mitigate bias have been identified; nonetheless, there are conditions where terminal bias prevails. Long-lived species with high adult survival and highly variable non-random temporary emigration present terminal bias in survival estimates, despite being modeled with the robust design and suggested constraints. Because this bias is due to uncertainty about the fate of individuals that are undetected toward the end of the time series, solutions should involve using additional information on the survival status or location of these individuals at that time. Using simulation, we evaluated the performance of models that jointly analyze robust design data and an additional source of ancillary data (predictive covariate on temporary emigration, telemetry, dead recovery, or auxiliary resightings) in reducing terminal bias in survival estimates. The auxiliary resighting and predictive covariate models reduced terminal bias the most. Additional telemetry data were effective at reducing terminal bias only when individuals were tracked for a minimum of two years. High adult survival of long-lived species made the joint model with recovery data ineffective at reducing terminal bias because of small-sample bias. The naïve constraint model (last and penultimate temporary emigration parameters made equal) was the least efficient, though still able to reduce terminal bias when compared to an unconstrained model. Joint analysis of several

  3. 42 CFR 421.505 - Termination and extension of non-random prepayment complex medical review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... medical review; or (2) When calculation of the error rate indicates that the provider or supplier has... non-random prepayment complex medical review. If the reduction in the error rate is attributed to a 25... billing error are no longer suspended for non-random prepayment complex medical review. (d) Periodic...

  4. 42 CFR 421.505 - Termination and extension of non-random prepayment complex medical review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... complex medical review. 421.505 Section 421.505 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... § 421.505 Termination and extension of non-random prepayment complex medical review. (a) Timeframe that a provider or supplier must be on non-random prepayment complex medical review. There is no...

  5. A non-random walk through the genome

    PubMed Central

    Oliver, Brian; Misteli, Tom

    2005-01-01

    Recent publications on a wide range of eukaryotes indicate that genes showing particular expression patterns are not randomly distributed in the genome but are clustered into contiguous regions that we call neighborhoods. It seems probable that this organization is related to chromatin and the structure of the nucleus. PMID:15833129

  6. Synaptic signal streams generated by ex vivo neuronal networks contain non-random, complex patterns.

    PubMed

    Lee, Sangmook; Zemianek, Jill M; Shultz, Abraham; Vo, Anh; Maron, Ben Y; Therrien, Mikaela; Courtright, Christina; Guaraldi, Mary; Yanco, Holly A; Shea, Thomas B

    2014-11-01

    Cultured embryonic neurons develop functional networks that transmit synaptic signals over multiple sequentially connected neurons as revealed by multi-electrode arrays (MEAs) embedded within the culture dish. Signal streams of ex vivo networks contain spikes and bursts of varying amplitude and duration. Despite the random interactions inherent in dissociated cultures, neurons are capable of establishing functional ex vivo networks that transmit signals among synaptically connected neurons, undergo developmental maturation, and respond to exogenous stimulation by alterations in signal patterns. These characteristics indicate that a considerable degree of organization is an inherent property of neurons. We demonstrate herein that (1) certain signal types occur more frequently than others, (2) the predominant signal types change during and following maturation, (3) signal predominance is dependent upon inhibitory activity, and (4) certain signals preferentially follow others in a non-reciprocal manner. These findings indicate that the elaboration of complex signal streams comprised of a non-random distribution of signal patterns is an emergent property of ex vivo neuronal networks.

  7. Identification of non-random sequence properties in groups of signature peptides obtained in random sequence peptide microarray experiments.

    PubMed

    Kuznetsov, Igor B

    2016-05-01

    Immunosignaturing is an emerging experimental technique that uses random sequence peptide microarrays to detect antibodies produced by the immune system in response to a particular disease. Two important questions regarding immunosignaturing are "Do microarray peptides that exhibit a strong affinity to a given type of antibodies share common sequence properties?" and "If so, what are those properties?" In this work, three statistical tests designed to detect non-random patterns in the amino acid makeup of a group of microarray peptides are presented. One test detects patterns of significantly biased amino acid usage, whereas the other two detect patterns of significant bias in the biochemical properties. These tests do not require a large number of peptides per group. The tests were applied to analyze 19 groups of peptides identified in immunosignaturing experiments as being specific for antibodies produced in response to various types of cancer and other diseases. The positional distribution of the biochemical properties of the amino acids in these 19 peptide groups was also studied. Remarkably, despite the random nature of the sequence libraries used to design the microarrays, a unique group-specific non-random pattern was identified in the majority of the peptide groups studied. © 2016 Wiley Periodicals, Inc. Biopolymers (Pept Sci) 106: 318-329, 2016. PMID:27037995
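
    The flavor of such a test can be conveyed with a simple Monte Carlo comparison of a peptide group's amino acid usage against the residue frequencies of the random-sequence library: draw many random peptide groups of the same sizes from the library frequencies and ask how often they deviate as strongly as the observed group. The peptides and the uniform background below are hypothetical, and the statistic is a stand-in for the paper's purpose-built tests, which also cover biochemical-property bias.

```python
from collections import Counter
import numpy as np

rng = np.random.default_rng(4)
AMINO_ACIDS = np.array(list("ACDEFGHIKLMNPQRSTVWY"))

# Hypothetical group of signature peptides from an immunosignaturing experiment
peptides = ["GRRKWLE", "RRWKGYL", "KRWWELG", "RKWGLRE", "WRRKGLE"]
background = np.full(len(AMINO_ACIDS), 1 / len(AMINO_ACIDS))   # library residue frequencies

def usage_statistic(seqs):
    """Sum of squared deviations of amino acid usage from the background frequencies."""
    counts = Counter("".join(seqs))
    total = sum(counts.values())
    freqs = np.array([counts.get(aa, 0) / total for aa in AMINO_ACIDS])
    return np.sum((freqs - background) ** 2)

observed = usage_statistic(peptides)

# Null distribution: random peptides of the same lengths drawn from the background
null = []
for _ in range(5000):
    fake = ["".join(rng.choice(AMINO_ACIDS, size=len(p), p=background)) for p in peptides]
    null.append(usage_statistic(fake))

p_value = np.mean(np.array(null) >= observed)
print(f"observed statistic = {observed:.4f}, Monte Carlo p = {p_value:.4f}")
```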

  8. Non-random structures in universal compression and the Fermi paradox

    NASA Astrophysics Data System (ADS)

    Gurzadyan, A. V.; Allahverdyan, A. E.

    2016-02-01

    We study the hypothesis of information panspermia, recently proposed among possible solutions of the Fermi paradox ("where are the aliens?"). It suggests that the expense of alien signaling can be significantly reduced if the messages contain compressed information. To this end we consider universal compression and decoding mechanisms (e.g. the Lempel-Ziv-Welch algorithm) that can reveal non-random structures in compressed bit strings. The efficiency of the Kolmogorov stochasticity parameter for detection of non-randomness is illustrated, along with Zipf's law. The universality of these methods, i.e. their independence from data details, can play a key role in searching for intelligent messages.
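
    A minimal illustration of compression-based detection of non-random structure, using zlib's DEFLATE (an LZ77-family compressor; Python's standard library does not ship LZW): a structured message compresses far below its original size, while random bytes do not. The strings below are placeholders.

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size / original size; values well below 1 indicate non-random structure."""
    return len(zlib.compress(data, level=9)) / len(data)

structured = b"TO BE OR NOT TO BE " * 500        # highly regular "message"
random_like = os.urandom(len(structured))         # incompressible noise of the same length

print(f"structured : ratio = {compression_ratio(structured):.3f}")
print(f"random     : ratio = {compression_ratio(random_like):.3f}")
# The same idea applies to a received bit string: if it compresses well
# (or decodes into low-entropy output), it is unlikely to be pure noise.
```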

  9. Non-Random Integration of the HPV Genome in Cervical Cancer

    PubMed Central

    Schmitz, Martina; Driesch, Corina; Jansen, Lars; Runnebaum, Ingo B.; Dürst, Matthias

    2012-01-01

    HPV DNA integration into the host genome is a characteristic but not an exclusive step during cervical carcinogenesis. It is still a matter of debate whether viral integration contributes to the transformation process beyond ensuring the constitutive expression of the viral oncogenes. There is mounting evidence for a non-random distribution of integration loci and the direct involvement of cellular cancer-related genes. In this study we addressed this topic by extending the existing data set by an additional 47 HPV16- and HPV18-positive cervical carcinomas. We provide supportive evidence for previously defined integration hotspots and have revealed another cluster of integration sites within the cytogenetic band 3q28. Moreover, in the vicinity of these hotspots numerous microRNAs (miRNAs) are located and may be influenced by the integrated HPV DNA. By compiling our data and published reports, 9 genes could be identified that were affected by HPV integration at least twice in independent tumors. In some tumors the viral-cellular fusion transcripts were even identical with respect to the viral donor and cellular acceptor sites used. However, the exact integration sites are likely to differ since none of the integration sites analysed thus far have shown more than a few nucleotides of homology between viral and host sequences. Therefore, DNA recombination involving large stretches of homology at the integration site can be ruled out. It is however intriguing that by sequence alignment several regions of the HPV16 genome were found to have highly homologous stretches of up to 50 nucleotides to the aforementioned genes and the integration hotspots. One common region of homology with cellular sequences is between the viral genes E5 and L2 (nucleotide positions 4100 to 4240). We speculate that this and other regions of homology are involved in the integration process. Our observations suggest that targeted disruption, possibly also of critical cellular genes, by HPV

  10. Bayesian hierarchical modeling for a non-randomized, longitudinal fall prevention trial with spatially correlated observations

    PubMed Central

    Murphy, T. E.; Allore, H. G.; Leo-Summers, L.; Carlin, B. P.

    2012-01-01

    Because randomization of participants is often not feasible in community-based health interventions, non-randomized designs are commonly employed. Non-randomized designs may have experimental units that are spatial in nature, such as zip codes that are characterized by aggregate statistics from sources like the U.S. census and the Centers for Medicare and Medicaid Services. A perennial concern with non-randomized designs is that even after careful balancing of influential covariates, bias may arise from unmeasured factors. In addition to facilitating the analysis of interventional designs based on spatial units, Bayesian hierarchical modeling can quantify unmeasured variability with spatially correlated residual terms. Graphical analysis of these spatial residuals demonstrates whether variability from unmeasured covariates is likely to bias the estimates of interventional effect. The Connecticut Collaboration for Fall Prevention is the first large-scale longitudinal trial of a community-wide healthcare intervention designed to prevent injurious falls in older adults. Over a two-year evaluation phase, this trial demonstrated a rate of fall-related utilization at hospitals and emergency departments by persons 70 years and older in the intervention area that was 11 per cent less than that of the usual care area, and a 9 per cent lower rate of utilization from serious injuries. We describe the Bayesian hierarchical analysis of this non-randomized intervention with emphasis on its spatial and longitudinal characteristics. We also compare several models, using posterior predictive simulations and maps of spatial residuals. PMID:21294148
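
    One simple, non-Bayesian diagnostic in the same spirit as the paper's graphical residual analysis is Moran's I computed on area-level residuals with an adjacency weight matrix: values well above zero indicate spatially clustered residual variation that could bias the intervention estimate. The residuals and adjacency matrix below are hypothetical, and the paper itself models spatial correlation directly with hierarchical residual terms rather than this statistic.

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I spatial autocorrelation of `values` given a spatial weight matrix `weights`."""
    values = np.asarray(values, dtype=float)
    z = values - values.mean()
    return (len(values) / weights.sum()) * (z @ weights @ z) / (z @ z)

# Hypothetical residuals from an area-level model over 6 spatial units (e.g. zip codes)
residuals = np.array([0.8, 0.6, 0.5, -0.7, -0.9, -0.4])

# Hypothetical symmetric adjacency matrix (1 = units share a border)
W = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
])

print(f"Moran's I = {morans_i(residuals, W):.3f}")   # values near +1 suggest clustered residuals
```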

  11. The quality of control groups in non-randomized studies published in Journal of Hand Surgery

    PubMed Central

    Johnson, Shepard P.; Malay, Sunitha; Chung, Kevin C.

    2016-01-01

    Purpose To evaluate control group selection in non-randomized studies published in the Journal of Hand Surgery (American Volume) (JHS). Methods We reviewed all papers published in JHS in 2013 to identify studies that used non-randomized control groups. Data collected included type of study design and control group characteristics. We then appraised studies to determine if authors discussed confounding and selection bias and how they controlled for confounding. Results Thirty-seven non-randomized studies were published in JHS in 2013. The source of controls was either the same institution as the study group, a different institution, a database, or not provided in the manuscript. Twenty-nine (78%) studies statistically compared key characteristics between the control and study groups. Confounding was controlled with matching, exclusion criteria, or regression analysis. Twenty-two (59%) papers explicitly discussed the threat of confounding and 18 (49%) identified sources of selection bias. Conclusions In our review of non-randomized studies published in JHS, papers had well-defined controls that were similar to the study group, allowing for reasonable comparisons. However, we identified substantial confounding and bias that were not addressed as explicit limitations, which might lead the reader to overestimate the scientific validity of the data. Clinical relevance Incorporating a brief discussion of control group selection in scientific manuscripts should help readers interpret the study more appropriately. Authors, reviewers, and editors should strive to address this component of clinical importance. PMID:25447000

  12. Non-random distribution of homo-repeats: links with biological functions and human diseases

    PubMed Central

    Lobanov, Michail Yu.; Klus, Petr; Sokolovsky, Igor V.; Tartaglia, Gian Gaetano; Galzitskaya, Oxana V.

    2016-01-01

    The biological function of multiple repetitions of single amino acids, or homo-repeats, is largely unknown, but their occurrence in proteins has been associated with more than 20 hereditary diseases. Analysing 122 bacterial and eukaryotic genomes, we observed that the number of proteins containing homo-repeats is significantly larger than expected from theoretical estimates. Analysis of statistical significance indicates that the minimal size of homo-repeats varies with amino acid type and proteome. In an attempt to characterize proteins harbouring long homo-repeats, we found that those containing polar or small amino acids S, P, H, E, D, K, Q and N are enriched in structural disorder as well as protein- and RNA-interactions. We observed that E, S, Q, G, L, P, D, A and H homo-repeats are strongly linked with occurrence in human diseases. Moreover, S, E, P, A, Q, D and T homo-repeats are significantly enriched in neuronal proteins associated with autism and other disorders. We release a webserver for further exploration of homo-repeats occurrence in human pathology at http://bioinfo.protres.ru/hradis/. PMID:27256590
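
    Scanning proteomes for homo-repeats reduces to finding runs of a single residue above a length threshold, e.g. with a back-referencing regular expression as in the sketch below; the sequences and the fixed threshold are illustrative, whereas the study lets the significant minimal length vary by amino acid and proteome.

```python
import re

def find_homorepeats(sequence, min_len=5):
    """Return (amino_acid, start, length) for every run of one residue at least min_len long."""
    pattern = re.compile(r"([ACDEFGHIKLMNPQRSTVWY])\1{" + str(min_len - 1) + r",}")
    return [(m.group(1), m.start(), len(m.group(0))) for m in pattern.finditer(sequence)]

# Toy sequences (hypothetical); the study scanned whole proteomes
proteins = {
    "prot1": "MSTQQQQQQQQALKEDDDDDDPRV",
    "prot2": "MKLVVAGGGSSSSSPLIN",
}

for name, seq in proteins.items():
    for aa, start, length in find_homorepeats(seq, min_len=5):
        print(f"{name}: poly-{aa} x{length} at position {start}")
```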

  13. Non-random subcellular distribution of variant EKLF in erythroid cells

    PubMed Central

    Quadrini, Karen J.; Gruzglin, Eugenia; Bieker, James J.

    2008-01-01

    EKLF protein plays a prominent role during erythroid development as a nuclear transcription factor. Not surprisingly, exogenous EKLF quickly localizes to the nucleus. However, using two different assays we have unexpectedly found that a substantial proportion of endogenous EKLF resides in the cytoplasm at steady state in all erythroid cells examined. While EKLF localization does not appear to change during either erythroid development or terminal differentiation, we find that the protein displays subtle yet distinct biochemical and functional differences depending on which subcellular compartment it is isolated from, with PEST sequences possibly playing a role in these differences. Localization is unaffected by inhibition of CRM1 activity and the two populations are not differentiated by stability. Heterokaryon assays demonstrate that EKLF is able to shuttle out of the nucleus although its nuclear re-entry is rapid. These studies suggest there is an unexplored role for EKLF in the cytoplasm that is separate from its well-characterized nuclear function. PMID:18329016

  14. Non-random subcellular distribution of variant EKLF in erythroid cells

    SciTech Connect

    Quadrini, Karen J.; Gruzglin, Eugenia; Bieker, James J.

    2008-04-15

    EKLF protein plays a prominent role during erythroid development as a nuclear transcription factor. Not surprisingly, exogenous EKLF quickly localizes to the nucleus. However, using two different assays we have unexpectedly found that a substantial proportion of endogenous EKLF resides in the cytoplasm at steady state in all erythroid cells examined. While EKLF localization does not appear to change during either erythroid development or terminal differentiation, we find that the protein displays subtle yet distinct biochemical and functional differences depending on which subcellular compartment it is isolated from, with PEST sequences possibly playing a role in these differences. Localization is unaffected by inhibition of CRM1 activity and the two populations are not differentiated by stability. Heterokaryon assays demonstrate that EKLF is able to shuttle out of the nucleus although its nuclear re-entry is rapid. These studies suggest there is an unexplored role for EKLF in the cytoplasm that is separate from its well-characterized nuclear function.

  15. Checklists of Methodological Issues for Review Authors to Consider When Including Non-Randomized Studies in Systematic Reviews

    ERIC Educational Resources Information Center

    Wells, George A.; Shea, Beverley; Higgins, Julian P. T.; Sterne, Jonathan; Tugwell, Peter; Reeves, Barnaby C.

    2013-01-01

    Background: There is increasing interest from review authors about including non-randomized studies (NRS) in their systematic reviews of health care interventions. This series from the Ottawa Non-Randomized Studies Workshop consists of six papers identifying methodological issues when doing this. Aim: To format the guidance from the preceding…

  16. Revisiting the impacts of non-random extinction on the tree-of-life.

    PubMed

    Davies, T Jonathan; Yessoufou, Kowiyou

    2013-08-23

    The tree-of-life represents the diversity of living organisms. Species extinction and the concomitant loss of branches from the tree-of-life is therefore a major conservation concern. There is increasing evidence indicating that extinction is phylogenetically non-random, such that if one species is vulnerable to extinction so too are its close relatives. However, the impact of non-random extinctions on the tree-of-life has been a matter of recent debate. Here, we combine simulations with empirical data on extinction risk in mammals. We demonstrate that phylogenetically clustered extinction leads to a disproportionate loss of branches from the tree-of-life, but that the loss of their summed lengths is indistinguishable from random extinction. We argue that under a speciational model of evolution, the number of branches lost might be of equal or greater consequences than the loss of summed branch lengths. We therefore suggest that the impact of non-random extinction on the tree-of-life may have been underestimated.

  17. Non-random biodiversity loss underlies predictable increases in viral disease prevalence.

    PubMed

    Lacroix, Christelle; Jolles, Anna; Seabloom, Eric W; Power, Alison G; Mitchell, Charles E; Borer, Elizabeth T

    2014-03-01

    Disease dilution (reduced disease prevalence with increasing biodiversity) has been described for many different pathogens. Although the mechanisms causing this phenomenon remain unclear, the disassembly of communities to predictable subsets of species, which can be caused by changing climate, land use or invasive species, underlies one important hypothesis. In this case, infection prevalence could reflect the competence of the remaining hosts. To test this hypothesis, we measured local host species abundance and prevalence of four generalist aphid-vectored pathogens (barley and cereal yellow dwarf viruses) in a ubiquitous annual grass host at 10 sites spanning 2000 km along the North American West Coast. In laboratory and field trials, we measured viral infection as well as aphid fecundity and feeding preference on several host species. Virus prevalence increased as local host richness declined. Community disassembly was non-random: ubiquitous hosts dominating species-poor assemblages were among the most competent for vector production and virus transmission. This suggests that non-random biodiversity loss led to increased virus prevalence. Because diversity loss is occurring globally in response to anthropogenic changes, such work can inform medical, agricultural and veterinary disease research by providing insights into the dynamics of pathogens nested within a complex web of environmental forces.

  18. Non-random mate choice in humans: insights from a genome scan.

    PubMed

    Laurent, R; Toupance, B; Chaix, R

    2012-02-01

    Little is known about the genetic factors influencing mate choice in humans. Still, there is evidence for non-random mate choice with respect to physical traits. In addition, some studies suggest that the Major Histocompatibility Complex may affect pair formation. Nowadays, the availability of high density genomic data sets gives the opportunity to scan the genome for signatures of non-random mate choice without prior assumptions on which genes may be involved, while taking into account socio-demographic factors. Here, we performed a genome scan to detect extreme patterns of similarity or dissimilarity among spouses throughout the genome in three populations of African, European American, and Mexican origins from the HapMap 3 database. Our analyses identified genes and biological functions that may affect pair formation in humans, including genes involved in skin appearance, morphogenesis, immunity and behaviour. We found little overlap between the three populations, suggesting that the biological functions potentially influencing mate choice are population specific, in other words are culturally driven. Moreover, whenever the same functional category of genes showed a significant signal in two populations, different genes were actually involved, which suggests the possibility of evolutionary convergences.

  19. Impact of non-random vibrations in Mössbauer rotor experiments testing time dilation

    NASA Astrophysics Data System (ADS)

    Friedman, Y.; Nowik, I.; Felner, I.; Steiner, J. M.; Yudkin, E.; Livshitz, S.; Wille, H.-C.; Wortmann, G.; Arogeti, S.; Levy, R.; Chumakov, A. I.; Rüffer, R.

    2016-06-01

    All experiments testing time dilation by measuring the spectral shift of a rotating Mössbauer absorber assume that vibrations do not affect the spectral shift because of their purely random nature and claim that the observed shift is due to time dilation only. Our recent experiment using the Synchrotron Mössbauer Source at ESRF revealed, however, a shift due to the non-random periodic vibration patterns caused by the rotor/bearing system. These patterns fit the predictions of the Jeffcott model for such a system with non-zero eccentricity. We have calculated this shift due to the non-random vibrations and the resulting relative shift between two states when the acceleration of the absorber is anti-parallel and parallel to the source. This relative shift exhibits the same behavior as the observed relative shift. Hence, the effect of the spectral shift due to vibrations cannot be ignored in any Mössbauer rotor experiments for testing time dilation. Recommendations for improvement of future rotor experiments testing time dilation are presented.

  20. Non-random biodiversity loss underlies predictable increases in viral disease prevalence.

    PubMed

    Lacroix, Christelle; Jolles, Anna; Seabloom, Eric W; Power, Alison G; Mitchell, Charles E; Borer, Elizabeth T

    2014-03-01

    Disease dilution (reduced disease prevalence with increasing biodiversity) has been described for many different pathogens. Although the mechanisms causing this phenomenon remain unclear, the disassembly of communities to predictable subsets of species, which can be caused by changing climate, land use or invasive species, underlies one important hypothesis. In this case, infection prevalence could reflect the competence of the remaining hosts. To test this hypothesis, we measured local host species abundance and prevalence of four generalist aphid-vectored pathogens (barley and cereal yellow dwarf viruses) in a ubiquitous annual grass host at 10 sites spanning 2000 km along the North American West Coast. In laboratory and field trials, we measured viral infection as well as aphid fecundity and feeding preference on several host species. Virus prevalence increased as local host richness declined. Community disassembly was non-random: ubiquitous hosts dominating species-poor assemblages were among the most competent for vector production and virus transmission. This suggests that non-random biodiversity loss led to increased virus prevalence. Because diversity loss is occurring globally in response to anthropogenic changes, such work can inform medical, agricultural and veterinary disease research by providing insights into the dynamics of pathogens nested within a complex web of environmental forces. PMID:24352672

  1. Non-random mating in classical lekking grouse species: seasonal and diurnal trends

    NASA Astrophysics Data System (ADS)

    Tsuji, L. J. S.; DeIuliis, G.; Hansell, R. I. C.; Kozlovic, D. R.; Sokolowski, M. B.

    This paper is the first to integrate both field and theoretical approaches to demonstrate that fertility benefits can be a direct benefit to females mating on the classical lek. Field data collected for male sharp-tailed grouse (Tympanuchus phasianellus), a classical lekking species, revealed potential fertility benefits for selective females. Adult males and individuals occupying centrally located territories on the lek were found to have significantly larger testes than juveniles and peripheral individuals. Further, using empirical data from previously published studies of classical lekking grouse species, time-series analysis was employed to illustrate that female mating patterns, seasonal and daily, were non-random. We are the first to show that these patterns coincide with times when male fertility is at its peak.

  2. Non-random aneuploidy specifies subgroups of pilocytic astrocytoma and correlates with older age

    PubMed Central

    Khuong-Quang, Dong-Anh; Bechet, Denise; Gayden, Tenzin; Kool, Marcel; De Jay, Nicolas; Jacob, Karine; Gerges, Noha; Hutter, Barbara; Şeker-Cin, Huriye; Witt, Hendrik; Montpetit, Alexandre; Brunet, Sébastien; Lepage, Pierre; Bourret, Geneviève; Klekner, Almos; Bognár, László; Hauser, Peter; Garami, Miklós; Farmer, Jean-Pierre; Montes, Jose-Luis; Atkinson, Jeffrey; Lambert, Sally; Kwan, Tony; Korshunov, Andrey; Tabori, Uri; Collins, V. Peter; Albrecht, Steffen; Faury, Damien; Pfister, Stefan M.; Paulus, Werner; Hasselblatt, Martin; Jones, David T.W.; Jabado, Nada

    2015-01-01

    Pilocytic astrocytoma (PA) is the most common brain tumor in children but is rare in adults, and hence poorly studied in this age group. We investigated 222 PA and report increased aneuploidy in older patients. Aneuploid genomes were identified in 45% of adult compared with 17% of pediatric PA. Gains were non-random, favoring chromosomes 5, 7, 6 and 11 in order of frequency, and preferentially affecting non-cerebellar PA and tumors with BRAF V600E mutations and not with KIAA1549-BRAF fusions or FGFR1 mutations. Aneuploid PA differentially expressed genes involved in CNS development, the unfolded protein response, and regulators of genomic stability and the cell cycle (MDM2, PLK2), whose correlated programs were overexpressed specifically in aneuploid PA compared to other glial tumors. Thus, convergence of pathways affecting the cell cycle and genomic stability may favor aneuploidy in PA, possibly representing an additional molecular driver in older patients with this brain tumor. PMID:26378811

  3. Non-random decay of chordate characters causes bias in fossil interpretation.

    PubMed

    Sansom, Robert S; Gabbott, Sarah E; Purnell, Mark A

    2010-02-11

    Exceptional preservation of soft-bodied Cambrian chordates provides our only direct information on the origin of vertebrates. Fossil chordates from this interval offer crucial insights into how the distinctive body plan of vertebrates evolved, but reading this pre-biomineralization fossil record is fraught with difficulties, leading to controversial and contradictory interpretations. The cause of these difficulties is taphonomic: we lack data on when and how important characters change as they decompose, resulting in a lack of constraint on anatomical interpretation and a failure to distinguish phylogenetic absence of characters from loss through decay. Here we show, from experimental decay of amphioxus and ammocoetes, that loss of chordate characters during decay is non-random: the more phylogenetically informative are the most labile, whereas plesiomorphic characters are decay resistant. The taphonomic loss of synapomorphies and relatively higher preservation potential of chordate plesiomorphies will thus result in bias towards wrongly placing fossils on the chordate stem. Application of these data to Cathaymyrus (Cambrian period of China) and Metaspriggina (Cambrian period of Canada) highlights the difficulties: these fossils cannot be placed reliably in the chordate or vertebrate stem because they could represent the decayed remains of any non-biomineralized, total-group chordate. Preliminary data suggest that this decay filter also affects other groups of organisms and that 'stem-ward slippage' may be a widespread but currently unrecognized bias in our understanding of the early evolution of a number of phyla.

  4. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    PubMed

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types in examining the stable nature of the polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals several notable results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, evolution is faster when the variance in fitness is maximized, a mixed Evolutionarily Stable Strategy (ESS) exists in asymmetric games, and evolution tends towards not only a 1:1 sex ratio but also a 1:1 ratio of the two alleles at the locus in question. Through construction of replicator dynamics in the group selection framework, our selection model redefines the basis of game theory to incorporate non-random mating, where a mating parameter associated with population structure depends on the social structure. The model also shows that the number of polymorphic equilibria depends on the algebraic expression of the population structure. PMID:26524140

  5. Non-random assembly of bacterioplankton communities in the subtropical north pacific ocean.

    PubMed

    Eiler, Alexander; Hayakawa, Darin H; Rappé, Michael S

    2011-01-01

    The exploration of bacterial diversity in the global ocean has revealed new taxa and previously unrecognized metabolic potential; however, our understanding of what regulates this diversity is limited. Using terminal restriction fragment length polymorphism (T-RFLP) data from bacterial small-subunit ribosomal RNA genes we show that, independent of depth and time, a large fraction of bacterioplankton co-occurrence patterns are non-random in the oligotrophic North Pacific subtropical gyre (NPSG). Pair-wise correlations of all identified operational taxonomic units (OTUs) revealed a high degree of significance, with 6.6% of the pair-wise co-occurrences being negatively correlated and 20.7% of them being positive. The most abundant OTUs, putatively identified as Prochlorococcus, SAR11, and SAR116 bacteria, were among the most correlated OTUs. As expected, bacterial community composition lacked statistically significant patterns of seasonality in the mostly stratified water column except in a few depth horizons of the sunlit surface waters, with higher frequency variations in community structure apparently related to populations associated with the deep chlorophyll maximum. Communities were structured vertically into epipelagic, mesopelagic, and bathypelagic populations. Permutation-based statistical analyses of T-RFLP data and their corresponding metadata revealed a broad range of putative environmental drivers controlling bacterioplankton community composition in the NPSG, including concentrations of inorganic nutrients and phytoplankton pigments. Together, our results suggest that deterministic forces such as environmental filtering and interactions among taxa determine bacterioplankton community patterns, and consequently affect ecosystem functions in the NPSG.
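
    A pairwise co-occurrence screen of this kind can be sketched in a few lines. The Python toy example below is not the authors' analysis (which used T-RFLP data and permutation-based statistics): it simulates an OTU abundance table, then tabulates the fraction of significantly positive and negative pairwise Spearman correlations. A real analysis would also correct for multiple testing.

```python
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_samples, n_otus = 60, 8

# toy relative-abundance table (samples x OTUs); OTU 1 is made to covary with OTU 0
otu = rng.gamma(2.0, size=(n_samples, n_otus))
otu[:, 1] = 0.7 * otu[:, 0] + 0.3 * rng.gamma(2.0, size=n_samples)

pos, neg, n_pairs = 0, 0, 0
for i, j in combinations(range(n_otus), 2):
    rho, p = spearmanr(otu[:, i], otu[:, j])
    n_pairs += 1
    if p < 0.05:
        pos += rho > 0
        neg += rho < 0
print(f"{100 * pos / n_pairs:.1f}% positive and {100 * neg / n_pairs:.1f}% negative "
      "significant pairwise co-occurrences")
```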

  6. The MHC and non-random mating in a captive population of Chinook salmon.

    PubMed

    Neff, B D; Garner, S R; Heath, J W; Heath, D D

    2008-08-01

    Detailed analysis of variation in reproductive success can provide an understanding of the selective pressures that drive the evolution of adaptations. Here, we use experimental spawning channels to assess phenotypic and genotypic correlates of reproductive success in Chinook salmon (Oncorhynchus tshawytscha). Groups of 36 fish in three different sex ratios (1:2, 1:1 and 2:1) were allowed to spawn and the offspring were collected after emergence from the gravel. Microsatellite genetic markers were used to assign parentage of each offspring, and the parents were also typed at the major histocompatibility class IIB locus (MHC). We found that large males, as well as males with brighter coloration and a more green/blue hue on their lateral integument, sired more offspring, although only body size and brightness had independent effects. There was no similar relationship between these variables and female reproductive success. Furthermore, there was no effect of sex ratio on the strength or significance of any of the correlations. Females mated non-randomly at the MHC, appearing to select mates that produced offspring with greater genetic diversity as measured by amino-acid divergence. Females mated randomly with respect to male genetic relatedness and males mated randomly with respect to both MHC and genetic relatedness. These results indicate that sexual selection favours increased body size and perhaps integument coloration in males, and that female mate choice increases genetic diversity at the MHC.

  7. Non-Random Assembly of Bacterioplankton Communities in the Subtropical North Pacific Ocean

    PubMed Central

    Eiler, Alexander; Hayakawa, Darin H.; Rappé, Michael S.

    2011-01-01

    The exploration of bacterial diversity in the global ocean has revealed new taxa and previously unrecognized metabolic potential; however, our understanding of what regulates this diversity is limited. Using terminal restriction fragment length polymorphism (T-RFLP) data from bacterial small-subunit ribosomal RNA genes we show that, independent of depth and time, a large fraction of bacterioplankton co-occurrence patterns are non-random in the oligotrophic North Pacific subtropical gyre (NPSG). Pair-wise correlations of all identified operational taxonomic units (OTUs) revealed a high degree of significance, with 6.6% of the pair-wise co-occurrences being negatively correlated and 20.7% of them being positive. The most abundant OTUs, putatively identified as Prochlorococcus, SAR11, and SAR116 bacteria, were among the most correlated OTUs. As expected, bacterial community composition lacked statistically significant patterns of seasonality in the mostly stratified water column except in a few depth horizons of the sunlit surface waters, with higher frequency variations in community structure apparently related to populations associated with the deep chlorophyll maximum. Communities were structured vertically into epipelagic, mesopelagic, and bathypelagic populations. Permutation-based statistical analyses of T-RFLP data and their corresponding metadata revealed a broad range of putative environmental drivers controlling bacterioplankton community composition in the NPSG, including concentrations of inorganic nutrients and phytoplankton pigments. Together, our results suggest that deterministic forces such as environmental filtering and interactions among taxa determine bacterioplankton community patterns, and consequently affect ecosystem functions in the NPSG. PMID:21747815

  8. Performance of statistical methods for analysing survival data in the presence of non-random compliance.

    PubMed

    Odondi, Lang'o; McNamee, Roseanne

    2010-12-20

    Noncompliance often complicates estimation of treatment efficacy from randomized trials. Under random noncompliance, per protocol analyses or even simple regression adjustments for noncompliance could be adequate for causal inference, but special methods are needed when noncompliance is related to risk. For survival data, Robins and Tsiatis introduced the semi-parametric structural Causal Accelerated Life Model (CALM) which allows time-dependent departures from randomized treatment in either arm and relates each observed event time to a potential event time that would have been observed if the control treatment had been given throughout the trial. Alternatively, Loeys and Goetghebeur developed a structural Proportional Hazards (C-Prophet) model for when there is all-or-nothing noncompliance in the treatment arm only. White et al. proposed a 'complier average causal effect' method for Proportional Hazards estimation which allows time-dependent departures from randomized treatment in the active arm. A time-invariant version of this estimator (CHARM) consists of a simple adjustment to the Intention-to-Treat hazard ratio estimate. We used simulation studies mimicking a randomized controlled trial of active treatment versus control with censored time-to-event data, under both random and non-random time-dependent noncompliance, to evaluate performance of these methods in terms of 95 per cent confidence interval coverage, bias and root mean square errors (RMSE). All methods performed well in terms of bias, even the C-Prophet used after treating time-varying compliance as all-or-nothing. Coverage of the latter method, as implemented in Stata, was too low. The CALM method performed best in terms of bias and coverage but had the largest RMSE. PMID:20963732

  9. Non-random correlation structures and dimensionality reduction in multivariate climate data

    NASA Astrophysics Data System (ADS)

    Vejmelka, Martin; Pokorná, Lucie; Hlinka, Jaroslav; Hartman, David; Jajcay, Nikola; Paluš, Milan

    2015-05-01

    It is well established that the global climate is a complex phenomenon with dynamics driven by the interaction of a multitude of identifiable but intertwined subsystems. The identification, at some level, of these subsystems is an important step towards understanding climate dynamics. We present a method to determine the number of principal components representing non-random correlation structures in climate data, or components that cannot be generated by a surrogate model of independent stochastic processes replicating the auto-correlation structure of each time series. The purpose of the method is to automatically reduce the dimensionality of large climate datasets into spatially localised components suitable for further interpretation or, for example, for use as nodes in a complex network analysis of large-scale climate dynamics. We apply the method to two 2.5° resolution NCEP/NCAR reanalysis global datasets of monthly means: the sea level pressure (SLP) and the surface air temperature (SAT), and extract 60 components explaining 87 % variance and 68 components explaining 72 % variance, respectively. The obtained components are in agreement with previous results in that they recover many well-known climate modes previously identified using other approaches including regionally constrained principal component analysis. Selected SLP components are discussed in more detail with respect to their correlation with important climate indices and their relationship to other SLP and SAT components. Finally, we consider a subset of the obtained components that have not yet been explicitly identified by other authors but seem plausible in the context of regional climate observations discussed in literature.
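
    The idea of retaining only those components whose eigenvalues exceed what autocorrelation-preserving surrogates can produce is sketched below. This Python example is a minimal illustration under assumed details (phase-randomized surrogates as the null model; the paper's surrogate construction may differ): it counts PCA eigenvalues of a toy space-time field that exceed the 95th percentile of eigenvalues obtained from fields of independent surrogate time series.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate series with (approximately) the same power spectrum,
    and hence autocorrelation, as x but with randomized phases."""
    n = len(x)
    spectrum = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, size=spectrum.shape)
    phases[0] = 0.0  # keep the mean component real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n) + x.mean()

def n_significant_components(data, n_surrogates=100, quantile=0.95, seed=0):
    """data: (time, gridpoints) array. Count PCA eigenvalues that exceed the
    chosen quantile of eigenvalues from independent surrogate fields."""
    rng = np.random.default_rng(seed)
    data = data - data.mean(axis=0)
    eigvals = np.linalg.svd(data, compute_uv=False) ** 2
    null_eigs = []
    for _ in range(n_surrogates):
        surr = np.column_stack(
            [phase_randomized_surrogate(data[:, j], rng) for j in range(data.shape[1])]
        )
        null_eigs.append(np.linalg.svd(surr, compute_uv=False) ** 2)
    threshold = np.quantile(np.array(null_eigs), quantile, axis=0)
    return int(np.sum(eigvals > threshold))

# toy example: 200 monthly time steps, 50 grid points, one common signal plus noise
rng = np.random.default_rng(1)
signal = np.outer(np.sin(np.linspace(0, 20, 200)), rng.normal(size=50))
field = signal + rng.normal(scale=0.5, size=(200, 50))
print(n_significant_components(field, n_surrogates=50))
```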

  10. Intrauterine synechiae after myomectomy; laparotomy versus laparoscopy: Non-randomized interventional trial

    PubMed Central

    Asgari, Zahra; Hafizi, Leili; Hosseini, Rayhaneh; Javaheri, Atiyeh; Rastad, Hathis

    2015-01-01

    Background: Leiomyomata are the most frequent gynecological neoplasms. One of the major complications of myomectomy is intrauterine adhesion (synechiae). Objective: To evaluate and compare the rate and severity of synechiae formation after myomectomy by laparotomy and laparoscopy. Materials and Methods: In this non-randomized interventional trial, hysteroscopy was performed in all married fertile women who had undergone myomectomy (type 3-6 intramural and subserosal fibroids) via laparotomy and laparoscopy in Tehran’s Arash Hospital from 2010 to 2013. Three months after the operation, the occurrence rate and severity of intrauterine synechiae, and its relationship with type, number and location of myomas were investigated and compared in both groups. Results: Forty patients (19 laparoscopy and 21 laparotomy cases) were studied. Both groups were similar regarding the size, type (subserosal or intramural), number and location of myoma. The occurrence rate of synechiae in the laparoscopy and laparotomy group was 21% and 19%, respectively; showing no significant difference (p=0.99). Among all patients, no significant relationship was found between the endometrial opening (p=0.92), location (p=0.14) and type of myoma (p=0.08) with the occurrence rate of synechiae. However, a significant relationship was observed between myoma’s size (p=0.01) and the location of the largest myoma with the occurrence of synechiae (p=0.02). Conclusion: With favorable suturing methods, the outcome of intrauterine synechiae formation after myomectomy, either performed by laparotomy or laparoscopy, is similar. In all cases of myomectomy in reproductive-aged women, postoperative hysteroscopy is highly recommended to better screen intrauterine synechiae. PMID:26000007

  11. Non-random fragmentation patterns in circulating cell-free DNA reflect epigenetic regulation

    PubMed Central

    2015-01-01

    Background The assessment of cell-free circulating DNA fragments, also known as a "liquid biopsy" of the patient's plasma, is an important source for the discovery and subsequent non-invasive monitoring of cancer and other pathological conditions. Although the nucleosome-guided fragmentation patterns of cell-free DNA (cfDNA) have not yet been studied in detail, non-random representation of cfDNA sequences may reflect chromatin features in the tissue of origin at the gene-regulation level. Results In this study, we investigated the association between the epigenetic landscapes of human tissues and the fragmentation patterns of cfDNA in plasma by deep sequencing of human cfDNA samples. We demonstrated that baseline characteristics of the cfDNA fragmentation pattern are in concordance with those of cell line-derived DNA. To identify the loci differentially represented in cfDNA fragments, we mapped the transcription start sites within the sequenced cfDNA fragments and tested for association of these genomic coordinates with the relative strength and the patterns of gene expression. Preselected sets of house-keeping and tissue-specific genes were used as models for actively expressed and silenced genes. The resulting measure of gene regulation was able to differentiate these two sets based on sequencing coverage near the gene transcription start site. Conclusion Experimental outcomes suggest that cfDNA retains characteristics previously noted in genome-wide analyses of chromatin structure, in particular in MNase-seq assays. Analysis of the cfDNA fragmentation pattern may thus aid the further development of cfDNA-based biomarkers for a variety of human conditions. PMID:26693644

  12. Cortical atrophy patterns in multiple sclerosis are non-random and clinically relevant.

    PubMed

    Steenwijk, Martijn D; Geurts, Jeroen J G; Daams, Marita; Tijms, Betty M; Wink, Alle Meije; Balk, Lisanne J; Tewarie, Prejaas K; Uitdehaag, Bernard M J; Barkhof, Frederik; Vrenken, Hugo; Pouwels, Petra J W

    2016-01-01

    of two cortical thickness patterns (bilateral sensorimotor cortex and bilateral insula), and global cortical thickness. The final model predicting average cognition (adjusted R(2) = 0.469; P < 0.001) consisted of age, the loadings of two cortical thickness patterns (bilateral posterior cingulate cortex and bilateral temporal pole), overall white matter lesion load and normal-appearing white matter integrity. Although white matter pathology measures were part of the final clinical regression models, they explained limited incremental variance (to a maximum of 4%). Several cortical atrophy patterns relevant for multiple sclerosis were found. This suggests that cortical atrophy in multiple sclerosis occurs largely in a non-random manner and develops (at least partly) according to distinct anatomical patterns. In addition, these cortical atrophy patterns showed stronger associations with clinical (especially cognitive) dysfunction than global cortical atrophy.

  13. A 4-Mb deletion in the region Xq27.3-q28 is associated with non-random inactivation of the non-mutant X chromosome

    SciTech Connect

    Clarke, J.T.R.; Han, L.P.; Michalickova, K.

    1994-09-01

    A girl with severe Hunter disease was found to have a submicroscopic deletion disrupting the IDS locus in the region Xq27.3-q28 together with non-random inactivation of the non-mutant X chromosome. Southern analysis of DNA from the parents and from hamster-patient somatic cell hybrids containing only the mutant X chromosome revealed that the deletion represented a de novo mutation involving the paternal X chromosome. Methylation-sensitive RFLP analysis of DNA from maternal fibroblasts and lymphocytes showed methylation patterns consistent with random X-inactivation, indicating that the non-random X-inactivation in the patient was not inherited and was likely a direct result of the Xq27.3-q28 deletion. A 15 kb EcoRI junction fragment, identified in patient DNA using IDS cDNA probes, was cloned from a size-selected patient DNA library. Clones containing the deletion junction were restriction mapped and fragments were subcloned and used to isolate normal sequence on either side of the deletion from normal X chromosome libraries. Comparison of the sequences from normal and mutant X chromosome clones straddling the deletion breakpoint showed that the mutation had occurred by recombination between Alu repeats. Screening of YAC contigs containing normal X chromosome sequence from the region of the mutation, using probes from either side of the deletion breakpoint, showed that the deletion was approximately 4 Mb in size. Probing of mutant DNA with 16 STSs distributed throughout the region of the deletion confirmed that the mutation is a simple deletion with no complex rearrangements of islands of retained DNA. A search for sequences at Xq27.3-q28 involved in X chromosome inactivation is in progress.

  14. Methodological issues in observational studies and non-randomized controlled trials in oncology in the era of big data.

    PubMed

    Tanaka, Shiro; Tanaka, Sachiko; Kawakami, Koji

    2015-04-01

    Non-randomized controlled trials, cohort studies and database studies are appealing study designs when there are urgent needs for safety data, outcomes of interest are rare, generalizability is a matter of concern, or randomization is not feasible. This paper reviews four typical case studies from methodological viewpoints and clarifies how to minimize bias in observational studies in oncology. In summary, researchers planning observational studies should pay careful attention to the selection of appropriate databases, the validity of algorithms for identifying outcomes, comparison with incident users or self-controls, rigorous collection of information on potential confounders, and reporting the details of subject selection. Furthermore, a careful study protocol and statistical analysis plan are necessary.

  15. Tables of critical values for examining compositional non-randomness in proteins and nucleic acids

    NASA Technical Reports Server (NTRS)

    Laird, M.; Holmquist, R.

    1975-01-01

    A binomially distributed statistic is defined to show whether or not the proportion of a particular amino acid in a protein deviates from random expectation. An analogous statistic is derived for nucleotides in nucleic acids. These new statistics are simply related to the classical chi-squared test. They explicitly account for the compositional fluctuations imposed by the finite length of proteins, and they are more accurate than previous tables.
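
    In modern terms, the statistic amounts to an exact binomial test of a residue count against its expected frequency, given the finite protein length. The Python sketch below is a minimal illustration of that idea only, not a reproduction of the paper's tables; the expected frequency used is an assumption for the example, not a value from the paper.

```python
from scipy.stats import binomtest

def composition_nonrandomness(count, length, expected_freq):
    """Two-sided exact binomial test of whether the observed number of a given
    residue deviates from expectation under random composition."""
    return binomtest(count, n=length, p=expected_freq).pvalue

# e.g. 30 leucines in a 150-residue protein, assuming an expected
# leucine frequency of 0.09 (illustrative value)
print(composition_nonrandomness(30, 150, 0.09))
```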

  16. Non-random association of opsin alleles in wild groups of red-bellied tamarins (Saguinus labiatus) and maintenance of the colour vision polymorphism.

    PubMed

    Surridge, Alison K; Suárez, Sandra S; Buchanan-Smith, Hannah M; Mundy, Nicholas I

    2005-12-22

    The remarkable X-linked colour vision polymorphism observed in many New World primates is thought to be maintained by balancing selection. Behavioural tests support a hypothesis of heterozygote advantage, as heterozygous females (with trichromatic vision) exhibit foraging benefits over homozygous females and males (with dichromatic vision) when detecting ripe fruit on a background of leaves. Whilst most studies to date have examined the functional relevance of polymorphic colour vision in the context of foraging behaviour, alternative hypotheses proposed to explain the polymorphism have remained unexplored. In this study we examine colour vision polymorphism, social group composition and breeding success in wild red-bellied tamarins Saguinus labiatus. We find that the association of males and females within tamarin social groups is non-random with respect to colour vision genotype, with identified mating partners having the greatest allelic diversity. The observed distribution of alleles may be driven by inbreeding avoidance and implies an important new mechanism for maintaining colour vision polymorphism. This study also provides the first preliminary evidence that wild trichromatic females may have increased fitness compared with dichromatic counterparts, as measured by breeding success and longevity. PMID:17148234

  17. Non-random retention of protein-coding overlapping genes in Metazoa

    PubMed Central

    Soldà, Giulia; Suyama, Mikita; Pelucchi, Paride; Boi, Silvia; Guffanti, Alessandro; Rizzi, Ermanno; Bork, Peer; Tenchini, Maria Luisa; Ciccarelli, Francesca D

    2008-01-01

    Background Although the overlap of transcriptional units occurs frequently in eukaryotic genomes, its evolutionary and biological significance remains largely unclear. Here we report a comparative analysis of overlaps between genes coding for well-annotated proteins in five metazoan genomes (human, mouse, zebrafish, fruit fly and worm). Results For all analyzed species the observed number of overlapping genes is always lower than expected assuming functional neutrality, suggesting that gene overlap is negatively selected. The comparison to the random distribution also shows that retained overlaps do not exhibit random features: antiparallel overlaps are significantly enriched, while overlaps lying on the same strand and those involving coding sequences are highly underrepresented. We confirm that overlap is mostly species-specific and provide evidence that it frequently originates through the acquisition of terminal, non-coding exons. Finally, we show that overlapping genes tend to be significantly co-expressed in a breast cancer cDNA library obtained by 454 deep sequencing, and that different overlap types display different patterns of reciprocal expression. Conclusion Our data suggest that overlap between protein-coding genes is selected against in Metazoa. However, when retained it may be used as a species-specific mechanism for the reciprocal regulation of neighboring genes. The tendency of overlaps to involve non-coding regions of the genes leads to the speculation that the advantages achieved by an overlapping arrangement may be optimized by evolving regulatory non-coding transcripts. PMID:18416813

  18. Non-random co-occurrence of native and exotic plant species in Mediterranean grasslands

    NASA Astrophysics Data System (ADS)

    de Miguel, José M.; Martín-Forés, Irene; Acosta-Gallo, Belén; del Pozo, Alejandro; Ovalle, Carlos; Sánchez-Jardón, Laura; Castro, Isabel; Casado, Miguel A.

    2016-11-01

    Invasion by exotic species in Mediterranean grasslands has determined assembly patterns of native and introduced species, knowledge of which provides information on the ecological processes underlying these novel communities. We considered grasslands from Spain and Chile. For each country we considered the whole grassland community and we split species into two subsets: in Chile, species were classified as natives or colonizers (i.e. exotics); in Spain, species were classified as exclusives (present in Spain but not in Chile) or colonizers (Spanish natives and exotics into Chile). We used null models and co-occurrence indices calculated in each country for each one of 15 sites distributed along a precipitation gradient and subjected to similar silvopastoral exploitation. We compared values of species co-occurrence between countries and between species subsets (natives/colonizers in Chile; exclusives/colonizers in Spain) within each country, and we characterised them according to climatic variables. We hypothesized that: a) the different coexistence time of the species in the two regions should give rise to communities presenting a spatial pattern further from random in Spain than in Chile, and b) the co-occurrence patterns in the grasslands are affected by mesoclimatic factors in both regions. The patterns of co-occurrence are similar in Spain and Chile, mostly showing a spatial pattern more segregated than expected by chance. The colonizer species are more segregated in Spain than in Chile, possibly determined by the longer residence time of the species in the source area than in the invaded one. The segregation of species in Chile is related to water availability, with species being less segregated in habitats with greater water deficit; in Spain no relationship with climatic variables was found. After an invasion process, our results suggest that the possible process of alteration of the original Chilean communities has not prevented the assembly between the native and

  19. The Origin of Aging: Imperfectness-Driven Non-Random Damage Defines the Aging Process and Control of Lifespan

    PubMed Central

    Gladyshev, Vadim N.

    2013-01-01

    Physico-chemical properties preclude ideal biomolecules and perfect biological functions. This inherent imperfectness leads to the generation of damage by every biological process, at all levels, from small molecules to cells. The damage is too numerous to be repaired, is partially invisible to natural selection and manifests as aging. I propose that it is the inherent imperfectness of biological systems that is the true root of the aging process. As each biomolecule generates specific forms of damage, the cumulative damage is largely non-random and is indirectly encoded in the genome. I consider this concept in light of other proposed theories of aging and integrate these disparate ideas into a single model. I also discuss the evolutionary significance of damage accumulation and strategies for reducing damage. Finally, I suggest ways to test this integrated model of aging. PMID:23769208

  20. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    PubMed

    Shara, Nawar; Yassin, Sayf A; Valaitis, Eduardas; Wang, Hong; Howard, Barbara V; Wang, Wenyu; Lee, Elisa T; Umans, Jason G

    2015-01-01

    Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
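
    The practical consequence of non-random missingness can be seen in a small simulation. The Python sketch below is not the SHS analysis, and it uses scikit-learn's chained-equations-style IterativeImputer rather than the paper's pattern-mixture model: it generates three correlated exam measurements, makes the third preferentially missing for participants with worse second-exam values, and compares the exam-3 mean recovered by listwise deletion and by regression-based imputation with the true value.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n = 2000

# three correlated "exam" measurements of renal function per participant
true = rng.multivariate_normal([1.0, 1.1, 1.2],
                               [[0.20, 0.15, 0.12],
                                [0.15, 0.20, 0.15],
                                [0.12, 0.15, 0.20]], size=n)

# non-random missingness: the worse the exam-2 value, the more likely exam 3
# is missing (mimicking loss to morbidity or mortality)
obs = true.copy()
p_miss = 1 / (1 + np.exp(-3 * (true[:, 1] - 1.4)))
obs[rng.uniform(size=n) < p_miss, 2] = np.nan

# listwise deletion: simply drop incomplete rows
complete = obs[~np.isnan(obs).any(axis=1)]
print("true exam-3 mean:     ", round(true[:, 2].mean(), 3))
print("listwise deletion:    ", round(complete[:, 2].mean(), 3))

# regression-based (chained-equations style) imputation using the other exams
imputed = IterativeImputer(random_state=0).fit_transform(obs)
print("iterative imputation: ", round(imputed[:, 2].mean(), 3))
```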

  1. Does balneotherapy with low radon concentration in water influence the endocrine system? A controlled non-randomized pilot study.

    PubMed

    Nagy, Katalin; Berhés, István; Kovács, Tibor; Kávási, Norbert; Somlai, János; Bender, Tamás

    2009-08-01

    Radon bath is a well-established modality of balneotherapy for the management of degenerative musculoskeletal disorders. The present study was conducted to ascertain whether baths of relatively low (80 Bq/l) radon concentration have any influence on the functioning of the endocrine system. In this non-randomized pilot study, 27 patients with degenerative musculoskeletal disorders received 30-min radon baths (of 31-32 degrees C temperature and 80 Bq/l average radon concentration) daily, for 15 days. Twenty-five patients with matching pathologies were subjected to balneotherapy according to the same protocol, using thermal water with negligible radon content (6 Bq/l). Serum thyroid stimulating hormone, prolactin, cortisol, adrenocorticotropic hormone, and dehydroepiandrosterone levels were measured before and after a balneotherapy course of 15 sessions. Comparison of the accumulated data using the Wilcoxon test did not reveal any significant difference between pre- and post-treatment values or between the two patient groups. It is noted that while the beneficial effects of balneotherapy with radon-containing water on degenerative disorders are widely known, only few data have been published in the literature on its effect on endocrine functions. The present study failed to demonstrate any substantial effect of thermal water with relatively low radon content on the functioning of the endocrine system.

  2. A polymerase chain reaction assay for non-random X chromosome inactivation identifies monoclonal endometrial cancers and precancers.

    PubMed

    Mutter, G L; Chaponot, M L; Fletcher, J A

    1995-02-01

    We hypothesized that endometrial carcinomas and their precursors share a monoclonal growth pattern and tested this hypothesis with archival paraffin-embedded tissues using a polymerase chain reaction-based assay for non-random X chromosome inactivation. Of the 10 well-differentiated endometrial adenocarcinoma cases with heterozygous markers (HUMARA, X-linked androgen receptor gene), 9 had skewed X inactivation consistent with a monoclonal process, and one contained a structurally altered HUMARA gene. X inactivation skewing similar to that of the tumor was seen in matched control polyclonal tissues of 4 (of 9) cases, caused by the small number of endometrial stem cells at the time of embryonic X inactivation. When the polymerase chain reaction assay was applied to four potential endometrial precancers (atypical endometrial hyperplasia) and matched control tissues, two were inconclusive, and two were found to be monoclonal. We conclude that 1) it is essential to include polyclonal control tissues in X inactivation analyses to determine whether skewing is a specific indicator of monoclonality; and 2) endometrial adenocarcinomas and some putative precancers, atypical endometrial hyperplasia, are monoclonal.

  3. Co-occurrence analyses show that non-random community structure is disrupted by fire in two groups of soil arthropods (Isopoda Oniscidea and Collembola)

    NASA Astrophysics Data System (ADS)

    Pitzalis, Monica; Luiselli, Luca; Bologna, Marco A.

    2010-01-01

    In this paper, we tested the hypothesis that natural catastrophes may destroy non-random community structure in natural assemblages of organisms. As a study system, we selected fire as the catastrophic event, and two groups of soil arthropods (Collembola and Isopoda Oniscidea) as target organisms. Using co-occurrence analyses and Monte Carlo simulations of niche overlap analysis (C-score with the fixed-equiprobable model; RA2 and RA3 algorithms), we evaluated whether the community structure of these two groups was random or non-random at three unburnt sites and three neighbouring burnt sites that were devastated by a large-scale fire in summer 2000. Both taxa experienced a remarkable reduction in the number of species sampled in burnt versus unburnt sites, but the difference among sites was not statistically significant for Oniscidea. We determined that community structure was clearly non-random at the unburnt sites for both Collembola (according to the RA3 algorithm) and Isopoda Oniscidea (according to co-occurrence analysis) and that, as predicted by theory, the catastrophic event deeply altered the community structure by removing the non-random organization of species interactions. We also observed a shift from segregation to aggregation/randomness in soil arthropod communities affected by fire, a pattern similar to that observed in natural communities perturbed by the introduction of alien species, indicating that this pattern may be general whenever communities are strongly altered.
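
    For reference, the C-score and a fixed-equiprobable null model can be sketched compactly. The Python example below is a toy illustration (the study itself used dedicated null-model software and the RA2/RA3 niche-overlap algorithms): it computes the mean number of checkerboard units over all species pairs and compares it with a null distribution in which each species keeps its number of occurrences but the occupied sites are drawn equiprobably.

```python
import numpy as np
from itertools import combinations

def c_score(mat):
    """Mean number of 'checkerboard units' over all species pairs for a
    presence/absence matrix (rows = species, columns = sites)."""
    cu = []
    for i, j in combinations(range(mat.shape[0]), 2):
        shared = np.sum(mat[i] & mat[j])
        cu.append((mat[i].sum() - shared) * (mat[j].sum() - shared))
    return np.mean(cu)

def fixed_equiprobable_null(mat, n_iter=1000, seed=0):
    """Null distribution: each species keeps its number of occurrences
    (row sums fixed) while the occupied sites are drawn equiprobably."""
    rng = np.random.default_rng(seed)
    n_sites = mat.shape[1]
    null = np.empty(n_iter)
    for k in range(n_iter):
        sim = np.zeros_like(mat)
        for i, row in enumerate(mat):
            sites = rng.choice(n_sites, size=row.sum(), replace=False)
            sim[i, sites] = 1
        null[k] = c_score(sim)
    return null

# toy presence/absence matrix: 5 species x 6 sampling plots
obs = np.array([[1, 1, 0, 0, 1, 0],
                [0, 0, 1, 1, 0, 1],
                [1, 1, 0, 0, 0, 0],
                [0, 0, 1, 1, 1, 1],
                [1, 0, 0, 1, 0, 0]])
null = fixed_equiprobable_null(obs)
observed = c_score(obs)
p = (np.sum(null >= observed) + 1) / (len(null) + 1)
print(f"C-score = {observed:.2f}, one-tailed p (segregation) = {p:.3f}")
```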

  4. Issues Relating to Study Design and Risk of Bias When Including Non-Randomized Studies in Systematic Reviews on the Effects of Interventions

    ERIC Educational Resources Information Center

    Higgins, Julian P. T.; Ramsay, Craig; Reeves, Barnaby C.; Deeks, Jonathan J.; Shea, Beverley; Valentine, Jeffrey C.; Tugwell, Peter; Wells, George

    2013-01-01

    Non-randomized studies may provide valuable evidence on the effects of interventions. They are the main source of evidence on the intended effects of some types of interventions and often provide the only evidence about the effects of interventions on long-term outcomes, rare events or adverse effects. Therefore, systematic reviews on the effects…

  5. Issues Relating to Confounding and Meta-analysis When Including Non-Randomized Studies in Systematic Reviews on the Effects of Interventions

    ERIC Educational Resources Information Center

    Valentine, Jeffrey C.; Thompson, Simon G.

    2013-01-01

    Background: Confounding caused by selection bias is often a key difference between non-randomized studies (NRS) and randomized controlled trials (RCTs) of interventions. Key methodological issues: In this third paper of the series, we consider issues relating to the inclusion of NRS in systematic reviews on the effects of interventions. We discuss…

  6. Behavioral approach with or without surgical intervention to the vulvar vestibulitis syndrome: a prospective randomized and non-randomized study.

    PubMed

    Weijmar Schultz, W C; Gianotten, W L; van der Meijden, W I; van de Wiel, H B; Blindeman, L; Chadha, S; Drogendijk, A C

    1996-09-01

    This article describes the outcome of a behavioral approach with or without preceding surgical intervention in 48 women with the vulvar vestibulitis syndrome. In the first part of the study, 14 women with the vulvar vestibulitis syndrome were randomly assigned to one of two treatment programs: either a behavioral approach or a behavioral approach preceded by surgery. In the second part of the study, 34 women and their partners were given a choice of treatment. Follow-up data were gathered a mean of 3 and 2 1/2 years after treatment, respectively. In the randomized patient population, the intervention had a positive effect on all of them: the complaints disappeared, diminished or did not change but formed less of a problem. The difference in outcome between the two different treatments, a behavioral approach with or without preceding surgery, was not statistically significant. In the second non-randomized part of the study, 28 out of the 34 women (82%) chose the behavioral approach without preceding surgery. The difference in outcome between the two treatments was not statistically significant. Two out of the 28 women who chose behavioral treatment without preceding surgery had to be referred for psychiatric consultation because of serious psycho-sexual problems. In one woman, psychiatric treatment was successful. Three other women, whose behavioral treatment failed, underwent additional surgery, which clearly helped them to overcome the deadlock in the behavioral approach. The behavioral approach should be the first choice of treatment for the vulvar vestibulitis syndrome. Surgical intervention should be considered as an additional form of treatment in some cases with the vulvar vestibulitis syndrome to facilitate breaking the vicious circle of irritation, pelvic floor muscle hypertonia and sexual maladaptive behavior.

  7. A school intervention for mental health literacy in adolescents: effects of a non-randomized cluster controlled trial

    PubMed Central

    2013-01-01

    Background “Mental health for everyone” is a school program for mental health literacy and prevention aimed at secondary schools (13–15 yrs). The main aim was to investigate whether mental health literacy could be improved by a 3-day universal education programme by: a) improving naming of symptom profiles of mental disorder, b) reducing prejudiced beliefs, and c) improving knowledge about where to seek help for mental health problems. A secondary aim was to investigate whether adolescent sex and age influenced the above mentioned variables. A third aim was to investigate whether prejudiced beliefs influenced knowledge about available help. Method This non-randomized cluster controlled trial included 1070 adolescents (53.9% boys, M age 14 yrs) from three schools in a Norwegian town. One school (n = 520) received the intervention, and two schools (n = 550) formed the control group. Pre-test and follow-up were three months apart. Linear mixed models and generalized estimating equations models were employed for analysis. Results Mental health literacy improved contingent on the intervention, and there was a shift towards suggesting primary health care as a place to seek help. Those with more prejudiced beliefs did not suggest places to seek help for mental health problems. Generally, girls and older adolescents recognized symptom profiles better and had lower levels of prejudiced beliefs. Conclusions A low cost general school program may improve mental health literacy in adolescents. Gender specific programs and attention to the age and maturity of the students should be considered when mental health literacy programmes are designed and tried out. Prejudice should be addressed before imparting information about mental health issues. PMID:24053381

  8. Effectiveness of a 'Global Postural Reeducation' program for persistent Low Back Pain: a non-randomized controlled trial

    PubMed Central

    2010-01-01

    Background The aim of this non-randomized controlled trial was to evaluate the effectiveness of a Global Postural Reeducation (GPR) program as compared to a Stabilization Exercise (SE) program in subjects with persistent low back pain (LBP) at short- and mid-term follow-up (i.e. 3 and 6 months). Methods According to inclusion and exclusion criteria, 100 patients with a primary complaint of persistent LBP were enrolled in the study: 50 were allocated to the GPR group and 50 to the SE group. Primary outcome measures were the Roland and Morris Disability Questionnaire (RMDQ) and Oswestry Disability Index (ODI). Secondary outcome measures were the lumbar Visual Analogue Scale (VAS) and Fingertip-to-floor test (FFT). Data were collected at baseline and at 3/6 months by health care professionals unaware of the study. An intention to treat approach was used to analyze participants according to the group to which they were originally assigned. Results Of the 100 patients initially included in the study, 78 patients completed the study: 42 in the GPR group and 36 in the SE group. At baseline, the two groups did not differ significantly with respect to gender, age, BMI and outcome measures. Comparing the differences between groups at short- and mid-term follow-up, the GPR group revealed a significant reduction (from baseline) in all outcome measures with respect to the SE group. The ordered logistic regression model showed an increased likelihood of definitive improvement (reduction from baseline of at least 30% in RMDQ and VAS scores) for the GPR group compared to the SE group (OR 3.9, 95% CI 2.7 to 5.7). Conclusions Our findings suggest that a GPR intervention in subjects with persistent LBP induces a greater improvement on pain and disability as compared to a SE program. These results must be confirmed by further studies with higher methodological standards, including randomization, larger sample size, longer follow-up and subgrouping of the LBP subjects. Trial registration NCT

  9. Non-random pre-transcriptional evolution in HIV-1. A refutation of the foundational conditions for neutral evolution

    PubMed Central

    2009-01-01

    The complete base sequence of the HIV-1 virus and the GP120 ENV gene were analyzed to establish their distance from the expected neutral random sequence. A special methodology was devised to achieve this aim. Analyses included: a) the proportion of dinucleotides (signatures); b) homogeneity in the distribution of dinucleotides and bases (isochores), by dividing both segments into ten and three sub-segments, respectively; c) the probability of runs of bases and No-bases according to the Bose-Einstein distribution. The analyses showed a huge deviation from the random distribution expected from neutral evolution and neutral-neighbor influence of nucleotide sites. The most significant result is the tremendous lack of CG dinucleotides (p < 10^-50), a selective trait of eukaryotic genomes and not of single-stranded RNA virus genomes. The results not only refute neutral evolution and neutral neighbor influence, but also strongly indicate that any base at any nucleotide site correlates with the whole viral genome or its sub-segments. These results suggest that evolution of HIV-1 is pan-selective rather than neutral or nearly neutral. PMID:21637663
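
    The dinucleotide "signature" analysis mentioned in (a) is commonly summarized as an observed/expected odds ratio; values well below 1 for CG indicate the kind of CpG depletion reported here. The Python sketch below is an illustration of that general measure on a toy sequence, not the authors' methodology.

```python
def dinucleotide_odds_ratio(seq, dinuc="CG"):
    """Observed/expected ratio for a dinucleotide: f(XY) / (f(X) * f(Y)),
    a standard 'genome signature' measure of over- or under-representation."""
    seq = seq.upper()
    n = len(seq)
    base_freq = {b: seq.count(b) / n for b in "ACGT"}
    f_dinuc = sum(1 for i in range(n - 1) if seq[i:i + 2] == dinuc) / (n - 1)
    expected = base_freq[dinuc[0]] * base_freq[dinuc[1]]
    return f_dinuc / expected if expected > 0 else float("nan")

# toy sequence with no CG dinucleotides; a ratio well below 1 indicates CpG depletion
toy = "ATGGCATTACAGGATTTAAGCAATTAGGCATTACAGGAT" * 10
print(round(dinucleotide_odds_ratio(toy, "CG"), 2))
```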

  10. Analysis of emplaced waste data and implications of non-random emplacement for performance assessment for the WIPP

    SciTech Connect

    Allen, Lawrence E.; Channell, James K.

    2003-05-31

    The WIPP Land Withdrawal Act recognized that after the initial certification of the WIPP and start of disposal operations, operating experience and ongoing research would result in new technical and scientific information. The Environmental Evaluation Group (EEG) has previously reported on issues that it considers important as the Department of Energy (DOE) works towards the first recertification. One of these issues involves the assumption of random emplacement of waste used in the performance assessment calculations in support of the initial certification application. As actual waste emplacement data are now available from four years of disposal, the EEG performed an analysis to evaluate the validity of that initial assumption and determine implications for performance assessment. Panel 1 was closed in March 2003. The degree of deviation between actual emplaced waste in Panel 1 and an assumption of random emplacement is apparent with concentrations of 239Pu being 3.20 times, 240Pu being 2.67 times, and 241Am being 4.13 times the projected repository average for the space occupied by the waste. A spatial statistical analysis was performed using available Panel 1 data retrieved from the WWIS and assigned room coordinates by Sandia National Laboratories. A comparison was made between the waste as emplaced and a randomization of the same waste. The distribution of waste as emplaced is similar to the distribution of waste in the individual containers and can be characterized as bi-modal and skewed with a long high-concentration tail. Conversely, the distribution of randomized waste is fairly symmetrical, as would be expected from classical statistical theory. In the event of a future drilling intrusion, comparison of these two distributions shows a higher probability of intersecting a high-concentration stack of the actual emplaced waste, over that of the same waste emplaced in a randomized manner as was assumed in the certified

  11. Inbreeding and purging at the genomic Level: the Chillingham cattle reveal extensive, non-random SNP heterozygosity.

    PubMed

    Williams, J L; Hall, S J G; Del Corvo, M; Ballingall, K T; Colli, L; Ajmone Marsan, P; Biscarini, F

    2016-02-01

    Local breeds of livestock are of conservation significance as components of global biodiversity and as reservoirs of genetic variation relevant to the future sustainability of agriculture. One such rare historic breed, the Chillingham cattle of northern England, has a 350-year history of isolation and inbreeding yet shows no diminution of viability or fertility. The Chillingham cattle have not been subjected to selective breeding. It has been suggested previously that the herd has minimal genetic variation. In this study, high-density SNP genotyping with the 777K SNP chip showed that 9.1% of loci on the chip are polymorphic in the herd, compared with 62-90% seen in commercial cattle breeds. Instead of being homogeneously distributed along the genome, these loci are clustered at specific chromosomal locations. A high proportion of the Chillingham individuals examined were heterozygous at many of these polymorphic loci, suggesting that some loci are under balancing selection. Some of these frequently heterozygous loci have been implicated as sites of recessive lethal mutations in cattle. Linkage disequilibrium equal or close to 100% was found to span up to 1350 kb, and LD was above r(2) = 0.25 up to more than 5000 kb. This strong LD is consistent with the lack of polymorphic loci in the herd. The heterozygous regions in the Chillingham cattle may be the locations of genes relevant to fitness or survival, which may help elucidate the biology of local adaptation in traditional breeds and facilitate selection for such traits in commercial cattle.

  12. Inbreeding and purging at the genomic Level: the Chillingham cattle reveal extensive, non-random SNP heterozygosity.

    PubMed

    Williams, J L; Hall, S J G; Del Corvo, M; Ballingall, K T; Colli, L; Ajmone Marsan, P; Biscarini, F

    2016-02-01

    Local breeds of livestock are of conservation significance as components of global biodiversity and as reservoirs of genetic variation relevant to the future sustainability of agriculture. One such rare historic breed, the Chillingham cattle of northern England, has a 350-year history of isolation and inbreeding yet shows no diminution of viability or fertility. The Chillingham cattle have not been subjected to selective breeding. It has been suggested previously that the herd has minimal genetic variation. In this study, high-density SNP genotyping with the 777K SNP chip showed that 9.1% of loci on the chip are polymorphic in the herd, compared with 62-90% seen in commercial cattle breeds. Instead of being homogeneously distributed along the genome, these loci are clustered at specific chromosomal locations. A high proportion of the Chillingham individuals examined were heterozygous at many of these polymorphic loci, suggesting that some loci are under balancing selection. Some of these frequently heterozygous loci have been implicated as sites of recessive lethal mutations in cattle. Linkage disequilibrium equal or close to 100% was found to span up to 1350 kb, and LD was above r(2) = 0.25 up to more than 5000 kb. This strong LD is consistent with the lack of polymorphic loci in the herd. The heterozygous regions in the Chillingham cattle may be the locations of genes relevant to fitness or survival, which may help elucidate the biology of local adaptation in traditional breeds and facilitate selection for such traits in commercial cattle. PMID:26559490

  13. The Self-Made Puzzle: Integrating Self-Assembly and Pattern Formation Under Non-Random Genetic Regulation

    NASA Astrophysics Data System (ADS)

    Doursat, René

    On the one hand, research in self-assembling systems, whether natural or artificial, has traditionally focused on pre-existing components endowed with fixed shapes. Biological development, by contrast, dynamically creates new cells that acquire selective adhesion properties through differentiation induced by their neighborhood. On the other hand, pattern formation phenomena are generally construed as orderly states of activity on top of a continuous 2-D or 3-D substrate. Yet, again, the spontaneous patterning of an organism into domains of gene expression arises within a multicellular medium in perpetual expansion and reshaping. Finally, both phenomena are often thought of in terms of stochastic events, whether mixed components that randomly collide in self-assembly, or spots and stripes that occur unpredictably from instabilities in pattern formation. Here too, these notions need significant revision if they are to be extended and applied to embryogenesis. Cells are not randomly mixed but pre-positioned where cell division occurs. Genetic identity domains are not randomly distributed but highly regulated in number and position. In this work, I present a computational model of programmable and reproducible artificial morphogenesis that integrates self-assembly and pattern formation under the control of a nonrandom gene regulatory network. The specialized properties of cells (division, adhesion, migration) are determined by the gene expression domains to which they belong, while at the same time these domains further expand and segment into subdomains due to the self-assembly of specialized cells. Through this model, I also promote a new discipline, embryomorphic engineering, to solve the paradox of "meta-designing" decentralized, autonomous systems.

  14. Design and baseline findings of a multi-site non-randomized evaluation of the effect of a health programme on microfinance clients in India.

    PubMed

    Saha, Somen

    2014-01-01

    Microfinance is the provision of financial services for the poor. A health program delivered through microfinance has the potential to address several access barriers to health. We report the design and baseline findings of a multi-site non-randomized evaluation of the effect of a health program on the members of two microfinance organizations from the Karnataka and Gujarat states of India. Villages identified for roll-out of health services with microfinance were pair-matched with microfinance-only villages. A quantitative survey at inception and twelve months post health intervention compares the primary outcome (incidence of childhood diarrhea) and secondary outcomes (place of last delivery, toilet at home, and out-of-pocket expenditure on treatment). At baseline, the intervention and comparison communities were similar except for out-of-pocket expenditure on health. Low reported use of toilets at home indicates the areas are heading towards a sanitation crisis; this should be an area of program priority for the microfinance organizations. While respondents primarily rely on their savings for meeting treatment expenditure, borrowing from friends, relatives, and money-lenders remains another important source of meeting treatment expenditure in the community. Programs need to prioritize steps to ensure awareness about national health insurance schemes and entitlements to increase service utilization, and to develop additional health financing safety nets for financing outpatient care, which is responsible for the majority of health-debt. Finally, we discuss implications of such programs for national policy makers. PMID:24373263

  15. A non-randomized confirmatory study regarding selection of fertility-sparing surgery for patients with epithelial ovarian cancer: Japan Clinical Oncology Group Study (JCOG1203).

    PubMed

    Satoh, Toyomi; Tsuda, Hitoshi; Kanato, Keisuke; Nakamura, Kenichi; Shibata, Taro; Takano, Masashi; Baba, Tsukasa; Ishikawa, Mitsuya; Ushijima, Kimio; Yaegashi, Nobuo; Yoshikawa, Hiroyuki

    2015-06-01

    Fertility-sparing treatment has been accepted as a standard treatment for epithelial ovarian cancer in stage IA non-clear cell histology grade 1/grade 2. In order to expand an indication of fertility-sparing treatment, we have started a non-randomized confirmatory trial for stage IA clear cell histology and stage IC unilateral non-clear cell histology grade 1/grade 2. The protocol-defined fertility-sparing surgery is optimal staging laparotomy including unilateral salpingo-oophorectomy, omentectomy, peritoneal cytology and pelvic and para-aortic lymph node dissection or biopsy. After fertility-sparing surgery, four to six cycles of adjuvant chemotherapy with paclitaxel and carboplatin are administered. We plan to enroll 250 patients with an indication of fertility-sparing surgery, and then the primary analysis is to be conducted for 63 operated patients with pathologically confirmed stage IA clear cell histology and stage IC unilateral non-clear cell histology grade 1/grade 2. The primary endpoint is 5-year overall survival. Secondary endpoints are other survival endpoints and factors related to reproduction. This trial has been registered at the UMIN Clinical Trials Registry as UMIN000013380. PMID:26059697

  16. Design and Baseline Findings of a Multi-site Non-randomized Evaluation of the Effect of a Health Programme on Microfinance Clients in India

    PubMed Central

    Saha, Somen

    2014-01-01

    Microfinance is the provision of financial services for the poor. A health program delivered through microfinance has the potential to address several access barriers to health care. We report the design and baseline findings of a multi-site non-randomized evaluation of the effect of a health program on the members of two microfinance organizations from the Karnataka and Gujarat states of India. Villages identified for roll-out of health services alongside microfinance were pair-matched with microfinance-only villages. A quantitative survey at inception and twelve months after the health intervention compares the primary outcome (incidence of childhood diarrhea) and secondary outcomes (place of last delivery, toilet at home, and out-of-pocket expenditure on treatment). At baseline, the intervention and comparison communities were similar except for out-of-pocket expenditure on health. Low reported use of toilets at home indicates the areas are heading towards a sanitation crisis; this should be an area of program priority for the microfinance organizations. While respondents primarily rely on their savings to meet treatment expenditure, borrowing from friends, relatives, and money-lenders remains another important source of financing treatment in the community. Programs need to prioritize steps to ensure awareness of national health insurance schemes and entitlements to increase service utilization, and to develop additional health-financing safety nets for outpatient care, which is responsible for the majority of health debt. Finally, we discuss the implications of such programs for national policy makers. PMID:24373263

  17. A non-randomized confirmatory study regarding selection of fertility-sparing surgery for patients with epithelial ovarian cancer: Japan Clinical Oncology Group Study (JCOG1203).

    PubMed

    Satoh, Toyomi; Tsuda, Hitoshi; Kanato, Keisuke; Nakamura, Kenichi; Shibata, Taro; Takano, Masashi; Baba, Tsukasa; Ishikawa, Mitsuya; Ushijima, Kimio; Yaegashi, Nobuo; Yoshikawa, Hiroyuki

    2015-06-01

    Fertility-sparing treatment has been accepted as a standard treatment for epithelial ovarian cancer of stage IA, non-clear cell histology, grade 1/grade 2. To expand the indication for fertility-sparing treatment, we have started a non-randomized confirmatory trial for stage IA clear cell histology and stage IC unilateral non-clear cell histology grade 1/grade 2. The protocol-defined fertility-sparing surgery is optimal staging laparotomy, including unilateral salpingo-oophorectomy, omentectomy, peritoneal cytology, and pelvic and para-aortic lymph node dissection or biopsy. After fertility-sparing surgery, four to six cycles of adjuvant chemotherapy with paclitaxel and carboplatin are administered. We plan to enroll 250 patients with an indication for fertility-sparing surgery; the primary analysis will then be conducted on 63 operated patients with pathologically confirmed stage IA clear cell histology or stage IC unilateral non-clear cell histology grade 1/grade 2. The primary endpoint is 5-year overall survival. Secondary endpoints are other survival endpoints and factors related to reproduction. This trial has been registered at the UMIN Clinical Trials Registry as UMIN000013380.

  18. Non-random association between alleles detected at D4S95 and D4S98 and the Huntington's disease gene.

    PubMed Central

    Theilmann, J; Kanani, S; Shiang, R; Robbins, C; Quarrell, O; Huggins, M; Hedrick, A; Weber, B; Collins, C; Wasmuth, J J

    1989-01-01

    Analysis of many families with linked DNA markers has provided support for the Huntington's disease (HD) gene being close to the telomere on the short arm of chromosome 4. However, analysis of recombination events in particular families has provided conflicting results about the precise location of the HD gene relative to these closely linked DNA markers. Here we report an investigation of linkage disequilibrium between six DNA markers and the HD gene in 75 separate families of varied ancestry. We show significant non-random association between alleles detected at D4S95 and D4S98 and the mutant gene. These data suggest that it may be possible to construct high and low risk haplotypes, which may be helpful in DNA analysis and genetic counselling for HD, and represent independent evidence that the gene for HD is centromeric to more distally located DNA markers such as D4S90. This information may be helpful in defining a strategy to clone the gene for HD based on its location in the human genome. PMID:2531224

  19. Novel gene arrangement in the mitochondrial genome of Bothus myriaster (Pleuronectiformes: Bothidae): evidence for the Dimer-Mitogenome and Non-random Loss model.

    PubMed

    Gong, Li; Shi, Wei; Yang, Min; Li, Donghe; Kong, Xiaoyu

    2016-09-01

    In the present study, the complete mitochondrial genome of the oval flounder Bothus myriaster was determined and a novel gene rearrangement was discovered. A striking finding was that eight genes encoded by the L-strand (ND6 and the tRNA-Q, A, C, Y, S1, E, and P genes) were translocated to a position between tRNA-T and tRNA-F. At the same time, the original order of the rearranged genes, Q-A-C-Y-S1-ND6-E-P, was maintained. Thus, genes with identical transcriptional polarities are clustered in the genome, with the single exception that the tRNA-N gene remains in its original position. The other rearrangement involved the H-strand-encoded tRNA-D gene, which translocated from its typical location between COI and COII to a position between S1 and ND6. The Dimer-Mitogenome and Non-random Loss model (DMNL) was adopted to account for the novel rearrangement in the B. myriaster mitogenome, which provides evidence to support the hypothesis of the DMNL model.

  20. Brachytherapy versus radical hysterectomy after external beam chemoradiation: a non-randomized matched comparison in IB2-IIB cervical cancer patients

    PubMed Central

    Cetina, Lucely; Garcia-Arias, Alicia; Candelaria, Myrna; Cantú, David; Rivera, Lesbia; Coronel, Jaime; Bazan-Perkins, Blanca; Flores, Vladimir; Gonzalez, Aaron; Dueñas-González, Alfonso

    2009-01-01

    Background A current paradigm in the treatment of cervical cancer with radiation therapy is that intracavitary brachytherapy is an essential component of radical treatment. This is a matched retrospective comparison of the results of treatment in patients treated with external beam chemoradiation (EBRT-CT) and radical hysterectomy versus those treated with identical chemoradiation followed by brachytherapy. Methods In this non-randomized comparison the EBRT-CT protocol was the same in both groups of 40 patients. In the standard-treated patients, EBRT-CT was followed by one or two intracavitary Cesium (low-dose rate) applications within 2 weeks of finishing external radiation to reach a point A dose of at least 85 Gy. In the surgically treated patients, radical hysterectomy with bilateral pelvic lymph node dissection and para-aortic lymph node sampling was performed within 7 weeks after EBRT-CT. Response, toxicity and survival were evaluated. Results A total of 80 patients were analyzed. The patients receiving EBRT-CT and surgery were matched with the standard-treated cases. There were no differences in the clinicopathological characteristics between groups or in the delivery of EBRT-CT. The pattern of acute and late toxicity differed: standard-treated patients had more chronic proctitis, while the surgically treated patients had acute complications of surgery and hydronephrosis. At a maximum follow-up of 60 months, with a median follow-up of 26 (2–31) and 22 (3–27) months for the surgery and standard therapy groups respectively, eight patients per group have recurred and died. Progression-free and overall survival are the same in both groups. Conclusion The results of this study suggest that radical hysterectomy can be used after EBRT-CT without compromising survival in FIGO stage IB2-IIB cervical cancer patients in settings where brachytherapy is not available. A randomized study is needed to establish the value of surgery after EBRT-CT. PMID:19220882

  1. Evaluation of the Efficacy and Safety of Short-Course Deep Sedation Therapy for the Treatment of Intracerebral Hemorrhage After Surgery: A Non-Randomized Control Study

    PubMed Central

    Hou, Dapeng; Liu, Beibei; Zhang, Juan; Wang, Qiushi; Zheng, Wei

    2016-01-01

    Background While mild and moderate sedation have been widely used to reduce sudden agitation in intracerebral hemorrhage (ICH) patients after surgery, agitation is still a frequent problem, which may cause postoperative blood pressure fluctuation. The present study aimed to evaluate the efficacy and safety of short-course deep sedation for the treatment of ICH after surgery. Material/Methods A total of 41 ICH patients who received surgery, including traditional craniotomy hematoma removal and decompressive craniectomy, were included in this non-randomized control study. Patients in the deep sedation group received continuous postoperative sedation with a target course of ≤12 hours and reached SAS scores of 1~2. Patients in the traditional sedation group received continuous light sedation and reached SAS scores of 3~4. Additional therapeutic interventions included antihypertensive treatment, mechanical ventilation, tracheotomy, and re-operation. Results Patients in the deep sedation group had a deeper degree of sedation and lower systolic blood pressure (SBP) and diastolic blood pressure (DBP). Residual hematoma after surgery in the deep sedation group was smaller on the second, seventh, and fourteenth days after surgery (p=0.023, 0.003, 0.004, respectively). The 3-month mortality of patients in the deep sedation group was lower, and their quality of life better, than those of patients in the traditional sedation group (p=0.044 and p<0.01, respectively). No significant differences in the incidence of ventilator-associated pneumonia (VAP) or in ICU days were observed between the two groups. Conclusions Short-course deep sedation therapy in ICH patients after surgery is effective in controlling postoperative blood pressure, reducing re-bleeding, and improving clinical prognosis. PMID:27466863

  2. Effectiveness of a peer-led HIV prevention intervention in secondary schools in Rwanda: results from a non-randomized controlled trial

    PubMed Central

    2012-01-01

    Background While the HIV epidemic is levelling off in sub-Saharan Africa, it remains at an unacceptably high level. Young people aged 15-24 years remain particularly vulnerable, resulting in a regional HIV prevalence of 1.4% in young men and 3.3% in young women. This study assesses the effectiveness of a peer-led HIV prevention intervention in secondary schools in Rwanda on young people's sexual behavior, HIV knowledge and attitudes. Methods In a non-randomized longitudinal controlled trial, fourteen schools were selected in two neighboring districts in Rwanda: Bugesera (intervention) and Rwamagana (control). Students (n = 1950) in eight intervention and six control schools participated in three surveys (at baseline, and at six and twelve months into the intervention). Analysis was done using linear and logistic regression with generalized estimating equations adjusted for propensity score. Results The overall retention rate was 72%. Time trends in sexual risk behavior (being sexually active, sex in the last six months, condom use at last sex) were not significantly different between students from intervention and control schools, nor was the intervention associated with increased knowledge, perceived severity or perceived susceptibility. It did significantly reduce reported stigma. Conclusions Analyzing this and other interventions, we identified several reasons for the observed limited effectiveness of peer education: 1) intervention activities (spreading information) are not tuned to objectives (changing behavior); 2) young people prefer receiving HIV information from sources other than peers; 3) outcome indicators are not adequate and the context of the relationship in which sex occurs is ignored. The effectiveness of peer education may increase through integration in holistic interventions and redefining peer educators' role as focal points for sensitization and referral to experts and services. Finally, we argue that a narrow focus on

  3. Evaluation of the Efficacy and Safety of Short-Course Deep Sedation Therapy for the Treatment of Intracerebral Hemorrhage After Surgery: A Non-Randomized Control Study.

    PubMed

    Hou, Dapeng; Liu, Beibei; Zhang, Juan; Wang, Qiushi; Zheng, Wei

    2016-01-01

    BACKGROUND While mild and moderate sedation have been widely used to reduce sudden agitation in intracerebral hemorrhage (ICH) patients after surgery, agitation is still a frequent problem, which may cause postoperative blood pressure fluctuation. The present study aimed to evaluate the efficacy and safety of short-course deep sedation for the treatment of ICH after surgery. MATERIAL AND METHODS A total of 41 ICH patients who received surgery, including traditional craniotomy hematoma removal and decompressive craniectomy, were included in this non-randomized control study. Patients in the deep sedation group received continuous postoperative sedation with a target course of ≤12 hours and reached SAS scores of 1~2. Patients in the traditional sedation group received continuous light sedation and reached SAS scores of 3~4. Additional therapeutic interventions included antihypertensive treatment, mechanical ventilation, tracheotomy, and re-operation. RESULTS Patients in the deep sedation group had a deeper degree of sedation and lower systolic blood pressure (SBP) and diastolic blood pressure (DBP). Residual hematoma after surgery in the deep sedation group was smaller on the second, seventh, and fourteenth days after surgery (p=0.023, 0.003, 0.004, respectively). The 3-month mortality of patients in the deep sedation group was lower, and their quality of life better, than those of patients in the traditional sedation group (p=0.044 and p<0.01, respectively). No significant differences in the incidence of ventilator-associated pneumonia (VAP) or in ICU days were observed between the two groups. CONCLUSIONS Short-course deep sedation therapy in ICH patients after surgery is effective in controlling postoperative blood pressure, reducing re-bleeding, and improving clinical prognosis.

  4. Rock magnetic evidence of non-random raw material selection criteria in Cerro Toledo Obsidian Artifacts from Valles Caldera, New Mexico

    NASA Astrophysics Data System (ADS)

    Gregovich, A.; Feinberg, J. M.; Steffen, A.; Sternberg, R. S.

    2014-12-01

    Stone tools are one of the most enduring forms of ancient human behavior available to anthropologists. The geologic materials that comprise stone tools are a reflection of the rocks that were available locally or through trade, as are the intended use of the tools and the knapping technology needed to produce them. Investigation of the rock magnetic and geochemical characteristics of the artifacts and the geological source materials provides a baseline to explore these past behaviors. This study uses rock magnetic properties to explore the raw material selection criteria involved in the production of obsidian tools in the region around Valles Caldera in northern New Mexico. Obsidian is locally abundant and was traded by tribes across the central United States. Here we compare the rock magnetic properties of a sample of obsidian projectile points (N = 25) that have been geochemically sourced to the Cerro Toledo obsidian flow with geological samples collected from four sites within the same flow (N = 135). This collection of archaeological artifacts, albeit small, contains representatives of at least 8 different point styles that were used over 6000 years from the Archaic into the Late Prehistoric. Bulk rock hysteresis parameters (Mr, Ms, Bc, and Bcr) and low-field susceptibility (χ) measurements show that the projectile points generally contain a lower concentration of magnetic minerals than the geologic samples. For example, the artifacts' median Ms value is 2.9 × 10⁻³ Am² kg⁻¹, while that of the geological samples is 6.5 × 10⁻³ Am² kg⁻¹. The concentration of magnetic minerals in obsidian is a proxy for the concentration of microlites in general, and this relationship suggests that although obsidian was locally abundant, toolmakers employed non-random selection criteria resulting in generally lower concentrations of microlites in their obsidian tools.

  5. Characterization of intra-molecular distances and site-specific dynamics in chemically unfolded barstar: evidence for denaturant-dependent non-random structure.

    PubMed

    Saxena, Anoop M; Udgaonkar, Jayant B; Krishnamoorthy, G

    2006-05-26

    The structure and dynamics of the unfolded form of a protein are expected to play critical roles in determining folding pathways. In this study, the urea and guanidine hydrochloride (GdnHCl)-unfolded forms of the small protein barstar were explored by time-resolved fluorescence techniques. Barstar was labeled specifically with thionitrobenzoate (TNB), by coupling it to the thiol side-chain of a cysteine residue at one of the following positions on the sequence: 14, 25, 40, 42, 62, 82 and 89, in single cysteine-containing mutant proteins. Seven intra-molecular distances (R(DA)) under unfolding conditions were estimated from measurements of time-resolved fluorescence resonance energy transfer between the donor Trp53 and the non-fluorescent acceptor TNB coupled to one of the seven cysteine side-chains. The unfolded protein chain expands with an increase in the concentration of the denaturants. The extent of expansion was found to be non-uniform, with different intra-molecular distances expanding to different extents. In general, shorter distances were found to expand less than longer spans. The extent of expansion was higher for GdnHCl than for urea. A comparison of the measured values of R(DA) with those derived from a model based on excluded volume revealed that while shorter spans showed good agreement, the experimental values of R(DA) for longer spans were smaller than the theoretical values. Sequence-specific flexibility of the polypeptide was determined by time-resolved fluorescence anisotropy decay measurements on acrylodan- or 1,5-IAEDANS-labeled single cysteine-containing proteins under unfolding conditions. Rotational dynamics derived from these measurements indicated that the level of flexibility increased with increasing denaturant concentration and showed a graded increase towards the C-terminal end. Taken together, these results appear to indicate the presence of specific non-random coil structures and

  6. Non-random autosome segregation: a stepping stone for the evolution of sex chromosome complexes? Sex-biased transmission of autosomes could facilitate the spread of antagonistic alleles, and generate sex-chromosome systems with multiple X or Y chromosomes.

    PubMed

    Schwander, Tanja; Beukeboom, Leo W

    2011-02-01

    A new study in Caenorhabditis elegans shows that homologous autosomes segregate non-randomly with the sex chromosome in the heterogametic sex. Segregation occurs according to size, small autosomes segregating with, and large autosomes segregating away from the X-chromosome. Such sex-biased transmission of autosomes could facilitate the spread of sexually antagonistic alleles whose effects favor the fitness of one sex at the expense of the other. This may provide a first step toward the evolution of new sex determination systems.

  7. On Barnes Beta Distributions and Applications to the Maximum Distribution of the 2D Gaussian Free Field

    NASA Astrophysics Data System (ADS)

    Ostrovsky, Dmitry

    2016-09-01

    A new family of Barnes beta distributions on (0, ∞) is introduced and its infinite divisibility, moment determinacy, scaling, and factorization properties are established. The Morris integral probability distribution is constructed from Barnes beta distributions of types (1, 0) and (2, 2), and its moment determinacy and involution invariance properties are established. As an application, the maximum distributions of the 2D Gaussian free field on the unit interval and circle with a non-random logarithmic potential are conjecturally related to the critical Selberg and Morris integral probability distributions, respectively, and expressed in terms of sums of Barnes beta distributions of types (1, 0) and (2, 2).

  8. Non-random pairing of CD46 isoforms with skewing towards BC2 and C2 in activated and memory/effector T cells

    PubMed Central

    Hansen, Aida S.; Bundgaard, Bettina B.; Møller, Bjarne K.; Höllsberg, Per

    2016-01-01

    CD46 is a glycoprotein with important functions in innate and adaptive immune responses. Functionally different isoforms are generated by alternative splicing at exons 7–9 (BC and C isoforms) and exon 13 (CYT-1 and CYT-2 isoforms), giving rise to BC1, BC2, C1 and C2. We developed a novel real-time PCR assay that allows quantitative comparisons between these isoforms. Their relative frequencies in CD4+ T cells from 100 donors revealed a distribution with high inter-individual variability. Importantly, the distribution between the isoforms was not random, and although splicing favoured inclusion of exon 8 (BC isoforms), exclusion of exon 8 (C isoforms) was significantly linked to exclusion of exon 13 (CYT-2 isoforms). Despite inter-individual differences, CD4+ and CD8+ T cells, B cells, NK cells and monocytes expressed similar isoform profiles intra-individually. However, memory/effector CD4+ T cells had a significantly higher frequency of CYT-2 when compared with naïve CD4+ T cells. Likewise, in vitro activation of naïve and total CD4+ T cells increased the expression of CYT-2. This indicates that although splicing factors determine a certain expression profile in an individual, the profile can be modulated by external stimuli. This suggests a mechanism by which alterations in CD46 isoforms may temporarily regulate the immune response. PMID:27739531

  9. Non-Randomized Confirmatory Trial of Laparoscopy-Assisted Total Gastrectomy and Proximal Gastrectomy with Nodal Dissection for Clinical Stage I Gastric Cancer: Japan Clinical Oncology Group Study JCOG1401

    PubMed Central

    Kataoka, Kozo; Mizusawa, Junki; Katayama, Hiroshi; Nakamura, Kenichi; Morita, Shinji; Yoshikawa, Takaki; Ito, Seiji; Kinoshita, Takahiro; Fukagawa, Takeo; Sasako, Mitsuru

    2016-01-01

    Several prospective studies on laparoscopy-assisted distal gastrectomy for early gastric cancer have been initiated, but no prospective study evaluating laparoscopy-assisted total gastrectomy or laparoscopy-assisted proximal gastrectomy has been completed to date. A non-randomized confirmatory trial was commenced in April 2015 to evaluate the safety of laparoscopy-assisted total gastrectomy and laparoscopy-assisted proximal gastrectomy for clinical stage I gastric cancer. A total of 245 patients will be accrued from 42 Japanese institutions over 3 years. The primary endpoint is the proportion of patients with anastomotic leakage. The secondary endpoints are overall survival, relapse-free survival, proportion of patients with completed laparoscopy-assisted total gastrectomy or laparoscopy-assisted proximal gastrectomy, proportion of patients with conversion to open surgery, adverse events, and short-term clinical outcomes. The UMIN Clinical Trials Registry number is UMIN000017155. PMID:27433394

  10. Risk of Bias in Systematic Reviews of Non-Randomized Studies of Adverse Cardiovascular Effects of Thiazolidinediones and Cyclooxygenase-2 Inhibitors: Application of a New Cochrane Risk of Bias Tool

    PubMed Central

    Bilandzic, Anja; Fitzpatrick, Tiffany; Rosella, Laura; Henry, David

    2016-01-01

    Background Systematic reviews of the effects of healthcare interventions frequently include non-randomized studies. These are subject to confounding and a range of other biases that are seldom considered in detail when synthesizing and interpreting the results. Our aims were to assess the reliability and usability of a new Cochrane risk of bias (RoB) tool for non-randomized studies of interventions and to determine whether restricting analysis to studies with low or moderate RoB made a material difference to the results of the reviews. Methods and Findings We selected two systematic reviews of population-based, controlled non-randomized studies of the relationship between the use of thiazolidinediones (TZDs) and cyclooxygenase-2 (COX-2) inhibitors and major cardiovascular events. Two epidemiologists applied the Cochrane RoB tool and made assessments across the seven specified domains of bias for each of 37 component studies. Inter-rater agreement was measured using the weighted Kappa statistic. We grouped studies according to overall RoB and performed statistical pooling for (a) all studies and (b) only studies with low or moderate RoB. Kappa scores across the seven bias domains ranged from 0.50 to 1.0. In the COX-2 inhibitor review, two studies had low overall RoB, 14 had moderate RoB, and five had serious RoB. In the TZD review, six studies had low RoB, four had moderate RoB, four had serious RoB, and two had critical RoB. The pooled odds ratios for myocardial infarction, heart failure, and death for rosiglitazone versus pioglitazone remained significantly elevated when analyses were confined to studies with low or moderate RoB. However, the estimate for myocardial infarction declined from 1.14 (95% CI 1.07–1.24) to 1.06 (95% CI 0.99–1.13) when analysis was confined to studies with low RoB. Estimates of pooled relative risks of cardiovascular events with COX-2 inhibitors compared with no nonsteroidal anti-inflammatory drug changed little when analyses were
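
    As a hedged illustration of the inter-rater agreement statistic mentioned above, the following Python sketch computes a weighted Cohen's kappa for two raters' ordinal risk-of-bias judgments using scikit-learn; the ratings and the 0-3 coding are invented, and the review itself did not necessarily use this library or weighting scheme.

      from sklearn.metrics import cohen_kappa_score

      # Hypothetical ordinal RoB judgments (0=low, 1=moderate, 2=serious, 3=critical)
      # from two raters across the component studies of one review.
      rater_a = [0, 1, 1, 2, 1, 0, 3, 2, 1, 1]
      rater_b = [0, 1, 2, 2, 1, 1, 3, 2, 1, 0]

      # Linear weighting penalizes disagreements in proportion to their ordinal distance.
      kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
      print(f"weighted kappa = {kappa:.2f}")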

  11. Short-term intake of a Japanese-style healthy lunch menu contributes to prevention and/or improvement in metabolic syndrome among middle-aged men: a non-randomized controlled trial

    PubMed Central

    2014-01-01

    Background Metabolic syndrome is now widely appreciated as a cluster of metabolic abnormalities such as visceral obesity, hypertension, hyperglycemia and dyslipidemia. The incidence of metabolic syndrome is continuously increasing worldwide. In addition, low vegetable consumption has recently become a serious issue in Japan, and Japan is facing a shortfall in places offering food that can help prevent metabolic syndrome in the first place. Our study is designed to address these developments. We conducted a non-randomized controlled trial offering a Japanese-style healthy lunch menu, designed to prevent and reduce metabolic syndrome, to middle-aged men in a workplace cafeteria. Methods This intervention study took the form of a non-randomized controlled trial. Participants chose either the control or the intervention group. The control group consumed their habitual lunches without restriction and only nutrient contents were assessed. The intervention group received a Japanese-style healthy lunch at a workplace cafeteria for 3 months. The participants worked in offices at a city hall and mostly had low levels of physical activity. Data from 35 males (control group: 7 males, intervention group: 28 males, mean age: 47.2 ± 7.9 years) were collected and analyzed. Results We demonstrated that ongoing intake of a Japanese-style healthy lunch decreased blood pressure and serum lipids and increased plasma ghrelin levels. These effects grew more pronounced as the frequency of Japanese-style healthy lunch intake increased. Conclusions This study presents new empirical data from an original intervention program undertaken in Japan. A Japanese-style healthy lunch menu containing many vegetables can help prevent and/or improve metabolic syndrome. PMID:24673894

  12. Mammalian NUMT insertion is non-random

    PubMed Central

    Tsuji, Junko; Frith, Martin C.; Tomii, Kentaro; Horton, Paul

    2012-01-01

    It is well known that remnants of partial or whole copies of mitochondrial DNA, known as Nuclear MiTochondrial sequences (NUMTs), are found in nuclear genomes. Since whole genome sequences have become available, many bioinformatics studies have identified putative NUMTs and from those attempted to infer the factors involved in NUMT creation. These studies conclude that NUMTs represent randomly chosen regions of the mitochondrial genome. There is less consensus regarding the nuclear insertion sites of NUMTs: previous studies have discussed the possible role of retrotransposons, but some recent ones have reported no correlation or even anti-correlation between NUMT sites and retrotransposons. These studies have generally defined NUMT sites using BLAST with default parameters. We analyze a redefined set of human NUMTs, computed with a carefully considered protocol. We discover that the inferred insertion points of NUMTs have a strong tendency to have high predicted DNA curvature, occur in experimentally defined open chromatin regions and often occur immediately adjacent to A + T oligomers. We also show clear evidence that their flanking regions are indeed rich in retrotransposons. Finally we show that parts of the mitochondrial genome D-loop are under-represented as a source of NUMTs in primate evolution. PMID:22761406

  13. New Statistical Results on the Angular Distribution of Gamma-Ray Bursts

    SciTech Connect

    Balazs, Lajos G.; Horvath, Istvan; Vavrek, Roland

    2008-05-22

    We present the results of several statistical tests of the randomness of the angular sky distribution of gamma-ray bursts in the BATSE Catalog. Thirteen different tests were performed, based on Voronoi tesselation, minimal spanning trees and multifractal spectra, for five classes of gamma-ray bursts (short1, short2, intermediate, long1, long2) separately. The long1 and long2 classes are distributed randomly. The intermediate subclass, in accordance with the earlier results of the authors, is distributed non-randomly. Concerning the short subclass, earlier statistical tests also suggested some departure from the random distribution, but not at a high enough confidence level. The new tests presented in this article also suggest non-randomness here.

  14. RANDOMNESS of Numbers DEFINITION(QUERY:WHAT? V HOW?) ONLY Via MAXWELL-BOLTZMANN CLASSICAL-Statistics(MBCS) Hot-Plasma VS. Digits-Clumping Log-Law NON-Randomness Inversion ONLY BOSE-EINSTEIN QUANTUM-Statistics(BEQS) .

    NASA Astrophysics Data System (ADS)

    Siegel, Z.; Siegel, Edward Carl-Ludwig

    2011-03-01

    RANDOMNESS of Numbers cognitive-semantics DEFINITION VIA Cognition QUERY: WHAT???, NOT HOW?) VS. computer-``science" mindLESS number-crunching (Harrel-Sipser-...) algorithmics Goldreich "PSEUDO-randomness"[Not.AMS(02)] mea-culpa is ONLY via MAXWELL-BOLTZMANN CLASSICAL-STATISTICS(NOT FDQS!!!) "hot-plasma" REPULSION VERSUS Newcomb(1881)-Weyl(1914;1916)-Benford(1938) "NeWBe" logarithmic-law digit-CLUMPING/ CLUSTERING NON-Randomness simple Siegel[AMS Joint.Mtg.(02)-Abs. # 973-60-124] algebraic-inversion to THE QUANTUM and ONLY BEQS preferentially SEQUENTIALLY lower-DIGITS CLUMPING/CLUSTERING with d = 0 BEC, is ONLY VIA Siegel-Baez FUZZYICS=CATEGORYICS (SON OF TRIZ)/"Category-Semantics"(C-S), latter intersection/union of Lawvere(1964)-Siegel(1964)] category-theory (matrix: MORPHISMS V FUNCTORS) "+" cognitive-semantics'' (matrix: ANTONYMS V SYNONYMS) yields Siegel-Baez FUZZYICS=CATEGORYICS/C-S tabular list-format matrix truth-table analytics: MBCS RANDOMNESS TRUTH/EMET!!!

  15. Effectiveness of Kenya's Community Health Strategy in delivering community-based maternal and newborn health care in Busia County, Kenya: non-randomized pre-test post test study

    PubMed Central

    Wangalwa, Gilbert; Cudjoe, Bennett; Wamalwa, David; Machira, Yvonne; Ofware, Peter; Ndirangu, Meshack; Ilako, Festus

    2012-01-01

    Background Maternal mortality ratio and neonatal mortality rate trends in Kenya have remained unacceptably high over the past decade. In 2007, the Ministry of Public Health and Sanitation adopted a community health strategy to reverse the poor health outcomes in order to meet Millennium Development Goals 4 and 5. It aims at strengthening community participation and its ability to take action towards health. This study aimed at evaluating the effectiveness of the strategy in improving maternal and neonatal health outcomes in Kenya. Methods Between 2008 and 2010, the African Medical and Research Foundation implemented a community-based maternal and newborn care intervention package in Busia County using the community health strategy approach. An interventional, non-randomized pre-test post-test study design was used to evaluate changes in essential maternal and neonatal care practices among mothers with children aged 0-23 months. Results There were statistically significant (p < 0.05) increases in attendance of at least four antenatal care visits (39% to 62%), deliveries by skilled birth attendants (31% to 57%), receipt of intermittent preventive treatment (23% to 57%), testing for HIV during pregnancy (73% to 90%) and exclusive breastfeeding (20% to 52%). Conclusion The significant increase in essential maternal and neonatal care practices demonstrates that the community health strategy is an appropriate platform to deliver community-based interventions. The findings will be used by actors in the child survival community to improve current approaches, policies and practice in maternal and neonatal care. PMID:23467438

  16. Impact of a Multifaceted and Clinically Integrated Training Program in Evidence-Based Practice on Knowledge, Skills, Beliefs and Behaviour among Clinical Instructors in Physiotherapy: A Non-Randomized Controlled Study

    PubMed Central

    Olsen, Nina Rydland; Bradley, Peter; Espehaug, Birgitte; Nortvedt, Monica Wammen; Lygren, Hildegunn; Frisk, Bente; Bjordal, Jan Magnus

    2015-01-01

    Background and Purpose Physiotherapists practicing at clinical placement sites assigned the role of clinical instructors (CIs) are responsible for supervising physiotherapy students. For CIs to role model evidence-based practice (EBP) they need EBP competence. The aim of this study was to assess the short- and long-term impact of a six-month multifaceted and clinically integrated training program in EBP on the knowledge, skills, beliefs and behaviour of CIs supervising physiotherapy students. Methods We invited 37 CIs to participate in this non-randomized controlled study. Three self-administered questionnaires were used pre- and post-intervention, and at six-month follow-up: 1) the Adapted Fresno Test (AFT), 2) the EBP Belief Scale and 3) the EBP Implementation Scale. The analysis approach was linear regression modeling using Generalized Estimating Equations. Results In total, 29 CIs agreed to participate in the study: 14 were invited to participate in the intervention group and 15 were invited to participate in the control group. One in the intervention group and five in the control group were lost to follow-up. At follow-up, the group difference was statistically significant for the AFT (mean difference = 37, 95% CI 15.9 to 58.1, p<0.001) and the EBP Beliefs scale (mean difference = 8.1, 95% CI 3.1 to 13.2, p = 0.002), but not for the EBP Implementation scale (mean difference = 1.8, 95% CI -4.5 to 8.1, p = 0.574). Comparing measurements over time, we found a statistically significant increase in mean scores related to all outcome measures for the intervention group only. Conclusions A multifaceted and clinically integrated training program in EBP was successful in improving EBP knowledge, skills and beliefs among CIs. Future studies need to ensure long-term EBP behaviour change, in addition to assessing CIs' abilities to apply EBP knowledge and skills when supervising students. PMID:25894559
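
    The analysis above is described as linear regression with Generalized Estimating Equations (GEE). As a hedged sketch only, the following Python fragment fits a GEE model with an exchangeable working correlation using statsmodels; the data frame, column names and effect sizes are all invented and do not reproduce the study's model or data.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Hypothetical long-format data: one AFT score per clinical instructor per wave.
      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "ci_id": np.repeat(np.arange(29), 3),                 # 29 instructors, 3 waves
          "time":  np.tile([0, 1, 2], 29),                      # pre, post, follow-up
          "group": np.repeat(rng.integers(0, 2, 29), 3),        # 1 = intervention
      })
      df["aft"] = 100 + 15 * df["group"] * df["time"] + rng.normal(0, 10, len(df))

      # GEE accounts for repeated measures within each instructor (exchangeable correlation).
      model = sm.GEE.from_formula("aft ~ time * group", groups="ci_id", data=df,
                                  cov_struct=sm.cov_struct.Exchangeable(),
                                  family=sm.families.Gaussian())
      print(model.fit().summary())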

  17. Does school-based physical activity decrease overweight and obesity in children aged 6–9 years? A two-year non-randomized longitudinal intervention study in the Czech Republic

    PubMed Central

    2012-01-01

    Background Globally, efforts aimed at the prevention of childhood obesity have led to the implementation of a range of school-based interventions. This study assessed whether augmenting physical activity (PA) within the school setting resulted in increased daily PA and decreased overweight/obesity levels in 6-9-year-old children. Methods Across the first to third primary school years, the PA of 84 girls and 92 boys was objectively monitored five times (each for seven successive days) using a Yamax pedometer (step counts) and a Caltrac accelerometer (activity energy expenditure, AEE, in kcal/kg per day). Four schools were selected to participate in the research (2 intervention, 2 control), comprising intervention (43 girls, 45 boys) and control children (41 girls, 47 boys). The study was non-randomized and the intervention schools were selected on the basis of an existing PA-conducive environment. Analyses of variance (ANOVA) for repeated measures examined the effects of the PA programme and gender on step counts and AEE. Logistic regression (Enter method) determined the prospect of overweight and obesity occurrence over the course of implementation of the PA intervention. Results There was a significant increase in school-based PA during schooldays in intervention children (from ≈ 1718 to ≈ 3247 steps per day, and from ≈ 2.1 to ≈ 3.6 kcal/kg per day) in comparison with the control children. The increased school-based PA of intervention children during schooldays contributed to them achieving >10,500 steps and >10.5 kcal/kg per school day across the 2 years of the study, and halted the decline in PA levels that is known to be associated with the increasing age of children. Increased school-based PA also had a positive impact on the leisure-time PA of schooldays and on PA at weekends of intervention children. One year after the start of the PA intervention, the odds of being overweight or obese in the intervention children was almost three times lower than that of

  18. Evaluating the Effectiveness of an Antimicrobial Stewardship Program on Reducing the Incidence Rate of Healthcare-Associated Clostridium difficile Infection: A Non-Randomized, Stepped Wedge, Single-Site, Observational Study

    PubMed Central

    McArthur, Leslie

    2016-01-01

    Background The incidence rate of healthcare-associated Clostridium difficile infection (HA-CDI) is estimated at 1 in 100 patients. Antibiotic exposure is the most consistently reported risk factor for HA-CDI. Strategies to reduce the risk of HA-CDI have focused on reducing antibiotic utilization. Prospective audit and feedback is a commonly used antimicrobial stewardship intervention (ASi). The impact of this ASi on the risk of HA-CDI is equivocal. This study examines the effectiveness of a prospective audit and feedback ASi on reducing the risk of HA-CDI. Methods Single-site study at a 339-bed community hospital in Barrie, Ontario, Canada. The primary outcome is the HA-CDI incidence rate. The daily prospective audit and feedback ASi is the exposure variable. The ASi was implemented across 6 wards in a non-randomized, stepped wedge design. Criteria for the ASi were: any intravenous antibiotic use for ≥ 48 hrs, any oral fluoroquinolone or oral second-generation cephalosporin use for ≥ 48 hrs, or any antimicrobial use for ≥ 5 days. HA-CDI cases and model covariates were aggregated by ward, year and month starting September 2008 and ending February 2016. Multi-level mixed-effect negative binomial regression analysis was used to model the primary outcome, with intercept and slope coefficients for ward-level random effects estimated. Other covariates tested for inclusion in the final model were derived from previously published risk factors. Deviance residuals were used to assess the model's goodness-of-fit. Findings The dataset included 486 observation periods, of which 350 were control periods and 136 were intervention periods. After accounting for all other model covariates, the estimated overall ASi incidence rate ratio (IRR) was 0.48 (95% CI 0.30, 0.79). The ASi effect was independent of antimicrobial utilization. The ASi did not seem to reduce the risk of Clostridium difficile infection on the surgery wards (IRR 0.87, 95% CI 0.45, 1.69) compared to the medicine wards (IRR 0.42, 95% CI 0.28, 0.63). The ward
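
    The study models ward-level HA-CDI counts with a multi-level mixed-effect negative binomial regression. The Python sketch below is a deliberately simplified stand-in: an ordinary (non-mixed) negative binomial GLM with patient-days as exposure, showing only how an incidence rate ratio (IRR) for the intervention period is read off a fitted coefficient. Data, rates and column names are invented and the ward-level random effects are omitted.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Hypothetical ward-month counts of HA-CDI cases with patient-days as exposure.
      rng = np.random.default_rng(42)
      n = 486
      df = pd.DataFrame({
          "intervention": rng.integers(0, 2, n),        # 1 = audit-and-feedback period
          "patient_days": rng.integers(500, 1500, n),
      })
      base_rate = 0.004
      df["cases"] = rng.poisson(base_rate * np.where(df["intervention"] == 1, 0.5, 1.0)
                                * df["patient_days"])

      # Negative binomial GLM (log link) with an exposure offset; exp(coef) is the IRR.
      X = sm.add_constant(df[["intervention"]])
      nb = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial(),
                  exposure=df["patient_days"]).fit()
      print("IRR for intervention:", np.exp(nb.params["intervention"]))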

  19. On The Distribution Of Angular Orbital Elements Of Near-earth Objects

    NASA Astrophysics Data System (ADS)

    JeongAhn, Youngmin; Malhotra, R.

    2012-05-01

    The longitude of ascending node Ω and the argument of periapsis ω are expected to be randomly distributed for near-Earth objects (NEOs). However, the distribution of these angles for the Apollo, Amor and Aten subclasses, considered separately, shows some striking non-random features. We explain how these features arise from observational biases. The distribution of Ω has maxima near 0° and 180° and is affected by the observational difficulty caused by the galactic plane near opposition and by other seasonal effects. The ω distributions of the Aten and Amor subclasses have minima at 90° and 270°, while the Apollos have minima at 0° and 180°. This is explained by the greater detectability of NEOs at close approach to Earth. The longitude of perihelion Ω+ω also has a strongly non-random distribution that may be due to actual dynamical effects. Understanding the distribution of unobserved NEOs will help to improve planning for the next generation of NEO surveys. A better understanding of the intrinsic distribution of NEOs is important for estimating the impact hazard at Earth; it is also important for understanding the impact history of the Moon and the terrestrial planets.

  20. Fish depth distributions in the Lower Mississippi River

    USGS Publications Warehouse

    Killgore, K. J.; Miranda, Leandro E.

    2014-01-01

    A substantial body of literature exists about depth distribution of fish in oceans, lakes and reservoirs, but less is known about fish depth distribution in large rivers. Most of the emphasis on fish distributions in rivers has focused on longitudinal and latitudinal spatial distributions. Knowledge on depth distribution is necessary to understand species and community habitat needs. Considering this void, our goal was to identify patterns in fish benthic distribution along depth gradients in the Lower Mississippi River. Fish were collected over 14 years in depths down to 27 m. Fish exhibited non-random depth distributions that varied seasonally and according to species. Species richness was highest in shallow water, with about 50% of the 62 species detected no longer collected in water deeper than 8 m and about 75% no longer collected in water deeper than 12 m. Although richness was highest in shallow water, most species were not restricted to shallow water. Rather, most species used a wide range of depths. A weak depth zonation occurred, not as strong as that reported for deep oceans and lakes. Larger fish tended to occur in deeper water during the high-water period of an annual cycle, but no correlation was evident during the low-water period. The advent of landscape ecology has guided river research to search for spatial patterns along the length of the river and associated floodplains. Our results suggest that fish assemblages in large rivers are also structured vertically. 

  1. The Use of Control in Non-Randomized Designs.

    ERIC Educational Resources Information Center

    Halperin, Si; Jorgensen, Randall

    The concept of control is fundamental to comparative research. In research designs where randomization of observational units is not possible, control has been exercised statistically from a single covariate by a process of residualization. The alternative, known as subclassification on the propensity score, was developed primarily for…

  2. From non-random molecular structure to life and mind

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1989-01-01

    The evolutionary hierarchy molecular structure-->macromolecular structure-->protobiological structure-->biological structure-->biological functions has been traced by experiments. The sequence always moves through protein. Extension of the experiments traces the formation of nucleic acids instructed by proteins. The proteins themselves were, in this picture, instructed by the self-sequencing of precursor amino acids. While the sequence indicated explains the thread of the emergence of life, protein in cellular membrane also provides the only known material basis for the emergence of mind in the context of emergence of life.

  3. On Authentication with HMAC and Non-random Properties

    NASA Astrophysics Data System (ADS)

    Rechberger, Christian; Rijmen, Vincent

    MAC algorithms can provide cryptographically secure authentication services. One of the most popular algorithms in commercial applications is HMAC based on the hash functions MD5 or SHA-1. In the light of new collision search methods for members of the MD4 family including SHA-1, the security of HMAC based on these hash functions is reconsidered.

  4. Non-random walks in monkeys and humans

    PubMed Central

    Boyer, Denis; Crofoot, Margaret C.; Walsh, Peter D.

    2012-01-01

    Principles of self-organization play an increasingly central role in models of human activity. Notably, individual human displacements exhibit strongly recurrent patterns that are characterized by scaling laws and can be mechanistically modelled as self-attracting walks. Recurrence is not, however, unique to human displacements. Here we report that the mobility patterns of wild capuchin monkeys are not random walks, and they exhibit recurrence properties similar to those of cell phone users, suggesting spatial cognition mechanisms shared with humans. We also show that the highly uneven visitation patterns within monkey home ranges are not entirely self-generated but are forced by spatio-temporal habitat heterogeneities. If models of human mobility are to become useful tools for predictive purposes, they will need to consider the interaction between memory and environmental heterogeneities. PMID:22031731
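
    The abstract refers to displacements being modelled as self-attracting walks. The toy Python sketch below implements a generic preferential-return walker on a 2-D lattice (revisit probability proportional to past visit counts); it is not the authors' model, and the step rule and parameters are arbitrary illustration choices.

      import random
      from collections import Counter

      def self_attracting_walk(steps=1000, p_return=0.3, seed=0):
          """Toy walker: with probability p_return it jumps back to a previously
          visited site chosen in proportion to past visit counts; otherwise it
          takes a random nearest-neighbour step on a 2-D lattice."""
          random.seed(seed)
          pos = (0, 0)
          visits = Counter({pos: 1})
          for _ in range(steps):
              if random.random() < p_return:
                  sites, counts = zip(*visits.items())
                  pos = random.choices(sites, weights=counts)[0]
              else:
                  dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                  pos = (pos[0] + dx, pos[1] + dy)
              visits[pos] += 1
          return visits

      if __name__ == "__main__":
          v = self_attracting_walk()
          print("distinct sites visited:", len(v), "over", sum(v.values()) - 1, "steps")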

  5. Distribute What?

    ERIC Educational Resources Information Center

    Brown, Wayne A.

    1989-01-01

    Distributed Data Processing, linking a central processing unit to remote computer sites, allows end users more control over their own destiny. Schools have distributed hardware and software but not talent. The primary goal of these staff experts should be to educate users as fully as they can. (MLW)

  6. Distributed Intelligence.

    ERIC Educational Resources Information Center

    McLagan, Patricia A.

    2003-01-01

    Distributed intelligence occurs when people in an organization take responsibility for creating innovations, solving problems, and making decisions. Organizations that have it excel in their markets and the global environment. (Author/JOW)

  7. Class Size, Class Composition, and the Distribution of Student Achievement

    ERIC Educational Resources Information Center

    Bosworth, Ryan

    2014-01-01

    Using richly detailed data on fourth- and fifth-grade students in the North Carolina public school system, I find evidence that students are assigned to classrooms in a non-random manner based on observable characteristics for a substantial portion of classrooms. Moreover, I find that this non-random assignment is statistically related to class…

  8. Distributed Leadership.

    ERIC Educational Resources Information Center

    Lashway, Larry

    2003-01-01

    School-reform efforts in recent years have stressed, and expanded, the leadership role of the principal. But in the view of many analysts, the task of transforming a school is too complex for one person to accomplish alone. Consequently, a new model of leadership is developing: distributed leadership. This Research Roundup summarizes five…

  9. Genomic distribution of simple sequence repeats in Brassica rapa.

    PubMed

    Hong, Chang Pyo; Piao, Zhong Yun; Kang, Tae Wook; Batley, Jacqueline; Yang, Tae-Jin; Hur, Yoon-Kang; Bhak, Jong; Park, Beom-Seok; Edwards, David; Lim, Yong Pyo

    2007-06-30

    Simple Sequence Repeats (SSRs) represent short tandem duplications found within all eukaryotic organisms. To examine the distribution of SSRs in the genome of Brassica rapa ssp. pekinensis, SSRs from different genomic regions representing 17.7 Mb of genomic sequence were surveyed. SSRs appear more abundant in non-coding regions (86.6%) than in coding regions (13.4%). Comparison of SSR densities in different genomic regions demonstrated that SSR density was greatest within the 5'-flanking regions of the predicted genes. The proportion of different repeat motifs varied between genomic regions, with trinucleotide SSRs more prevalent in predicted coding regions, reflecting the codon structure in these regions. SSRs were also preferentially associated with gene-rich regions, with peri-centromeric heterochromatin SSRs mostly associated with retrotransposons. These results indicate that the distribution of SSRs in the genome is non-random. Comparison of SSR abundance between B. rapa and the closely related species Arabidopsis thaliana suggests a greater abundance of SSRs in B. rapa, which may be due to the proposed genome triplication. Our results provide a comprehensive view of SSR genomic distribution and evolution in Brassica for comparison with the sequenced genomes of A. thaliana and Oryza sativa.
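
    As a rough illustration of how SSRs of the kind surveyed above can be located in sequence data, the following Python sketch scans for perfect tandem repeats of 1-6 bp motifs with a regular expression; the minimum-repeat thresholds are arbitrary, and a long homopolymer run may also be reported under a longer motif length (e.g. as a dinucleotide repeat).

      import re

      def find_ssrs(seq, min_repeats=(10, 6, 5, 5, 5, 5)):
          """Scan a DNA sequence for perfect SSRs with motif lengths 1-6 bp.
          min_repeats[k-1] is the minimum repeat count for a k-bp motif
          (thresholds here are arbitrary illustration values)."""
          seq = seq.upper()
          hits = []
          for k, min_rep in zip(range(1, 7), min_repeats):
              # group 2 captures the motif; the back-reference enforces tandem repetition
              for m in re.finditer(r"(([ACGT]{%d})\2{%d,})" % (k, min_rep - 1), seq):
                  hits.append((m.start(), m.group(2), len(m.group(1)) // k))
          return hits

      if __name__ == "__main__":
          demo = "GGATCGATCGATCGATCGATCGTTAAAAAAAAAAAAGC"
          for start, motif, n in find_ssrs(demo):
              print(f"pos {start}: ({motif})x{n}")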

  10. Distributed Optimization

    NASA Technical Reports Server (NTRS)

    Macready, William; Wolpert, David

    2005-01-01

    We demonstrate a new framework for analyzing and controlling distributed systems, by solving constrained optimization problems with an algorithm based on that framework. The framework is an information-theoretic extension of conventional full-rationality game theory that allows boundedly rational agents. The associated optimization algorithm is a game in which agents control the variables of the optimization problem. They do this by jointly minimizing a Lagrangian of (the probability distribution of) their joint state. The updating of the Lagrange parameters in that Lagrangian is a form of automated annealing, one that focuses the multi-agent system on the optimal pure strategy. We present computer experiments for the k-sat constraint satisfaction problem and for unconstrained minimization of NK functions.
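
    A minimal Python caricature of this kind of approach, under the assumption that each agent holds an independent mixed strategy and reweights its moves by the exponential of the expected cost given the other agent's distribution, with an annealed temperature. The objective, constants and update schedule below are invented and do not reproduce the authors' algorithm.

      import math

      # Toy objective over two binary variables; the joint minimum is at (1, 0).
      def G(x):
          x1, x2 = x
          return (x1 - 1) ** 2 + (x2 - 0) ** 2 + 0.5 * x1 * x2

      def anneal(steps=200, t0=1.0, cooling=0.98):
          """Each agent keeps an independent distribution over {0, 1} and repeatedly
          reweights its moves by exp(-E[G | move] / T), annealing T toward 0.
          Illustrative update rule; not the algorithm from the abstract."""
          p = [{0: 0.5, 1: 0.5}, {0: 0.5, 1: 0.5}]   # agents' mixed strategies
          T = t0
          for _ in range(steps):
              for i in range(2):
                  new = {}
                  for move in (0, 1):
                      # expected cost of 'move' under the other agent's distribution
                      exp_cost = sum(p[1 - i][o] * G((move, o) if i == 0 else (o, move))
                                     for o in (0, 1))
                      new[move] = math.exp(-exp_cost / T)
                  z = sum(new.values())
                  p[i] = {m: w / z for m, w in new.items()}
              T *= cooling
          return p

      if __name__ == "__main__":
          strategies = anneal()
          print("agent 1:", strategies[0])
          print("agent 2:", strategies[1])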

  11. Distributed SLAM

    NASA Astrophysics Data System (ADS)

    Binns, Lewis A.; Valachis, Dimitris; Anderson, Sean; Gough, David W.; Nicholson, David; Greenway, Phil

    2002-07-01

    Previously, we have developed techniques for Simultaneous Localization and Map Building based on the augmented state Kalman filter. Here we report the results of experiments conducted over multiple vehicles each equipped with a laser range finder for sensing the external environment, and a laser tracking system to provide highly accurate ground truth. The goal is simultaneously to build a map of an unknown environment and to use that map to navigate a vehicle that otherwise would have no way of knowing its location, and to distribute this process over several vehicles. We have constructed an on-line, distributed implementation to demonstrate the principle. In this paper we describe the system architecture, the nature of the experimental set up, and the results obtained. These are compared with the estimated ground truth. We show that distributed SLAM has a clear advantage in the sense that it offers a potential super-linear speed-up over single vehicle SLAM. In particular, we explore the time taken to achieve a given quality of map, and consider the repeatability and accuracy of the method. Finally, we discuss some practical implementation issues.
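
    The Python sketch below is a single-vehicle, one-dimensional augmented-state Kalman filter with one landmark, intended only to illustrate the augmented-state idea underlying SLAM; the distributed multi-vehicle machinery, laser sensing model and noise values of the experiments described above are not reproduced, and all parameters here are invented.

      import numpy as np

      def slam_1d(steps=50, true_landmark=10.0, seed=0):
          """Minimal single-vehicle, 1-D augmented-state Kalman filter:
          state = [robot position, landmark position]."""
          rng = np.random.default_rng(seed)
          q, r, u = 0.05, 0.1, 0.2           # process noise, range noise, control step
          x_true = 0.0
          x = np.array([0.0, 5.0])           # initial estimate (landmark guess is poor)
          P = np.diag([0.01, 25.0])          # confident about robot, not about landmark
          F = np.eye(2)
          Q = np.diag([q, 0.0])              # landmark is static
          H = np.array([[-1.0, 1.0]])        # measurement: range = landmark - robot
          for _ in range(steps):
              # motion (prediction)
              x_true += u + rng.normal(0, np.sqrt(q))
              x = F @ x + np.array([u, 0.0])
              P = F @ P @ F.T + Q
              # range measurement to the landmark (update)
              z = (true_landmark - x_true) + rng.normal(0, np.sqrt(r))
              y = z - (H @ x)[0]
              S = (H @ P @ H.T)[0, 0] + r
              K = (P @ H.T / S).ravel()
              x = x + K * y
              P = P - np.outer(K, H @ P)
          return x, P

      if __name__ == "__main__":
          est, cov = slam_1d()
          print("estimated robot, landmark:", est, "\ncovariance:\n", cov)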

  12. Distributed Saturation

    NASA Technical Reports Server (NTRS)

    Chung, Ming-Ying; Ciardo, Gianfranco; Siminiceanu, Radu I.

    2007-01-01

    The Saturation algorithm for symbolic state-space generation has been a recent breakthrough in the exhaustive verification of complex systems, in particular globally-asynchronous/locally-synchronous systems. The algorithm uses a very compact Multiway Decision Diagram (MDD) encoding for states and the fastest symbolic exploration algorithm to date. The distributed version of Saturation uses the overall memory available on a network of workstations (NOW) to efficiently spread the memory load during the highly irregular exploration. A crucial factor in limiting memory consumption during symbolic state-space generation is the ability to perform garbage collection to free the memory occupied by dead nodes. However, garbage collection over a NOW requires a nontrivial communication overhead. In addition, operation cache policies become critical when analyzing large-scale systems using the symbolic approach. In this technical report, we develop a garbage collection scheme and several operation cache policies to help solve extremely complex systems. Experiments show that our schemes improve the performance of the original distributed implementation, SmArTNow, in terms of time and memory efficiency.

  13. Fish species and community distributions as proxies for sea-floor habitat distributions: the Stellwagen Bank National Marine Sanctuary example (northwest Atlantic, Gulf Of Maine)

    USGS Publications Warehouse

    Auster, Peter J.; Joy, Kevin; Valentine, Page C.

    2001-01-01

    Defining the habitats of fishes and associated fauna on outer continental shelves is problematic given the paucity of data on the actual types and distributions of seafloor habitats. However many regions have good data on the distributions of fishes from resource surveys or catch statistics because of the economic importance of the fisheries. Fish distribution data (species or communities) have been used as a proxy for the distribution of habitats to develop precautionary conservation strategies for habitat protection (e.g., marine protected areas, fishing gear restrictions). In this study we assessed the relationships between the distributions of fish communities and species derived from trawl survey data with the spatial distribution of sediment types determined by sampling and acoustic reflectance derived from multibeam sonar surveys in Stellwagen Bank National Marine Sanctuary. Fish communities were correlated with reflectance values but all communities did not occur in unique sediment types. This suggests that use of community distributions as proxies for habitats should include the caveat that a greater number of communities within an area could indicate a greater range of habitat types. Single species distributions showed relationships between abundance and reflectance values. Trawl catches with low abundances had wide variations in reflectance values while those with high abundances had narrower ranges indicating habitat affinities. Significant non-random frequency-dependent relationships were observed for 17 of 20 species although only 12 of 20 species had significant relationships based on rank correlation. These results suggest that species distributions based on trawl survey data can be used as proxies for the distribution of seafloor habitats. Species with known habitat associations can be used to infer habitat requirements of co-occurring species and can be used to identify a range of habitat types.

  14. Fish species and community distributions as proxies for seafloor habitat distributions: The Stellwagen Bank National Marine Sanctuary example (Northwest Atlantic, Gulf of Maine)

    USGS Publications Warehouse

    Auster, P.J.; Joy, K.; Valentine, P.C.

    2001-01-01

    Defining the habitats of fishes and associated fauna on outer continental shelves is problematic given the paucity of data on the actual types and distributions of seafloor habitats. However many regions have good data on the distributions of fishes from resource surveys or catch statistics because of the economic importance of the fisheries. Fish distribution data (species or communities) have been used as a proxy for the distribution of habitats to develop precautionary conservation strategies for habitat protection (e.g., marine protected areas, fishing gear restrictions). In this study we assessed the relationships between the distributions of fish communities and species derived from trawl survey data with the spatial distribution of sediment types determined by sampling and acoustic reflectance derived from multibeam sonar surveys in Stellwagen Bank National Marine Sanctuary. Fish communities were correlated with reflectance values but all communities did not occur in unique sediment types. This suggests that use of community distributions as proxies for habitats should include the caveat that a greater number of communities within an area could indicate a greater range of habitat types. Single species distributions showed relationships between abundance and reflectance values. Trawl catches with low abundances had wide variations in reflectance values while those with high abundances had narrower ranges indicating habitat affinities. Significant non-random frequency-dependent relationships were observed for 17 of 20 species although only 12 of 20 species had significant relationships based on rank correlation. These results suggest that species distributions based on trawl survey data can be used as proxies for the distribution of seafloor habitats. Species with known habitat associations can be used to infer habitat requirements of co-occurring species and can be used to identify a range of habitat types.

  15. Effective Suppression of Pathological Synchronization in Cortical Networks by Highly Heterogeneous Distribution of Inhibitory Connections

    PubMed Central

    Kada, Hisashi; Teramae, Jun-Nosuke; Tokuda, Isao T.

    2016-01-01

    Even without external random input, cortical networks in vivo sustain asynchronous irregular firing with low firing rates. In addition to detailed balance between excitatory and inhibitory activities, recent theoretical studies have revealed that another feature commonly observed in cortical networks, i.e., long-tailed distribution of excitatory synapses implying coexistence of many weak and a few extremely strong excitatory synapses, plays an essential role in realizing the self-sustained activity in recurrent networks of biologically plausible spiking neurons. The previous studies, however, have not considered highly non-random features of the synaptic connectivity, namely, that bidirectional connections between cortical neurons are more common than expected by chance and that synaptic strengths are positively correlated between pre- and postsynaptic neurons. The positive correlation of synaptic connections may destabilize asynchronous activity of networks with the long-tailed synaptic distribution and induce pathological synchronized firing among neurons. It remains unclear how the cortical network avoids such pathological synchronization. Here, we demonstrate that introduction of the correlated connections indeed gives rise to synchronized firing in a cortical network model with the long-tailed distribution. By using a simplified feed-forward network model of spiking neurons, we clarify the underlying mechanism of the synchronization. We then show that the synchronization can be efficiently suppressed by a highly heterogeneous distribution, typically a lognormal distribution, of inhibitory-to-excitatory connection strengths in a recurrent network model of cortical neurons. PMID:27803659

  16. Progress in characterizing submonolayer island growth: Capture-zone distributions, growth exponents, & hot precursors

    NASA Astrophysics Data System (ADS)

    Einstein, Theodore L.; Pimpinelli, Alberto; González, Diego Luis; Morales-Cifuentes, Josue R.

    2015-09-01

    In studies of epitaxial growth, analysis of the distribution of the areas of capture zones (i.e. proximity polygons or Voronoi tessellations with respect to island centers) is often the best way to extract the critical nucleus size i. For non-random nucleation the normalized areas s of these Voronoi cells are well described by the generalized Wigner distribution (GWD) P_β(s) = a s^β exp(-b s²), particularly in the central region 0.5 < s < 2 where data are least noisy. Extensive Monte Carlo simulations reveal inadequacies of our earlier mean field analysis, suggesting β = i + 2 for diffusion-limited aggregation (DLA). Since simulations generate orders of magnitude more data than experiments, they permit close examination of the tails of the distribution, which differ from the simple GWD form. One refinement is based on a fragmentation model. We also compare analyses based on island-size distributions and on the scaling of island density with flux. Modifications appear for attach-limited aggregation (ALA). We focus on the experimental system para-hexaphenyl on amorphous mica, comparing the results of the three analysis techniques and reconciling them via a novel model of hot precursors based on rate equations, pointing out the existence of intermediate scaling regimes between DLA and ALA.
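
    A minimal Python sketch (ours, not the authors') of the GWD quoted above: the constants a and b are fixed by requiring unit area and unit mean for the normalized capture-zone size s, and β = i + 2 is the DLA relation mentioned in the abstract.

      import numpy as np
      from scipy.special import gamma as G

      def gwd(s, beta):
          # Generalized Wigner distribution P_beta(s) = a * s**beta * exp(-b * s**2),
          # with a, b set by unit area and unit mean of the normalized cell size s.
          b = (G((beta + 2) / 2) / G((beta + 1) / 2)) ** 2
          a = 2 * b ** ((beta + 1) / 2) / G((beta + 1) / 2)
          return a * s ** beta * np.exp(-b * s ** 2)

      s = np.linspace(0.01, 3.0, 300)
      for i in (1, 2, 3):                      # candidate critical nucleus sizes
          p = gwd(s, beta=i + 2)               # beta = i + 2 proposed for DLA
          print(i, round(s[np.argmax(p)], 2))  # most probable normalized area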

  17. Product Distributions for Distributed Optimization. Chapter 1

    NASA Technical Reports Server (NTRS)

    Bieniawski, Stefan R.; Wolpert, David H.

    2004-01-01

    With connections to bounded rational game theory, information theory and statistical mechanics, Product Distribution (PD) theory provides a new framework for performing distributed optimization. Furthermore, PD theory extends and formalizes Collective Intelligence, thus connecting distributed optimization to distributed Reinforcement Learning (RL). This paper provides an overview of PD theory and details an algorithm for performing optimization derived from it. The approach is demonstrated on two unconstrained optimization problems, one with discrete variables and one with continuous variables. To highlight the connections between PD theory and distributed RL, the results are compared with those obtained using distributed reinforcement-learning-inspired optimization approaches. The inter-relationship of the techniques is discussed.

  18. Shallow stratigraphic control on pockmark distribution in north temperate estuaries

    USGS Publications Warehouse

    Brothers, Laura L.; Kelley, Joseph T.; Belknap, Daniel F.; Barnhardt, Walter A.; Andrews, Brian D.; Legere, Christine; Hughes-Clarke, John E.

    2012-01-01

    Pockmark fields occur throughout northern North American temperate estuaries despite the absence of extensive thermogenic hydrocarbon deposits typically associated with pockmarks. In such settings, the origins of the gas and triggering mechanism(s) responsible for pockmark formation are not obvious. Nor is it known why pockmarks proliferate in this region but do not occur south of the glacial terminus in eastern North America. This paper tests two hypotheses addressing these knowledge gaps: 1) the region's unique sea-level history provided a terrestrial deposit that sourced the gas responsible for pockmark formation; and 2) the region's physiography controls pockmark distribution. This study integrates over 2500 km of high-resolution swath bathymetry, Chirp seismic reflection profiles and vibracore data acquired in three estuarine pockmark fields in the Gulf of Maine and Bay of Fundy. Vibracores sampled a hydric paleosol lacking the organic-rich upper horizons, indicating that an organic-rich terrestrial deposit was eroded prior to pockmark formation. This observation suggests that the gas, which is presumably responsible for the formation of the pockmarks, originated in Holocene estuarine sediments (loss on ignition 3.5–10%), not terrestrial deposits that were subsequently drowned and buried by mud. The 7470 pockmarks identified in this study are non-randomly clustered. Pockmark size and distribution relate to Holocene sediment thickness (r² = 0.60), basin morphology and glacial deposits. The irregular underlying topography that dictates Holocene sediment thickness may ultimately play a more important role in temperate estuarine pockmark distribution than drowned terrestrial deposits. These results give insight into the conditions necessary for pockmark formation in nearshore coastal environments.

  19. Distributive Education. Physical Distribution. Instructor's Curriculum.

    ERIC Educational Resources Information Center

    Missouri Univ., Columbia. Instructional Materials Lab.

    This distributive education performance-based instructional unit is designed to help students understand the system of physical distribution and to act as an aid to guiding students in preparing for future careers in the transportation industry dealing with the retail, wholesale, and service occupations. (Physical distribution involves the moving…

  20. On S.N. Bernstein's derivation of Mendel's Law and 'rediscovery' of the Hardy-Weinberg distribution.

    PubMed

    Stark, Alan; Seneta, Eugene

    2012-04-01

    Around 1923 the soon-to-be famous Soviet mathematician and probabilist Sergei N. Bernstein started to construct an axiomatic foundation of a theory of heredity. He began from the premise of stationarity (constancy of type proportions) from the first generation of offspring. This led him to derive the Mendelian coefficients of heredity. It appears that he had no direct influence on the subsequent development of population genetics. A basic assumption of Bernstein was that parents coupled randomly to produce offspring. This paper shows that a simple model of non-random mating, which nevertheless embodies a feature of the Hardy-Weinberg Law, can produce Mendelian coefficients of heredity while maintaining the population distribution. How W. Johannsen's monograph influenced Bernstein is discussed.

  1. On S.N. Bernstein's derivation of Mendel's Law and 'rediscovery' of the Hardy-Weinberg distribution.

    PubMed

    Stark, Alan; Seneta, Eugene

    2012-04-01

    Around 1923 the soon-to-be famous Soviet mathematician and probabilist Sergei N. Bernstein started to construct an axiomatic foundation of a theory of heredity. He began from the premise of stationarity (constancy of type proportions) from the first generation of offspring. This led him to derive the Mendelian coefficients of heredity. It appears that he had no direct influence on the subsequent development of population genetics. A basic assumption of Bernstein was that parents coupled randomly to produce offspring. This paper shows that a simple model of non-random mating, which nevertheless embodies a feature of the Hardy-Weinberg Law, can produce Mendelian coefficients of heredity while maintaining the population distribution. How W. Johannsen's monograph influenced Bernstein is discussed. PMID:22888285

  2. A simple model for pollen-parent fecundity distributions in bee-pollinated forage legume polycrosses.

    PubMed

    Riday, Heathcliffe; Smith, Mark A; Peel, Michael D

    2015-09-01

    A simple Weibull-distribution-based empirical model that predicts pollen-parent fecundity distributions based on polycross size alone has been developed in outbred forage legume species for incorporation into quantitative genetic theory. Random mating or panmixis is a fundamental assumption in quantitative genetic theory. Random mating is sometimes thought to occur in actual fact, although a large body of empirical work shows that this is often not the case in nature. Models have been developed to explain many non-random mating phenomena. This paper measured pollen-parent fecundity distributions among outbred perennial forage legume species [autotetraploid alfalfa (Medicago sativa L.), autohexaploid kura clover (Trifolium ambiguum M. Bieb.), and diploid red clover (Trifolium pratense L.)] in ten polycrosses ranging in size (N) from 9 to 94, pollinated by bumble bees (Bombus impatiens Cr.) and leafcutter bees (Megachile rotundata F.). A Weibull distribution best fit the observed pollen-parent fecundity distributions. After standardizing data among the 10 polycrosses, a single Weibull-distribution-based model was obtained with an R² of 0.978. The model is able to predict pollen-parent fecundity distributions based on polycross size alone. The model predicts that the effective polycross size will be approximately 9% smaller than under random mating (i.e., Ne/N ~ 0.91). The model is simple and can easily be incorporated into other models or simulations requiring a pollen-parent fecundity distribution. Further work is needed to determine how widely applicable the model is. PMID:26105686
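
    As an illustration of the kind of model described (not the authors' code), the sketch below fits a two-parameter Weibull to hypothetical pollen-parent fecundity counts; the counts and parameter values are invented for the example.

      import numpy as np
      from scipy import stats

      # Hypothetical offspring-per-pollen-parent counts for a polycross of N = 20 parents.
      fecundity = np.array([ 2,  5,  6,  8,  9, 11, 12, 14, 15, 17,
                            18, 20, 22, 25, 27, 30, 34, 39, 45, 61], dtype=float)

      # Maximum-likelihood fit of a two-parameter Weibull (location fixed at zero).
      shape, loc, scale = stats.weibull_min.fit(fecundity, floc=0)
      print(f"Weibull shape = {shape:.2f}, scale = {scale:.2f}")

      # Under random mating each parent would sire 1/N of the offspring; the spread
      # of observed shares is one way to quantify the departure from panmixis.
      share = fecundity / fecundity.sum()
      print(f"largest/smallest pollen-parent share: {share.max() / share.min():.1f}")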

  3. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1987-01-01

    The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
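
    Two of the techniques named above are easy to sketch on synthetic data (the angles below are simulated, not JACEE data): a chi-square test of azimuthal uniformity and the resultant length of composite unit vectors as a measure of azimuthal asymmetry.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      phi = rng.uniform(0.0, 2.0 * np.pi, size=200)   # synthetic emission azimuths

      # (1) Chi-square test of the null hypothesis that azimuths are uniform.
      counts, _ = np.histogram(phi, bins=12)
      chi2, p = stats.chisquare(counts)
      print(f"chi-square = {chi2:.1f}, p = {p:.3f}")

      # (2) Normalized resultant of unit vectors: near 0 for random azimuths,
      #     approaching 1 for a strongly collimated (non-random) distribution.
      R = np.hypot(np.cos(phi).sum(), np.sin(phi).sum()) / phi.size
      print(f"normalized resultant length R = {R:.3f}")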

  4. Amyloplasts that sediment in protonemata of the moss Ceratodon purpureus are nonrandomly distributed in microgravity

    NASA Technical Reports Server (NTRS)

    Kern, V. D.; Smith, J. D.; Schwuchow, J. M.; Sack, F. D.

    2001-01-01

    Little is known about whether or how plant cells regulate the position of heavy organelles that sediment toward gravity. Dark-grown protonemata of the moss Ceratodon purpureus display a complex plastid zonation in that only some amyloplasts sediment along the length of the tip cell. If gravity is the major force determining the position of amyloplasts that sediment, then these plastids should be randomly distributed in space. Instead, amyloplasts were clustered in the subapical region in microgravity. Cells rotated on a clinostat on earth had a roughly similar non-random plastid distribution. Subapical clusters were also found in ground controls that were inverted and kept stationary, but the distribution profile differed considerably due to amyloplast sedimentation. These findings indicate the existence of as yet unknown endogenous forces and mechanisms that influence amyloplast position and that are normally masked in stationary cells grown on earth. It is hypothesized that a microtubule-based mechanism normally compensates for g-induced drag while still allowing for regulated amyloplast sedimentation.

  5. Amyloplasts That Sediment in Protonemata of the Moss Ceratodon purpureus Are Nonrandomly Distributed in Microgravity1

    PubMed Central

    Kern, Volker D.; Smith, Jeffrey D.; Schwuchow, Jochen M.; Sack, Fred D.

    2001-01-01

    Little is known about whether or how plant cells regulate the position of heavy organelles that sediment toward gravity. Dark-grown protonemata of the moss Ceratodon purpureus display a complex plastid zonation in that only some amyloplasts sediment along the length of the tip cell. If gravity is the major force determining the position of amyloplasts that sediment, then these plastids should be randomly distributed in space. Instead, amyloplasts were clustered in the subapical region in microgravity. Cells rotated on a clinostat on earth had a roughly similar non-random plastid distribution. Subapical clusters were also found in ground controls that were inverted and kept stationary, but the distribution profile differed considerably due to amyloplast sedimentation. These findings indicate the existence of as yet unknown endogenous forces and mechanisms that influence amyloplast position and that are normally masked in stationary cells grown on earth. It is hypothesized that a microtubule-based mechanism normally compensates for g-induced drag while still allowing for regulated amyloplast sedimentation. PMID:11299388

  6. Distribution and Habitat Specificity of Potentially-Toxic Microcystis across Climate, Land, and Water Use Gradients.

    PubMed

    Marmen, Sophi; Aharonovich, Dikla; Grossowicz, Michal; Blank, Lior; Yacobi, Yosef Z; Sher, Daniel J

    2016-01-01

    Toxic cyanobacterial blooms are a growing threat to freshwater bodies worldwide. In order for a toxic bloom to occur, a population of cells with the genetic capacity to produce toxins must be present together with the appropriate environmental conditions. In this study, we investigated the distribution patterns and phylogeny of potentially-toxic Microcystis (indicated by the presence and/or phylogeny of the mcyD and mcyA genes). Samples were collected from the water column of almost 60 water bodies across widely differing gradients of environmental conditions and land use in Israel. Potentially toxic populations were common but not ubiquitous, detected in ~65% of the studied sites. Local environmental factors, including phosphorus and ammonia concentrations and pH, as well as regional conditions such as the distance from built areas and nature reserves, were correlated with the distribution of the mcyD gene. A specific phylogenetic clade of Microcystis, defined using the sequence of the mcyA gene, was preferentially associated with aquaculture facilities but not irrigation reservoirs. Our results reveal important environmental, geospatial, and land use parameters affecting the geographic distribution of toxinogenic Microcystis, suggesting non-random dispersal of these globally abundant toxic cyanobacteria.

  7. Distribution and Habitat Specificity of Potentially-Toxic Microcystis across Climate, Land, and Water Use Gradients

    PubMed Central

    Marmen, Sophi; Aharonovich, Dikla; Grossowicz, Michal; Blank, Lior; Yacobi, Yosef Z.; Sher, Daniel J.

    2016-01-01

    Toxic cyanobacterial blooms are a growing threat to freshwater bodies worldwide. In order for a toxic bloom to occur, a population of cells with the genetic capacity to produce toxins must be present together with the appropriate environmental conditions. In this study, we investigated the distribution patterns and phylogeny of potentially-toxic Microcystis (indicated by the presence and/or phylogeny of the mcyD and mcyA genes). Samples were collected from the water column of almost 60 water bodies across widely differing gradients of environmental conditions and land use in Israel. Potentially toxic populations were common but not ubiquitous, detected in ~65% of the studied sites. Local environmental factors, including phosphorus and ammonia concentrations and pH, as well as regional conditions such as the distance from built areas and nature reserves, were correlated with the distribution of the mcyD gene. A specific phylogenetic clade of Microcystis, defined using the sequence of the mcyA gene, was preferentially associated with aquaculture facilities but not irrigation reservoirs. Our results reveal important environmental, geospatial, and land use parameters affecting the geographic distribution of toxinogenic Microcystis, suggesting non-random dispersal of these globally abundant toxic cyanobacteria. PMID:27014200

  8. Frequency-Rank Distributions

    ERIC Educational Resources Information Center

    Brookes, Bertram C.; Griffiths, Jose M.

    1978-01-01

    Frequency, rank, and frequency rank distributions are defined. Extensive discussion on several aspects of frequency rank distributions includes the Poisson process as a means of exploring the stability of ranks; the correlation of frequency rank distributions; and the transfer coefficient, a new measure in frequency rank distribution. (MBR)

  9. Annual Coal Distribution

    EIA Publications

    2016-01-01

    The Annual Coal Distribution Report (ACDR) provides detailed information on domestic coal distribution by origin state, destination state, consumer category, and method of transportation. Also provided is a summary of foreign coal distribution by coal-producing state. All data for the report year are final and this report supersedes all data in the quarterly distribution reports.

  10. Flowers as islands: spatial distribution of nectar-inhabiting microfungi among plants of Mimulus aurantiacus, a hummingbird-pollinated shrub

    PubMed Central

    Belisle, Melinda; Peay, Kabir G.; Fukami, Tadashi

    2014-01-01

    Microfungi inhabiting floral nectar offer unique opportunities for the study of microbial distribution and the role that dispersal limitation may play in generating distribution patterns. Flowers are well-replicated habitat islands, among which the microbes disperse via pollinators. This metapopulation system allows for investigation of microbial distribution at multiple spatial scales. We examined the distribution of the yeast, Metschnikowia reukaufii, and other fungal species found in the floral nectar of the sticky monkey flower, Mimulus aurantiacus, a hummingbird-pollinated shrub, at a California site. We found that the frequency of nectar-inhabiting microfungi on a given host plant was not significantly correlated with light availability, nectar volume or the percent cover of M. aurantiacus around the plant, but was significantly correlated with the location of the host plant and loosely correlated with the density of flowers on the plant. These results suggest that dispersal limitation caused by spatially non-random foraging by pollinators may be a primary factor driving the observed distribution pattern. PMID:22080257

  11. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2012-10-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold distribution procedure. The fold distribution provides a measurement of the distribution of charge cloud sizes incident upon the anode, giving some measure of change in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Distribution, Proposal 12778, from Cycle 19.

  12. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2013-10-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold distribution procedure. The fold distribution provides a measurement of the distribution of charge cloud sizes incident upon the anode, giving some measure of change in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Distribution, Proposal 13149, from Cycle 20.

  13. Distributed Data Management and Distributed File Systems

    NASA Astrophysics Data System (ADS)

    Girone, Maria

    2015-12-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  14. Exponentiated power Lindley distribution

    PubMed Central

    Ashour, Samir K.; Eltehiwy, Mahmoud A.

    2014-01-01

    A new generalization of the Lindley distribution was recently proposed by Ghitany et al. [1], called the power Lindley distribution. Another generalization of the Lindley distribution was introduced by Nadarajah et al. [2], named the generalized Lindley distribution. This paper proposes a further generalization of the Lindley distribution that includes the two as special cases. We refer to this new generalization as the exponentiated power Lindley distribution. The new distribution is important since it contains as special sub-models some well-known distributions in addition to the above two models, such as the Lindley distribution, among many others. It also provides more flexibility to analyze complex real data sets. We study some statistical properties of the new distribution. We discuss maximum likelihood estimation of the distribution parameters. Least-squares estimation is also used to estimate the parameters. Three algorithms are proposed for generating random data from the proposed distribution. An application of the model to a real data set is analyzed using the new distribution, which shows that the exponentiated power Lindley distribution can be used quite effectively in analyzing real lifetime data. PMID:26644927
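
    A hedged sketch of one way to generate random data from such a distribution, assuming the exponentiated power Lindley CDF is the power Lindley CDF raised to a power t (the parameter names a, b and t below are ours, and this is not one of the paper's three algorithms):

      import numpy as np
      from scipy.optimize import brentq

      def epl_cdf(x, a, b, t):
          # Assumed CDF: [1 - (1 + b*x**a/(b + 1)) * exp(-b*x**a)]**t
          return (1.0 - (1.0 + b * x**a / (b + 1.0)) * np.exp(-b * x**a)) ** t

      def epl_rvs(n, a, b, t, seed=0):
          # Inverse-transform sampling by numerically inverting the CDF.
          rng = np.random.default_rng(seed)
          u = np.clip(rng.random(n), 1e-10, 1.0 - 1e-10)
          return np.array([brentq(lambda x: epl_cdf(x, a, b, t) - ui, 1e-12, 1e3)
                           for ui in u])

      sample = epl_rvs(1000, a=1.5, b=0.8, t=2.0)
      print(f"sample mean = {sample.mean():.3f}, variance = {sample.var():.3f}")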

  15. Distribution of 45S rDNA sites in chromosomes of plants: Structural and evolutionary implications

    PubMed Central

    2012-01-01

    Background: 45S rDNA sites are the most widely documented chromosomal regions in eukaryotes. The analysis of the distribution of these sites along the chromosome in several genera has suggested some bias in their distribution. In order to evaluate whether these loci are in fact non-randomly distributed and what influence some chromosomal and karyotypic features have on the distribution of these sites, a database was built with the position and number of 45S rDNA sites obtained by FISH together with other karyotypic data from 846 plant species. Results: In angiosperms the most frequent numbers of sites per diploid karyotype were two and four, suggesting that in spite of the wide dispersion capacity of these sequences the number of rDNA sites tends to be restricted. The sites showed a preferential distribution on the short arms, mainly in the terminal regions. Curiously, these sites were frequently found on the short arms of acrocentric chromosomes, where they usually occupy the whole arm. The trend to occupy the terminal region is especially evident in holokinetic chromosomes, where all of them were terminally located. In polyploids there is a trend towards reduction in the number of sites per monoploid complement. In gymnosperms, however, the distribution of rDNA sites varied strongly among the sampled families. Conclusions: The locations of 45S rDNA sites do not vary randomly, occurring preferentially on the short arm and in the terminal region of chromosomes in angiosperms. The meaning of this preferential location is not known, but some hypotheses are considered and the observed trends are discussed. PMID:23181612

  16. Doubly Distributed Transactions

    2014-08-25

    Doubly Distributed Transactions (D2T) offers a technique for managing operations from a set of parallel clients with a collection of distributed services. It detects and manages faults. Example code with a test harness is also provided.

  17. Linking the distribution of microbial deposits from the Great Salt Lake (Utah, USA) to tectonic and climatic processes

    NASA Astrophysics Data System (ADS)

    Bouton, Anthony; Vennin, Emmanuelle; Boulle, Julien; Pace, Aurélie; Bourillot, Raphaël; Thomazo, Christophe; Brayard, Arnaud; Désaubliaux, Guy; Goslar, Tomasz; Yokoyama, Yusuke; Dupraz, Christophe; Visscher, Pieter T.

    2016-10-01

    The Great Salt Lake is a modern hypersaline lake, in which an extended modern and ancient microbial sedimentary system has developed. Detailed mapping based on aerial images and field observations can be used to identify non-random distribution patterns of microbial deposits, such as paleoshorelines associated with extensive polygons or fault-parallel alignments. Although it has been inferred that climatic changes controlling the lake level fluctuations explain the distribution of paleoshorelines and polygons, straight microbial deposit alignments may underline a normal fault system parallel to the Wasatch Front. This study is based on observations over a decimetre to kilometre spatial range, resulting in an integrated conceptual model for the controls on the distribution of the microbial deposits. The morphology, size and distribution of these deposits result mainly from environmental changes (i.e. seasonal to long-term water level fluctuations, particular geomorphological heritage, fault-induced processes, groundwater seepage) and have the potential to bring further insights into the reconstruction of paleoenvironments and paleoclimatic changes through time. New radiocarbon ages obtained on each microbial macrofabric described in this study improve the chronological framework and call into question the commonly assumed lake level variations.

  18. Proximity within interphase chromosome contributes to the breakpoint distribution in radiation-induced intrachromosomal exchanges

    NASA Astrophysics Data System (ADS)

    Zhang, Ye; Uhlemeyer, Jimmy; Hada, Megumi; Asaithamby, A.; Chen, David J.; Wu, Honglu

    2014-07-01

    Previously, we reported that breaks involved in chromosome aberrations were clustered in several regions of chromosome 3 in human mammary epithelial cells after exposures to either low- or high-LET radiation. In particular, breaks in certain regions of the chromosome tended to rejoin with each other to form an intrachromosome exchange event. This study tests the hypothesis that proximity within a single chromosome in interphase cell nuclei contributes to the distribution of radiation-induced chromosome breaks. Chromosome 3 in G1 human mammary epithelial cells was hybridized with the multicolor banding in situ hybridization (mBAND) probes that distinguish the chromosome in six differently colored regions, and the location of these regions was measured with a laser confocal microscope. Results of the study indicated that, on a multi-mega base pair scale of the DNA, the arrangement of chromatin was non-random. Both telomere regions tended to be located towards the exterior of the chromosome domain, whereas the centromere region towards the interior. In addition, the interior of the chromosome domain was preferentially occupied by the p-arm of the chromatin, which is consistent with our previous finding of intrachromosome exchanges involving breaks on the p-arm and in the centromere region of chromosome 3. Other factors, such as the fragile sites in the 3p21 band and gene regulation, may also contribute to the breakpoint distribution in radiation-induced chromosome aberrations.

  19. Spatial distribution of psychotic disorders in an urban area of France: an ecological study

    PubMed Central

    Pignon, Baptiste; Schürhoff, Franck; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Saba, Ghassen; Leboyer, Marion; Kirkbride, James B.; Szöke, Andrei

    2016-01-01

    Previous analyses of neighbourhood variations of non-affective psychotic disorders (NAPD) have focused mainly on incidence. However, prevalence studies provide important insights on factors associated with disease evolution as well as for healthcare resource allocation. This study aimed to investigate the distribution of prevalent NAPD cases in an urban area in France. The number of cases in each neighbourhood was modelled as a function of potential confounders and ecological variables, namely: migrant density, economic deprivation and social fragmentation. This was modelled using statistical models of increasing complexity: frequentist models (using Poisson and negative binomial regressions) and several Bayesian models. For each model, the validity of the assumptions was checked and the fit to the data was compared, in order to test for possible spatial variation in prevalence. Data showed significant overdispersion (invalidating the Poisson regression model) and residual autocorrelation (suggesting the need to use Bayesian models). The best Bayesian model was Leroux’s model (i.e. a model with both strong correlation between neighbouring areas and weaker correlation between areas further apart), with economic deprivation as an explanatory variable (OR = 1.13, 95% CI [1.02–1.25]). In comparison with frequentist methods, the Bayesian model showed a better fit. The number of cases showed non-random spatial distribution and was linked to economic deprivation. PMID:27189529
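
    For readers who want to see the first step of such an analysis, here is an illustrative sketch (synthetic data, not the study's): checking for overdispersion of area-level counts with a Poisson GLM before moving to negative-binomial or Bayesian spatial models.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n_areas = 60
      deprivation = rng.normal(size=n_areas)                    # stand-in ecological covariate
      expected = np.full(n_areas, 25.0)                          # population-based expected counts
      mu = expected * np.exp(0.12 * deprivation)
      cases = rng.negative_binomial(n=5, p=5.0 / (5.0 + mu))     # overdispersed counts

      X = sm.add_constant(deprivation)
      poisson = sm.GLM(cases, X, family=sm.families.Poisson(),
                       offset=np.log(expected)).fit()
      print("Pearson chi2 / df:", poisson.pearson_chi2 / poisson.df_resid)  # >> 1 flags overdispersion

      negbin = sm.GLM(cases, X, family=sm.families.NegativeBinomial(alpha=0.2),
                      offset=np.log(expected)).fit()
      print("deprivation rate ratio:", np.exp(negbin.params[1]))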

  20. Distributed Learning Metadata Standards

    ERIC Educational Resources Information Center

    McClelland, Marilyn

    2004-01-01

    Significant economies can be achieved in distributed learning systems architected with a focus on interoperability and reuse. The key building blocks of an efficient distributed learning architecture are the use of standards and XML technologies. The goal of plug and play capability among various components of a distributed learning system…

  1. Probability distributions for magnetotellurics

    SciTech Connect

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
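
    A quick numerical illustration of the quoted conclusion (simulated numbers, not magnetotelluric data): the squared magnitude of a ratio of two noisy complex quantities is strongly skewed, while its logarithm is much closer to normal.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n, err = 50_000, 0.5                     # 50% relative noise in numerator and denominator

      num = (1.0 + err * rng.standard_normal(n)) + 1j * err * rng.standard_normal(n)
      den = (1.0 + err * rng.standard_normal(n)) + 1j * err * rng.standard_normal(n)
      z = num / den                            # transfer-function-like estimate

      sq_mag = np.abs(z) ** 2
      log_sq = np.log(sq_mag)
      print("skewness of |z|^2    :", stats.skew(sq_mag))
      print("skewness of log|z|^2 :", stats.skew(log_sq))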

  2. Distribution of Chromosome Breakpoints in Human Epithelial Cells Exposed to Low- and High-LET Radiations

    NASA Technical Reports Server (NTRS)

    Hada, Megumi; Cucinotta, Francis; Wu, Honglu

    2009-01-01

    The advantage of the multicolor banding in situ hybridization (mBAND) technique is not only its ability to identify simultaneously both inter- and intrachromosome exchanges, but also its ability to measure the breakpoint location along the length of the chromosome with a precision that is unmatched by other traditional banding techniques. Breakpoints on specific regions of a chromosome have been known to associate with specific cancers. The breakpoint distribution in cells after low- and high-LET radiation exposures will also provide the data for biophysical modeling of the chromatin structure, as well as the data for modeling the formation of radiation-induced chromosome aberrations. In a series of experiments, we studied low- and high-LET radiation-induced chromosome aberrations using the mBAND technique with chromosome 3 painted in 23 different colored bands. Human epithelial cells (CH1 84B5F5/M10) were exposed in vitro to Cs-137 gamma rays at both low and high dose rates, secondary neutrons with a broad energy spectrum at a low dose rate and 600 MeV/u Fe ions at a high dose rate. The data of both inter- and intrachromosome aberrations involving the painted chromosome have been reported previously. Here we present data on the location of the chromosome breaks along the length of chromosome 3 in the cells after exposures to each of the four radiation scenarios. In comparison to the expected breakpoint distribution based on the length of the bands, the observed distribution appeared to be non-random for both the low- and high-LET radiations. In particular, hot spots towards both ends of the chromosome were found after low-LET irradiations of either low or high dose rates. For both high-LET radiation types (Fe ions and neutrons), the breakpoint distributions were similar, and were much smoother than that for low-LET radiation. The dependence of the breakpoint distribution on the radiation quality requires further investigations.

  3. FRIB cryogenic distribution system

    SciTech Connect

    Ganni, Venkatarao; Dixon, Kelly D.; Laverdure, Nathaniel A.; Knudsen, Peter N.; Arenius, Dana M.; Barrios, Matthew N.; Jones, S.; Johnson, M.; Casagrande, Fabio

    2014-01-01

    The Michigan State University Facility for Rare Isotope Beams (MSU-FRIB) helium distribution system has been revised to include bayonet/warm valve type disconnects between each cryomodule and the transfer line distribution system, similar to the Thomas Jefferson National Accelerator Facility (JLab) and the Spallation Neutron Source (SNS) cryogenic distribution systems. The heat loads at various temperature levels and some of the features in the design of the distribution system are outlined. The present status, the plans for fabrication, and the procurement approach for the helium distribution system are also included.

  4. FRIB cryogenic distribution system

    SciTech Connect

    Ganni, V.; Dixon, K.; Laverdure, N.; Knudsen, P.; Arenius, D.; Barrios, M.; Jones, S.; Johnson, M.; Casagrande, F.

    2014-01-29

    The Michigan State University Facility for Rare Isotope Beams (MSU-FRIB) helium distribution system has been revised to include bayonet/warm valve type disconnects between each cryomodule and the transfer line distribution system, similar to the Thomas Jefferson National Accelerator Facility (JLab) and the Spallation Neutron Source (SNS) cryogenic distribution systems. The heat loads at various temperature levels and some of the features in the design of the distribution system are outlined. The present status, the plans for fabrication, and the procurement approach for the helium distribution system are also included.

  5. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
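
    The same kind of check can be reproduced with off-the-shelf tools; the sketch below uses SciPy's Latin hypercube engine (not Sandia's LHS software) to draw stratified uniforms, maps them to a normal distribution by the inverse CDF, and applies Kolmogorov-Smirnov and Anderson-Darling tests.

      import numpy as np
      from scipy import stats
      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=1, seed=3)
      u = sampler.random(n=1000).ravel()                 # stratified uniforms on (0, 1)

      normal_sample = stats.norm.ppf(u, loc=10.0, scale=2.0)

      ks = stats.kstest(normal_sample, 'norm', args=(10.0, 2.0))
      ad = stats.anderson(normal_sample, dist='norm')
      print(f"KS p-value = {ks.pvalue:.3f}")
      print(f"Anderson-Darling statistic = {ad.statistic:.3f} "
            f"(5% critical value = {ad.critical_values[2]:.3f})")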

  6. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
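
    A small sketch of the construction described above, under the symmetric logistic dependence model (our choice for the 'dependence' function; parameter values are illustrative): the joint CDF is built from a Weibull margin with lower bound zero and a Frechet margin, with r = 1 giving independence and smaller r stronger dependence.

      import numpy as np
      from scipy import stats

      def bivariate_ev_cdf(x, y, F1, F2, r):
          # G(x, y) = exp(-[(-ln F1(x))**(1/r) + (-ln F2(y))**(1/r)]**r),  0 < r <= 1
          u = -np.log(F1(x))
          v = -np.log(F2(y))
          return np.exp(-((u ** (1.0 / r) + v ** (1.0 / r)) ** r))

      F_weibull = lambda x: stats.weibull_min.cdf(x, c=2.0, scale=1.5)
      F_frechet = lambda y: stats.invweibull.cdf(y, c=3.0, scale=2.0)   # invweibull = Frechet

      print("dependent (r = 0.5):", bivariate_ev_cdf(1.0, 2.0, F_weibull, F_frechet, r=0.5))
      print("independent (r = 1):", F_weibull(1.0) * F_frechet(2.0))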

  7. Distributed generation systems model

    SciTech Connect

    Barklund, C.R.

    1994-12-31

    A slide presentation is given on a distributed generation systems model developed at the Idaho National Engineering Laboratory, and its application to a situation within the Idaho Power Company's service territory. The objectives of the work were to develop a screening model for distributed generation alternatives, to develop a better understanding of distributed generation as a utility resource, and to further INEL's understanding of utility concerns in implementing technological change.

  8. Cooling water distribution system

    DOEpatents

    Orr, Richard

    1994-01-01

    A passive containment cooling system for a nuclear reactor containment vessel. Disclosed is a cooling water distribution system for introducing cooling water by gravity uniformly over the outer surface of a steel containment vessel using an interconnected series of radial guide elements, a plurality of circumferential collector elements and collector boxes to collect and feed the cooling water into distribution channels extending along the curved surface of the steel containment vessel. The cooling water is uniformly distributed over the curved surface by a plurality of weirs in the distribution channels.

  9. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  10. Testing Nested Distributions

    NASA Astrophysics Data System (ADS)

    Economou, P.

    2010-09-01

    A number of criteria, test statistics and diagnostic plots have been developed in order to test the adapted distribution assumption. Usually, a simpler distribution is tested against a more complicated one (obtained by adding an extra parameter), which includes the first distribution as a special case (nested distributions). A characteristic example of such cases is the Burr XII distribution, which can be obtained under the proportional hazards frailty model by assuming a Weibull baseline function and a Gamma frailty distribution with mean frailty equal to 1 and variance equal to θ. In this work, two new, easy to construct and interpret tests, a diagnostic plot and an asymptotic test, are presented in order to test nested distributions. The asymptotic test is based on approximating the difference of the two estimated nested distribution functions using the first two terms of the Taylor expansion, while the diagnostic plot is constructed using the exact difference of the two fitted distribution functions. Simulation results, using data sets with and without censored observations, demonstrate that the proposed tests perform, in most cases, better than other test statistics such as the LR and Wald tests.
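
    An illustrative companion to the Burr XII example above (not the authors' tests): simulate lifetimes from a Burr XII distribution, fit both it and the nested Weibull limit, and compare them with a naive likelihood-ratio test. Because the simpler model sits on the boundary of the frailty-variance parameter, the chi-square(1) reference is only approximate.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      data = stats.burr12.rvs(c=1.8, d=2.5, scale=1.0, size=500, random_state=rng)

      c_w, loc_w, scale_w = stats.weibull_min.fit(data, floc=0)
      c_b, d_b, loc_b, scale_b = stats.burr12.fit(data, floc=0)

      ll_w = stats.weibull_min.logpdf(data, c_w, loc_w, scale_w).sum()
      ll_b = stats.burr12.logpdf(data, c_b, d_b, loc_b, scale_b).sum()

      lr = 2.0 * (ll_b - ll_w)                 # one extra parameter in Burr XII
      print(f"LR statistic = {lr:.2f}, approximate p = {stats.chi2.sf(lr, df=1):.4f}")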

  11. Number and spatial distribution of nuclei in the muscle fibres of normal mice studied in vivo

    PubMed Central

    Bruusgaard, J C; Liestøl, K; Ekmark, M; Kollstad, K; Gundersen, K

    2003-01-01

    We present here a new technique with which to visualize nuclei in living muscle fibres in the intact animal, involving injection of labelled DNA into single cells. This approach allowed us to determine the positions of all nuclei within a sarcolemma without labelling satellite cells. In contrast to what has been reported in tissue culture, we found that the nuclei were immobile, even when observed over several days. Nuclear density was uniform along the fibre except for the endplate and some myotendinous junctions, where the density was higher. The perijunctional region had the same number of nuclei as the rest of the fibre. In the extensor digitorum longus (EDL) muscle, the extrajunctional nuclei were elongated and precisely aligned to the long axis of the fibre. In the soleus, the nuclei were rounder and not well aligned. When comparing small and large fibres in the soleus, the number of nuclei varied approximately in proportion to cytoplasmic volume, while in the EDL the number was proportional to surface area. Statistical analysis revealed that the nuclei were not randomly distributed in either the EDL or the soleus. For each fibre, actual distributions were compared with computer simulations in which nuclei were assumed to repel each other, which optimizes the distribution of nuclei with respect to minimizing transport distances. The simulated patterns were regular, with clear row-like structures when the density of nuclei was low. The non-random and often row-like distribution of nuclei observed in muscle fibres may thus reflect regulatory mechanisms whereby nuclei repel each other in order to minimize transport distances. PMID:12813146
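
    The repulsion idea in the final sentence can be illustrated with a toy comparison (all numbers invented): nuclei placed uniformly at random along a one-dimensional fibre versus nuclei placed with a simple exclusion rule standing in for mutual repulsion, scored by the regularity of the spacing.

      import numpy as np

      rng = np.random.default_rng(5)
      L, N, d_min = 1000.0, 40, 15.0           # fibre length (um), nuclei count, exclusion distance

      random_pos = np.sort(rng.uniform(0.0, L, N))

      # Hard-core placement: reject candidates closer than d_min to an existing nucleus.
      repelled = []
      while len(repelled) < N:
          x = rng.uniform(0.0, L)
          if all(abs(x - y) >= d_min for y in repelled):
              repelled.append(x)
      repelled = np.sort(np.array(repelled))

      def spacing_cv(p):
          gaps = np.diff(p)
          return gaps.std() / gaps.mean()      # ~1 for random spacing, smaller for regular

      print(f"CV of gaps  random: {spacing_cv(random_pos):.2f}   "
            f"inhibited: {spacing_cv(repelled):.2f}")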

  12. Asymmetry and non-random orientation of the inflight effective beam pattern in the WMAP data

    SciTech Connect

    Chiang, Lung-Yih

    2014-04-20

    Tentative evidence for statistical anisotropy in the Wilkinson Microwave Anisotropy Probe data was alleged to be due to 'insufficient handling of beam asymmetries'. In this paper, we investigate this issue and develop a method to estimate the shape of the inflight effective beam, particularly its asymmetry and azimuthal orientation. We divide the whole map into square patches and exploit the information in Fourier space. For patches containing bright extragalactic point sources, we can directly estimate their shapes, from which the inflight effective beam can be estimated. For those without, we estimate the pattern from iso-power contours in two-dimensional Fourier space. We show that the inflight effective beam convolving the signal is indeed non-symmetric for most of the sky, and it is not randomly oriented. Around the ecliptic poles, however, the asymmetry is smaller due to the averaging effect from different orientations of the beam from the scan strategy. The orientations of the effective beam with significant asymmetry are parallel to the lines of ecliptic longitude. In the foreground-cleaned Internal Linear Combination map, however, the systematic effect caused by the beam is significantly lessened.

  13. Non-random pairing in American kestrels: mate choice versus intra-sexual competition

    USGS Publications Warehouse

    Bortolotti, Gary R.; Iko, William M.

    1992-01-01

    Natural selection may influence the arrangement of individuals into mated pairs through either inter-sexual (mate choice) or intra-sexual selection (competition). A study of the American kestrel, Falco sparverius, in northern Saskatchewan distinguished between these two processes using size as a measure of the bird's competitive ability, and condition (mass scaled to body size) as an index of quality. Both sexes arrive on the study area after spring migration in equal numbers and males establish territories. Males and females that moved among territories at the time of pair formation were not different in size or condition from those that did not move, suggesting that birds were not being displaced by superior competitors, and that females moved to encounter potential mates. Within mated pairs, there was no relationship between a bird's size and the condition of its mate for either sex, as would be predicted if intra-sexual competition explained mating patterns. Instead, there was positive assortative mating by condition, suggesting that both sexes used quality as the criterion in choosing mates. There was no correlation between the sizes of males and females in mated pairs. Because there were no differences in size or condition of breeding and non-breeding males, factors other than physical attributes, such as prior experience with the area, may determine a male's success in obtaining a territory. Because females that did not obtain mates were in poorer condition than those that did, males may have rejected poor quality females. The results suggest that intra-sexual competition was not important for pair formation, and that kestrels chose mates on the basis of quality.

  14. Selecting a Sample for Your Experiment: A Non-Random Stratified Sampling Approach

    ERIC Educational Resources Information Center

    Tipton, Elizabeth

    2012-01-01

    The purpose of this paper is to develop a more general method for sample recruitment in experiments that is purposive (not random) and that results in a sample that is compositionally similar to the generalization population. This work builds on Tipton et al. (2011) by offering solutions to a larger class of problems than the non-overlapping…

  15. Secular non-random variations of cosmogenic carbon-14 in the terrestrial atmosphere

    NASA Astrophysics Data System (ADS)

    Neftel, A.; Oeschger, H.; Suess, H. E.

    1981-12-01

    The time dependence of the C-14 content of bristlecone pine wood samples dated by their tree rings and grown during the last 8000 years was examined. Two different smoothing techniques were used for constructing values for equal time intervals. In this manner the introduction of regularities that could have resulted from applied mathematical techniques could be excluded. There is good evidence for nonrandom features in the power spectrum, in particular for a 200-year periodicity. The regularities in the power spectrum are further indications supporting the assumption that the C-14 variations reflect a property of the sun.

  16. Intervention for homeless, substance abusing mothers: findings from a non-randomized pilot.

    PubMed

    Slesnick, Natasha; Erdem, Gizem

    2012-01-01

    Little empirically-based information is available regarding how best to intervene with substance-abusing homeless mothers. This study pilot-tested a comprehensive intervention with 15 homeless women and their 2- to 6-year-old children, recruited from a local family shelter. All participants were offered integrated intervention with three major components. The first component was housing which included 3 months of rental and utility assistance, and these services were not contingent upon women's abstinence from drugs or alcohol. The second and third components included 6 months of case management services and an evidence-based substance abuse treatment (Community Reinforcement Approach; CRA). Analysis revealed that women showed reductions in substance use (F(2,22) = 3.63; p < .05), homelessness (F(2,24) = 25.31; p < .001), and mental health problems (F(2,20) = 8.5; p < .01). Further, women reported reduced internalizing (F(2,22) = 4.08; p < .05) and externalizing problems (F(2,24) = 7.7; p = .01) among their children. The findings suggest that the intervention is a promising approach to meet the multiple needs of this vulnerable population. These positive outcomes support the need for future research to replicate the findings with a larger sample using a randomized design.

  17. Non-Random Variability in Functional Composition of Coral Reef Fish Communities along an Environmental Gradient.

    PubMed

    Plass-Johnson, Jeremiah G; Taylor, Marc H; Husain, Aidah A A; Teichberg, Mirta C; Ferse, Sebastian C A

    2016-01-01

    Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased degradation of coral reefs is associated with increased variability in fish community functional composition resulting from selective impacts on specific traits, thereby affecting the functional response of these communities to increasing perturbations. PMID:27100189

  18. Non-random walk diffusion enhances the sink strength of semicoherent interfaces

    DOE PAGES

    Vattré, A.; Jourdan, T.; Ding, H.; Marinica, M. -C.; Demkowicz, M. J.

    2016-01-29

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that ‘super-sink’ interfaces may be designed by optimizing interface stress fields. Lastly, such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage.

  19. Non-Random Variability in Functional Composition of Coral Reef Fish Communities along an Environmental Gradient

    PubMed Central

    Plass-Johnson, Jeremiah G.; Taylor, Marc H.; Husain, Aidah A. A.; Teichberg, Mirta C.; Ferse, Sebastian C. A.

    2016-01-01

    Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased degradation of coral reefs is associated with increased variability in fish community functional composition resulting from selective impacts on specific traits, thereby affecting the functional response of these communities to increasing perturbations. PMID:27100189

  20. Non-random walk diffusion enhances the sink strength of semicoherent interfaces

    PubMed Central

    Vattré, A.; Jourdan, T.; Ding, H.; Marinica, M.-C.; Demkowicz, M. J.

    2016-01-01

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that ‘super-sink' interfaces may be designed by optimizing interface stress fields. Such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage. PMID:26822632

  1. Non-random food-web assembly at habitat edges increases connectivity and functional redundancy

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Habitat fragmentation dramatically alters the spatial configuration of landscapes, with the creation of artificial edges affecting community structure and species interactions. Despite this, it is not known how the different food-webs in adjacent habitats merge at their boundaries, and what the cons...

  2. Non-random walk diffusion enhances the sink strength of semicoherent interfaces.

    PubMed

    Vattré, A; Jourdan, T; Ding, H; Marinica, M-C; Demkowicz, M J

    2016-01-29

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that 'super-sink' interfaces may be designed by optimizing interface stress fields. Such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage.

  3. Non-random walk diffusion enhances the sink strength of semicoherent interfaces

    NASA Astrophysics Data System (ADS)

    Vattré, A.; Jourdan, T.; Ding, H.; Marinica, M.-C.; Demkowicz, M. J.

    2016-01-01

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that `super-sink' interfaces may be designed by optimizing interface stress fields. Such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage.

  4. Non-Random Sibling Cannibalism in the Marine Gastropod Crepidula coquimbensis.

    PubMed

    Brante, Antonio; Fernández, Miriam; Viard, Frédérique

    2013-01-01

    Sibling cannibalism is commonly observed in marine species. For instance, intrabrood cannibalism has been documented in marine gastropods with direct development, suggesting a relationship between embryo behavior and the evolution of life history strategies. However, there has been little effort to document the factors driving sibling cannibalism in marine species. The kin selection theory suggests that the level of relatedness plays an important role in cannibalism patterns. We examined Crepidula coquimbensis, a brooding marine gastropod that encloses its offspring in capsules. Encapsulated embryos show sibling cannibalism and high levels of intracapsular multiple paternity. Given these features, cannibalistic behavior may be driven by kin-relatedness. To test this hypothesis, we constructed artificial aggregations of embryos to mimic three levels of relatedness: high, medium and low. For each category of aggregation, the cannibalism rate and benefits (i.e. size at hatching of surviving offspring) were estimated. In addition, at the end of embryo development, we performed parentage analyses to determine if cannibalism was associated with the relatedness between cannibal and victim embryos. Our results show that the intensity of sibling cannibalism increased in aggregations characterized by the lowest level of relatedness. There were important benefits of cannibalism in terms of the size of cannibals at hatching. In addition, cannibalism between embryos was not random: the variation in reproductive success between males increased over the course of the experiment and the effective number of fathers decreased. Altogether, these results suggest that polyandry may play an important role in the evolution of sibling cannibalism in C. coquimbensis and that kin selection may operate during early embryonic stages in this species. PMID:23805291

  5. The NARCONON™ drug education curriculum for high school students: A non-randomized, controlled prevention trial

    PubMed Central

    Lennox, Richard D; Cecchini, Marie A

    2008-01-01

    Background An estimated 13 million youths aged 12 to 17 become involved with alcohol, tobacco and other drugs annually. The number of 12- to 17-year olds abusing controlled prescription drugs increased an alarming 212 percent between 1992 and 2003. For many youths, substance abuse precedes academic and health problems including lower grades, higher truancy, dropout decisions, delayed or damaged physical, cognitive, and emotional development, or a variety of other costly consequences. For thirty years the Narconon program has worked with schools and community groups, providing single educational modules aimed at supplementing existing classroom-based prevention activities. In 2004, Narconon International developed a multi-module, universal prevention curriculum for high school ages based on drug abuse etiology, program quality management data, prevention theory and best practices. We review the curriculum and its rationale and test its ability to change drug use behavior, perceptions of risk/benefits, and general knowledge. Methods After informed parental consent, approximately 1000 Oklahoma and Hawai'i high school students completed a modified Center for Substance Abuse Prevention (CSAP) Participant Outcome Measures for Discretionary Programs survey at three testing points: baseline, one month later, and six month follow-up. Schools assigned to experimental conditions scheduled the Narconon curriculum between the baseline and one-month follow-up test; schools in control conditions received drug education after the six-month follow-up. Student responses were analyzed controlling for baseline differences using analysis of covariance. Results At six month follow-up, youths who received the Narconon drug education curriculum showed reduced drug use compared with controls across all drug categories tested. The strongest effects were seen in all tobacco products and cigarette frequency, followed by marijuana. There were also significant reductions measured for alcohol and amphetamines. The program also produced changes in knowledge, attitudes and perception of risk. Conclusion The eight-module Narconon curriculum has thorough grounding in substance abuse etiology and prevention theory. Incorporating several historically successful prevention strategies, this curriculum reduced drug use among youths. PMID:18348735

  6. Comparison of vaginal and abdominal hysterectomy: A prospective non-randomized trial

    PubMed Central

    Chen, Bing; Ren, Dong-Ping; Li, Jing-Xuan; Li, Chun-Dong

    2014-01-01

    Objective: To compare outcomes of vaginal and abdominal hysterectomy procedures in women with benign gynaecological diseases. Methods: This was a prospective study of outcomes of consecutive patients who underwent total vaginal hysterectomy (VH) or abdominal hysterectomy (AH) for benign gynaecological diseases. Patient characteristics before, during, and after the operations were reviewed. Patients were followed up for three months to evaluate postoperative complications. Results: A total of 313 patients were included: 143 underwent AH and 170 underwent VH. Baseline characteristics were similar between the two groups. There were no intraoperative complications in either group. Operation time, intraoperative blood loss, first postoperative flatus time, time to out-of-bed activity, mean maximum postoperative body temperature, and duration of fever were all significantly shorter and less severe in the VH group compared with the AH group. In addition, vaginal length in the VH group was significantly shorter than in the AH group. Conclusions: Vaginal hysterectomy has advantages over AH in the treatment of benign gynaecological diseases, providing greater efficacy and safety with minimal invasiveness. PMID:25097536

  7. Communicating the Signal of Climate Change in The Presence of Non-Random Noise

    NASA Astrophysics Data System (ADS)

    Mann, M. E.

    2015-12-01

    The late Stephen Schneider spoke eloquently of the double ethical bind that we face: we must strive to communicate effectively but honestly. This is no simple task given the considerable "noise" generated in our public discourse by vested interests working instead to misinform the public. To do so, we must convey what is known in plainspoken, jargon-free language, while acknowledging the real uncertainties that exist. Further, we must explain the implications of those uncertainties, which in many cases imply the possibility of greater, not lesser, risk. Finally, we must not be averse to discussing the policy implications of the science, lest we fail to provide our audience with critical information that can help them make informed choices about their own actions as citizens. I will use examples from my current collaboration with Washington Post editorial cartoonist Tom Toles.

  8. Brief Report: Non-Random X Chromosome Inactivation in Females with Autism

    ERIC Educational Resources Information Center

    Talebizadeh, Z.; Bittel, D. C.; Veatch, O. J.; Kibiryeva, N.; Butler, M. G.

    2005-01-01

    Autism is a heterogeneous neurodevelopmental disorder with a 3-4 times higher sex ratio in males than females. X chromosome genes may contribute to this higher sex ratio through unusual skewing of X chromosome inactivation. We studied X chromosome skewness in 30 females with classical autism and 35 similarly aged unaffected female siblings as…

  9. Non-randomized mtDNA damage after ionizing radiation via charge transport

    NASA Astrophysics Data System (ADS)

    Zhou, Xin; Liu, Xinguo; Zhang, Xin; Zhou, Rong; He, Yang; Li, Qiang; Wang, Zhenhua; Zhang, Hong

    2012-10-01

    Although it is well known that there are mutation hot spots in mtDNA, whether there are damage hot spots remains elusive. In this study, the regional DNA damage of the mitochondrial genome after ionizing radiation was determined by real-time quantitative PCR. The mtDNA damage level was found to be dose-dependent and regionally unequal. The control region was the region most susceptible to oxidative damage. GGG, a typical hole trap during charge transport, was found to be disproportionately enriched in the control region. A total of 107 vertebrate mitochondrial genomes were then analyzed to test whether the GGG enrichment in the control region was evolutionarily conserved. Surprisingly, the triple G enrichment was observed in most of the homeothermal animals, while the majority of heterothermic animals showed no triple G enrichment. These results indicate that the triple G enrichment in the control region is related to mitochondrial metabolism during evolution.
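
    The triple-G enrichment comparison described above can be illustrated with a minimal sketch; the sequences below are random placeholders standing in for a real control region and the rest of a mitochondrial genome, which would normally come from GenBank records.

```python
import random

random.seed(1)
# Hypothetical sequences (lengths loosely mimic a vertebrate mitochondrial genome).
control_region = "".join(random.choice("ACGT") for _ in range(1_100))
rest_of_genome = "".join(random.choice("ACGT") for _ in range(15_400))

def ggg_density(seq):
    """Overlapping GGG triplets per kilobase."""
    count = sum(1 for i in range(len(seq) - 2) if seq[i:i + 3] == "GGG")
    return 1000.0 * count / len(seq)

d_ctrl, d_rest = ggg_density(control_region), ggg_density(rest_of_genome)
print(f"GGG per kb: control={d_ctrl:.2f}, rest={d_rest:.2f}, "
      f"fold enrichment={d_ctrl / d_rest:.2f}")
```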

  10. Combining randomized and non-randomized evidence in clinical research: a review of methods and applications.

    PubMed

    Verde, Pablo E; Ohmann, Christian

    2015-03-01

    Researchers may have multiple motivations for combining disparate pieces of evidence in a meta-analysis, such as generalizing experimental results or increasing the power to detect an effect that a single study is not able to detect. However, while the main question in a meta-analysis may be simple, the structure of the evidence available to answer it may be complex. As a consequence, combining disparate pieces of evidence becomes a challenge. In this review, we cover statistical methods that have been used for the evidence synthesis of different study types with the same outcome and similar interventions. For the methodological review, a literature retrieval in the area of generalized evidence synthesis was performed, and publications were identified, assessed, grouped and classified. Furthermore, real applications of these methods in medicine were identified and described. For these approaches, 39 real clinical applications could be identified. A new classification of methods is provided, which takes into account: the inferential approach, the bias modeling, the hierarchical structure, and the use of graphical modeling. We conclude with a discussion of pros and cons of our approach and give some practical advice. PMID:26035469
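
    As one simple ingredient of such evidence synthesis, the sketch below pools hypothetical effect estimates from a mix of randomized and non-randomized studies with a DerSimonian-Laird random-effects model; the review itself covers richer hierarchical and bias-adjusted approaches, so this is only a starting point.

```python
import numpy as np

# Hypothetical log-odds-ratio estimates and standard errors from five studies.
effects = np.array([-0.35, -0.10, -0.42, -0.05, -0.28])
se      = np.array([ 0.15,  0.20,  0.25,  0.12,  0.18])

def dersimonian_laird(y, se):
    """Random-effects pooling: estimate between-study variance tau^2 by the
    method of moments, then reweight the studies accordingly."""
    w = 1.0 / se**2
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed)**2)
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)
    w_star = 1.0 / (se**2 + tau2)
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    return y_re, se_re, tau2

pooled, pooled_se, tau2 = dersimonian_laird(effects, se)
print(f"pooled effect={pooled:.3f} (SE {pooled_se:.3f}), tau^2={tau2:.3f}")
```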

  11. Juvenile rheumatoid arthritis and del(22q11) syndrome: a non-random association.

    PubMed Central

    Verloes, A; Curry, C; Jamar, M; Herens, C; O'Lague, P; Marks, J; Sarda, P; Blanchet, P

    1998-01-01

    Del(22q11) is a common microdeletion syndrome with an extremely variable phenotype. Besides classical manifestations, such as velocardiofacial (Shprintzen) or DiGeorge syndromes, del(22q11) syndrome may be associated with unusual but probably causally related anomalies that expand its phenotype and complicate its recognition. We report here three children with the deletion and a chronic, erosive polyarthritis resembling idiopathic cases of juvenile rheumatoid arthritis (JRA). Patient 1, born in 1983, initially presented with developmental delay, facial dysmorphism, velopharyngeal insufficiency, and severe gastro-oesophageal reflux requiring G tube feeding. From the age of 3 years, he developed JRA, which resulted in severe restrictive joint disease, osteopenia, and platyspondyly. Patient 2, born in 1976, had tetralogy of Fallot and peripheral pulmonary artery stenosis. She developed slowly, had mild dysmorphic facial features, an abnormal voice, and borderline intelligence. JRA was diagnosed at the age of 5 years. The disorder followed a subacute course, with relatively mild inflammatory phenomena, but an extremely severe skeletal involvement with major osteopenia, restrictive joint disease (bilateral hip replacement), and almost complete osteolysis of the carpal and tarsal bones with phalangeal synostoses, leading to major motor impairment and confinement to a wheelchair. Patient 3, born in 1990, has VSD, right embryo-toxon, bifid uvula, and facial dysmorphism. She developed JRA at the age of 1 year. She is not mentally retarded but has major speech delay secondary to congenital deafness inherited from her mother. In the three patients, a del(22q11) was shown by FISH analysis. These observations, and five other recently published cases, indicate that a JRA-like syndrome is a component of the del(22q11) spectrum. The deletion may be overlooked in those children with severe, chronic inflammatory disorder. PMID:9832043

  12. Non-random inactivation of large common fragile site genes in different cancers.

    PubMed

    McAvoy, S; Ganapathiraju, S C; Ducharme-Smith, A L; Pritchett, J R; Kosari, F; Perez, D S; Zhu, Y; James, C D; Smith, D I

    2007-01-01

    The common fragile sites are regions of profound genomic instability found in all individuals. The full size of each region of instability ranges from under one megabase (Mb) to greater than 10 Mb. At least half of the CFS regions have been found to span extremely large genes, ranging from 600 kb to greater than 2.0 Mb. The large CFS genes are also very interesting from a cancer perspective as several of them, including FHIT and WWOX, have already demonstrated the capacity to function as tumor suppressor genes, both in vitro and in vivo. We estimate that there may be 40-50 large genes localized in CFS regions. The expression of a number of the large CFS genes has been previously shown to be lost in many different cancers and this is frequently associated with a worse clinical outcome for patients. To determine if there was selection for the inactivation of different large CFS genes in different cancers, we examined the expression of 13 of the 20 known large CFS genes: FHIT, WWOX, PARK2, GRID2, NBEA, DLG2, RORA isoforms 1 and 4, DAB1, CNTNAP2, DMD, IL1RAPL1, IMMP2L and LARGE in breast, ovarian, endometrial and brain cancers using real-time RT-PCR analysis. Each cancer had a distinct profile of different large CFS genes that were inactivated. Interestingly, in breast, ovarian and endometrial cancers some specimens had inactivation of expression of none or only one of the tested genes, while other specimens had inactivation of multiple tested genes. Brain cancers had inactivation of many of the tested genes, a number of which function in normal neurological development. We find that there is no relationship between the frequency that any specific CFS is expressed and the frequency that the gene from that region is inactivated in different cancers. Instead, it appears that different cancers select for the inactivation of different large CFS genes.

  13. Non-random walk diffusion enhances the sink strength of semicoherent interfaces.

    PubMed

    Vattré, A; Jourdan, T; Ding, H; Marinica, M-C; Demkowicz, M J

    2016-01-01

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that 'super-sink' interfaces may be designed by optimizing interface stress fields. Such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage. PMID:26822632

  14. Smart distribution systems

    DOE PAGES

    Jiang, Yazhou; Liu, Chen -Ching; Xu, Yin

    2016-04-19

    The increasing importance of system reliability and resilience is changing the way distribution systems are planned and operated. To achieve a distribution system self-healing against power outages, emerging technologies and devices, such as remote-controlled switches (RCSs) and smart meters, are being deployed. The higher level of automation is transforming traditional distribution systems into the smart distribution systems (SDSs) of the future. The availability of data and remote control capability in SDSs provides distribution operators with an opportunity to optimize system operation and control. In this paper, the development of SDSs and resulting benefits of enhanced system capabilities are discussed. A comprehensive survey is conducted on the state-of-the-art applications of RCSs and smart meters in SDSs. Specifically, a new method, called Temporal Causal Diagram (TCD), is used to incorporate outage notifications from smart meters for enhanced outage management. To fully utilize the fast operation of RCSs, the spanning tree search algorithm is used to develop service restoration strategies. Optimal placement of RCSs and the resulting enhancement of system reliability are discussed. Distribution system resilience with respect to extreme events is presented. Furthermore, test cases are used to demonstrate the benefit of SDSs. Active management of distributed generators (DGs) is introduced. Future research in a smart distribution environment is proposed.
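
    The spanning-tree-based restoration idea mentioned in this record can be sketched on a toy feeder model. The bus names and switching costs below are hypothetical, and a real ADMS would also enforce voltage and capacity limits; this simplified stand-in only restores connectivity while keeping the network radial.

```python
import networkx as nx

# Hypothetical 6-bus feeder: edges are line sections or tie switches, weighted
# by a switching "cost" used when rebuilding a radial topology.
G = nx.Graph()
G.add_edge("sub", "b1", weight=1)
G.add_edge("b1", "b2", weight=1)
G.add_edge("b2", "b3", weight=1)
G.add_edge("sub", "b4", weight=1)
G.add_edge("b4", "b5", weight=1)
G.add_edge("b5", "b3", weight=3)   # normally-open tie switch, higher cost

faulted = ("b1", "b2")             # assume this section must be isolated
G.remove_edge(*faulted)

if nx.is_connected(G):
    # A spanning tree keeps the post-fault network radial while serving all buses;
    # here the tie switch b5-b3 is closed to re-energize b2 and b3.
    plan = nx.minimum_spanning_tree(G, weight="weight")
    print("closed sections:", sorted(plan.edges()))
else:
    unserved = [n for n in G if not nx.has_path(G, "sub", n)]
    print("cannot restore these buses:", unserved)
```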

  15. Software distribution using xnetlib

    SciTech Connect

    Dongarra, J.J.; Rowan, T.H.; Wade, R.C.

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  16. Advanced Distribution Management System

    NASA Astrophysics Data System (ADS)

    Avazov, Artur R.; Sobinova, Liubov A.

    2016-02-01

    This article describes the advisability of using advanced distribution management systems in the electricity distribution networks area and considers premises of implementing ADMS within the Smart Grid era. Also, it gives the big picture of ADMS and discusses the ADMS advantages and functionalities.

  17. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  18. Distribution and Marketing Syllabus.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Secondary Curriculum Development.

    The distributive education program for grades 7 to 12 is organized around three career education phases: the career education phase (grades 7-10), the distributive phase (grade 11), and the competency cluster phase (grade 12). The grade 11 syllabus provides a six-page introduction which covers scheduling, cooperative work experience, the school…

  19. Wigner distributions for qudits

    SciTech Connect

    Chaturvedi, S.

    2006-11-15

    Two new approaches to the problem of setting up Wigner distributions for finite level quantum systems are proposed. Both arise by looking at the structure of the familiar Wigner distribution for Cartesian quantum mechanics from different perspectives. The two approaches have one common feature: each involves a 'square root' operation, though of very different kinds.

  20. Groundwater and Distribution Workbook.

    ERIC Educational Resources Information Center

    Ekman, John E.

    Presented is a student manual designed for the Wisconsin Vocational, Technical and Adult Education Groundwater and Distribution Training Course. This program introduces waterworks operators-in-training to basic skills and knowledge required for the operation of a groundwater distribution waterworks facility. Arranged according to the general order…

  1. Distributed sensor networks

    NASA Astrophysics Data System (ADS)

    Lacoss, Richard T.

    1986-09-01

    The Distributed Sensor Networks (DSN) program was aimed at developing distributed target surveillance and tracking methods for systems employing multiple spatially distributed sensors and processing resources. Such systems would be made up of sensors, data bases, and processors distributed throughout an area and interconnected by an appropriate digital data communication system. The hypothesis of the program was that through netting and distributed processing, the information from many sensors could be combined to yield effective surveillance systems. The overall concept called for a mix of sensor types as well as geographically distributed sensors. Surveillance and tracking of low-flying aircraft with ground-based acoustic and imaging sensors was used to develop and evaluate DSN concepts in the light of a specific problem. An experimental DSN testbed system was developed and has been used to test and demonstrate DSN techniques. Small arrays of microphones providing directional information were employed as acoustic sensors and visible TV cameras were used as imaging sensors in the testbed system. The primary accomplishment during this final report period was the demonstration of distributed real time tracking using both TV and acoustic sensors. Tracking was implemented as a geographically decentralized confederacy of autonomous cooperating nodes. Thus the feasibility of this organization has been established for a DSN system containing multiple sensor types as well as distributed nodes.

  2. DSIM: A distributed simulator

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.; Iyer, Ravishankar K.

    1990-01-01

    Discrete event-driven simulation makes it possible to model a computer system in detail. However, such simulation models can require a significant time to execute. This is especially true when modeling large parallel or distributed systems containing many processors and a complex communication network. One solution is to distribute the simulation over several processors. If enough parallelism is achieved, large simulation models can be efficiently executed. This study proposes a distributed simulator called DSIM which can run on various architectures. A simulated test environment is used to verify and characterize the performance of DSIM. The results of the experiments indicate that speedup is application-dependent and, in DSIM's case, is also dependent on how the simulation model is distributed among the processors. Furthermore, the experiments reveal that the communication overhead of ethernet-based distributed systems makes it difficult to achieve reasonable speedup unless the simulation model is computation bound.

  3. Plasma Sheet Energy Distributions

    NASA Astrophysics Data System (ADS)

    Sotirelis, T.; Lee, A. R.; Newell, P. T.

    2009-12-01

    Energy spectra of electrons and ions, as observed by DMSP, are fit to various distributions. The goal is to characterize the inner edge of the plasma sheet, so the focus is on large scale plasma sheet properties. Lower energy electron populations are ignored as they appear to be small-scale transients. Maxwellian, kappa and power-law distributed spectra are considered. Non-thermal ion distributions appear with greater frequency than anticipated. In order to be thermally distributed the differential energy flux must rise with a slope of ~2 toward a peak, after which the flux should fall sharply. The figure shows an apparently non-thermal ion distribution, together with a Maxwellian fit. The results from fits for one full year are presented.
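
    A minimal sketch of the kind of spectral fit described above, using a synthetic spectrum rather than DMSP data: for a Maxwellian, the differential energy flux rises as E^2 below the peak, which is the slope-of-~2 criterion mentioned in the record. All parameters here are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def maxwellian_eflux(E, A, E0):
    """Differential energy flux of a Maxwellian: rises as E^2 at low energy
    (slope ~2 in log-log), peaks near 2*E0, then falls exponentially."""
    return A * E**2 * np.exp(-E / E0)

# Hypothetical ion spectrum (energies in keV, flux in arbitrary units).
rng = np.random.default_rng(2)
E = np.logspace(-1, 1.5, 25)                          # 0.1 to ~30 keV channels
true = maxwellian_eflux(E, A=4.0, E0=3.0)
flux = true * rng.lognormal(0.0, 0.15, size=E.size)   # multiplicative noise

popt, _ = curve_fit(maxwellian_eflux, E, flux, p0=(1.0, 1.0))
print("fitted A=%.2f, characteristic energy E0=%.2f keV" % tuple(popt))
```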

  4. Distributed Propulsion Vehicles

    NASA Technical Reports Server (NTRS)

    Kim, Hyun Dae

    2010-01-01

    Since the introduction of large jet-powered transport aircraft, the majority of these vehicles have been designed by placing thrust-generating engines either under the wings or on the fuselage to minimize aerodynamic interactions on the vehicle operation. However, advances in computational and experimental tools along with new technologies in materials, structures, and aircraft controls, etc. are enabling a high degree of integration of the airframe and propulsion system in aircraft design. The National Aeronautics and Space Administration (NASA) has been investigating a number of revolutionary distributed propulsion vehicle concepts to increase aircraft performance. The concept of distributed propulsion is to fully integrate a propulsion system within an airframe such that the aircraft takes full synergistic benefit of the coupling of airframe aerodynamics and the propulsion thrust stream by distributing thrust using many propulsors on the airframe. Some of the concepts are based on the use of distributed jet flaps, distributed small multiple engines, gas-driven multi-fans, mechanically driven multifans, cross-flow fans, and electric fans driven by turboelectric generators. This paper describes some early concepts of the distributed propulsion vehicles and the current turboelectric distributed propulsion (TeDP) vehicle concepts being studied under the NASA's Subsonic Fixed Wing (SFW) Project to drastically reduce aircraft-related fuel burn, emissions, and noise by the year 2030 to 2035.

  5. Long distance entanglement distribution

    NASA Astrophysics Data System (ADS)

    Broadfoot, Stuart Graham

    Developments in the interdisciplinary field of quantum information open up previously impossible abilities in the realms of information processing and communication. Quantum entanglement has emerged as one property of quantum systems that acts as a resource for quantum information processing and, in particular, enables teleportation and secure cryptography. Therefore, the creation of entangled resources is of key importance for the application of these technologies. Despite a great deal of research, the efficient creation of entanglement over long distances is limited by inevitable noise. This problem can be overcome by creating entanglement between nodes in a network and then performing operations to distribute the entanglement over a long distance. This thesis contributes to the field of entanglement distribution within such quantum networks. Entanglement distribution has been extensively studied for one-dimensional networks resulting in "quantum repeater" protocols. However, little work has been done on higher dimensional networks. In these networks a fundamentally different scaling, called "long distance entanglement distribution", can appear between the resources and the distance separating the systems to be entangled. I reveal protocols that enable long distance entanglement distribution for quantum networks composed of mixed states and identify a few limitations on the capabilities of entanglement distribution. To aid in the implementation of all entanglement distribution protocols, I finish by introducing a new system, composed of an optical nanofibre coupled to a carbon nanotube, that may enable new forms of photo-detectors and quantum memories.

  6. Distributional Learning of Appearance

    PubMed Central

    Griffin, Lewis D.; Wahab, M. Husni; Newell, Andrew J.

    2013-01-01

    Opportunities for associationist learning of word meaning, where a word is heard or read contemporaneously with information being available on its meaning, are considered too infrequent to account for the rate of language acquisition in children. It has been suggested that additional learning could occur in a distributional mode, where information is gleaned from the distributional statistics (word co-occurrence etc.) of natural language. Such statistics are relevant to meaning because of the Distributional Principle that ‘words of similar meaning tend to occur in similar contexts’. Computational systems, such as Latent Semantic Analysis, have substantiated the viability of distributional learning of word meaning, by showing that semantic similarities between words can be accurately estimated from analysis of the distributional statistics of a natural language corpus. We consider whether appearance similarities can also be learnt in a distributional mode. As grounds for such a mode we advance the Appearance Hypothesis that ‘words with referents of similar appearance tend to occur in similar contexts’. We assess the viability of such learning by looking at the performance of a computer system that interpolates, on the basis of distributional and appearance similarity, from words that it has been explicitly taught the appearance of, in order to identify and name objects that it has not been taught about. Our experiment tests the system with a set of 660 simple concrete nouns. Appearance information on words is modelled using sets of images of examples of the word. Distributional similarity is computed from a standard natural language corpus. Our computational results support the viability of distributional learning of appearance. PMID:23460927
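
    The Distributional Principle can be illustrated with a toy corpus (the study itself uses a standard natural language corpus and image sets): each word is represented by its co-occurrence vector and words are compared by cosine similarity.

```python
import numpy as np
from itertools import combinations

# A toy corpus standing in for a natural-language corpus.
sentences = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
    "a plane flew over the city",
    "a jet flew over the town",
]

vocab = sorted({w for s in sentences for w in s.split()})
index = {w: i for i, w in enumerate(vocab)}
cooc = np.zeros((len(vocab), len(vocab)))

for s in sentences:                       # sentence-level co-occurrence counts
    words = set(s.split())
    for w1, w2 in combinations(words, 2):
        cooc[index[w1], index[w2]] += 1
        cooc[index[w2], index[w1]] += 1

def similarity(a, b):
    """Cosine similarity between the co-occurrence vectors of two words."""
    va, vb = cooc[index[a]], cooc[index[b]]
    return va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-12)

print("cat ~ dog:  %.2f" % similarity("cat", "dog"))
print("cat ~ jet:  %.2f" % similarity("cat", "jet"))
```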

  7. Beam distributions beyond RMS

    SciTech Connect

    Decker, F.

    1995-05-05

    The beam is often represented only by its position (mean) and the width (rms = root mean squared) of its distribution. To achieve these beam parameters in a noisy condition with high backgrounds, a Gaussian distribution with offset (4 parameters) is fitted to the measured beam distribution. This gives a very robust answer and is not very sensitive to background subtraction techniques. To get higher moments of the distribution, like skew or kurtosis, a fitting function with one or two more parameters is desired which would model the higher moments. In this paper we will concentrate on an Asymmetric Gaussian and a Super Gaussian function that will give something like the skew and the kurtosis of the distribution. This information is used to quantify special beam distributions. Some are unwanted, like beam tails (skew) from transverse wakefields, higher order dispersive aberrations or potential well distortion in a damping ring. A negative kurtosis of a beam distribution describes a more rectangular, compact shape, like with an over-compressed beam in z or a nearly double-horned energy distribution, while a positive kurtosis looks more like a "Christmas tree" and can quantify a beam mismatch after filamentation. Besides the advantages of the quantification, there are some distributions which need further investigation, like long flat tails which create background particles in a detector. In particle simulations on the other hand a simple rms number might grossly overestimate the effective size (e.g. for producing luminosity) due to a few particles which are far away from the core. This can reduce the practical gain of a big theoretical improvement in the beam size. © 1995 American Institute of Physics.
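
    A minimal sketch of fitting an asymmetric Gaussian with offset to a synthetic beam profile. The particular functional form below (widths differing on either side of the peak) is a common choice and is assumed here, not taken from the paper; the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def asym_gauss(x, amp, mu, sigma, asym, offset):
    """Gaussian with offset whose width differs on either side of the peak;
    the asymmetry parameter plays a role similar to skew."""
    s = sigma * (1.0 + asym * np.sign(x - mu))
    return amp * np.exp(-0.5 * ((x - mu) / s) ** 2) + offset

# Hypothetical beam profile with a tail on the positive side plus background.
rng = np.random.default_rng(3)
x = np.linspace(-5, 5, 200)
y = asym_gauss(x, 100.0, 0.2, 1.0, 0.3, 5.0) + rng.normal(0, 2, x.size)

popt, _ = curve_fit(
    asym_gauss, x, y, p0=(80, 0, 1, 0, 0),
    bounds=([0, -5, 1e-3, -0.9, -np.inf], [np.inf, 5, np.inf, 0.9, np.inf]),
)
print("amp=%.1f mu=%.2f sigma=%.2f asym=%.2f offset=%.1f" % tuple(popt))
```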

  8. Beam distributions beyond RMS

    SciTech Connect

    Decker, F.J.

    1994-09-01

    The beam is often represented only by its position (mean) and the width (rms = root mean squared) of its distribution. To achieve these beam parameters in a noisy condition with high backgrounds, a Gaussian distribution with offset (4 parameters) is fitted to the measured beam distribution. This gives a very robust answer and is not very sensitive to background subtraction techniques. To get higher moments of the distribution, like skew or kurtosis, a fitting function with one or two more parameters is desired which would model the higher moments. In this paper we will concentrate on an Asymmetric Gaussian and a Super Gaussian function that will give something like the skew and the kurtosis of the distribution. This information is used to quantify special beam distributions. Some are unwanted, like beam tails (skew) from transverse wakefields, higher order dispersive aberrations or potential well distortion in a damping ring. A negative kurtosis of a beam distribution describes a more rectangular, compact shape, like with an over-compressed beam in z or a nearly double-horned energy distribution, while a positive kurtosis looks more like a "Christmas tree" and can quantify a beam mismatch after filamentation. Besides the advantages of the quantification, there are some distributions which need further investigation, like long flat tails which create background particles in a detector. In particle simulations on the other hand a simple rms number might grossly overestimate the effective size (e.g. for producing luminosity) due to a few particles which are far away from the core. This can reduce the practical gain of a big theoretical improvement in the beam size.

  9. Technologies for distributed defense

    NASA Astrophysics Data System (ADS)

    Seiders, Barbara; Rybka, Anthony

    2002-07-01

    For Americans, the nature of warfare changed on September 11, 2001. Our national security henceforth will require distributed defense. One extreme of distributed defense is represented by fully deployed military troops responding to a threat from a hostile nation state. At the other extreme is a country of 'citizen soldiers', with families and communities securing their common defense through heightened awareness, engagement as good neighbors, and local support of and cooperation with local law enforcement, emergency and health care providers. Technologies - for information exploitation, biological agent detection, health care surveillance, and security - will be critical to ensuring success in distributed defense.

  10. Technologies for Distributed Defense

    SciTech Connect

    Seiders, Barbara AB; Rybka, Anthony J.

    2002-07-01

    For Americans, the nature of warfare changed on September 11, 2001. Our national security henceforth will require distributed defense. One extreme of distributed defense is represented by fully deployed military troops responding to a threat from a hostile nation state. At the other extreme is a country of "citizen soldiers," with families and communities securing their common defense through heightened awareness, engagement as good neighbors, and local support of and cooperation with local law enforcement, emergency and health care providers. Technologies - for information exploitation, biological agent detection, health care surveillance, and security - will be critical to ensuring success in distributed defense.

  11. Spatial and seasonal dynamic of abundance and distribution of guanaco and livestock: insights from using density surface and null models.

    PubMed

    Schroeder, Natalia M; Matteucci, Silvia D; Moreno, Pablo G; Gregorio, Pablo; Ovejero, Ramiro; Taraborelli, Paula; Carmanchahi, Pablo D

    2014-01-01

    Monitoring species abundance and distribution is a prerequisite when assessing species status and population viability, a difficult task to achieve for large herbivores at ecologically meaningful scales. Co-occurrence patterns can be used to infer mechanisms of community organization (such as biotic interactions), although it has been traditionally applied to binary presence/absence data. Here, we combine density surface and null models of abundance data as a novel approach to analyze the spatial and seasonal dynamics of abundance and distribution of guanacos (Lama guanicoe) and domestic herbivores in northern Patagonia, in order to visually and analytically compare the dispersion and co-occurrence pattern of ungulates. We found a marked seasonal pattern in abundance and spatial distribution of L. guanicoe. The guanaco population reached its maximum annual size and spatial dispersion in spring-summer, decreasing up to 6.5 times in size and occupying few sites of the study area in fall-winter. These results are evidence of the seasonal migration process of guanaco populations, an increasingly rare event for terrestrial mammals worldwide. The maximum number of guanacos estimated for spring (25,951) is higher than the total population size (10,000) 20 years ago, probably due to both counting methodology and population growth. Livestock were mostly distributed near human settlements, as expected by the sedentary management practiced by local people. Herbivore distribution was non-random; i.e., guanaco and livestock abundances co-varied negatively in all seasons, more than expected by chance. Segregation degree of guanaco and small-livestock (goats and sheep) was comparatively stronger than that of guanaco and large-livestock, suggesting a competition mechanism between ecologically similar herbivores, although various environmental factors could also contribute to habitat segregation. The new and compelling combination of methods used here is highly useful for
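
    The co-occurrence null-model logic can be sketched as a simple permutation test on hypothetical site-level abundances. Real null models for abundance data use more constrained randomizations, so this is only an illustration of the idea of comparing observed covariation with a randomized baseline.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical abundance counts at 12 survey sites.
guanaco   = np.array([40, 35, 5, 0, 28, 2, 50, 1, 33, 0, 4, 45])
livestock = np.array([ 2,  0, 30, 42, 5, 38, 0, 25, 3, 40, 33, 1])

def covariation(a, b):
    return np.corrcoef(a, b)[0, 1]

obs = covariation(guanaco, livestock)

# Null model: shuffle livestock counts across sites, keeping totals fixed.
null = np.array([covariation(guanaco, rng.permutation(livestock))
                 for _ in range(10_000)])
p = np.mean(null <= obs)          # one-sided: stronger negative covariation
print(f"observed r={obs:.2f}, P(null <= observed)={p:.4f}")
```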

  12. Spatial and Seasonal Dynamic of Abundance and Distribution of Guanaco and Livestock: Insights from Using Density Surface and Null Models

    PubMed Central

    Schroeder, Natalia M.; Matteucci, Silvia D.; Moreno, Pablo G.; Gregorio, Pablo; Ovejero, Ramiro; Taraborelli, Paula; Carmanchahi, Pablo D.

    2014-01-01

    Monitoring species abundance and distribution is a prerequisite when assessing species status and population viability, a difficult task to achieve for large herbivores at ecologically meaningful scales. Co-occurrence patterns can be used to infer mechanisms of community organization (such as biotic interactions), although it has been traditionally applied to binary presence/absence data. Here, we combine density surface and null models of abundance data as a novel approach to analyze the spatial and seasonal dynamics of abundance and distribution of guanacos (Lama guanicoe) and domestic herbivores in northern Patagonia, in order to visually and analytically compare the dispersion and co-occurrence pattern of ungulates. We found a marked seasonal pattern in abundance and spatial distribution of L. guanicoe. The guanaco population reached its maximum annual size and spatial dispersion in spring-summer, decreasing up to 6.5 times in size and occupying few sites of the study area in fall-winter. These results are evidence of the seasonal migration process of guanaco populations, an increasingly rare event for terrestrial mammals worldwide. The maximum number of guanacos estimated for spring (25951) is higher than the total population size (10000) 20 years ago, probably due to both counting methodology and population growth. Livestock were mostly distributed near human settlements, as expected by the sedentary management practiced by local people. Herbivore distribution was non-random; i.e., guanaco and livestock abundances co-varied negatively in all seasons, more than expected by chance. Segregation degree of guanaco and small-livestock (goats and sheep) was comparatively stronger than that of guanaco and large-livestock, suggesting a competition mechanism between ecologically similar herbivores, although various environmental factors could also contribute to habitat segregation. The new and compelling combination of methods used here is highly useful for researchers

  13. Financing Distributed Generation

    SciTech Connect

    Walker, A.

    2001-06-29

    This paper introduces the engineer who is undertaking distributed generation projects to a wide range of financing options. Distributed generation systems (such as internal combustion engines, small gas turbines, fuel cells and photovoltaics) all require an initial investment, which is recovered over time through revenues or savings. An understanding of the cost of capital and financing structures helps the engineer develop realistic expectations and not be offended by the common requirements of financing organizations. This paper discusses several mechanisms for financing distributed generation projects: appropriations; debt (commercial bank loan); mortgage; home equity loan; limited partnership; vendor financing; general obligation bond; revenue bond; lease; Energy Savings Performance Contract; utility programs; chauffage (end-use purchase); and grants. The paper also discusses financial strategies for businesses focusing on distributed generation: venture capital; informal investors ("business angels"); bank and debt financing; and the stock market.

  14. Ticks: Geographic Distribution

    MedlinePlus

    ... Atlas. Lone star tick (Amblyomma americanum). Where found: Widely distributed in ... is distinguished by a white dot or “lone star” on her back. Lone star tick saliva can ...

  15. Estimating Bias Error Distributions

    NASA Technical Reports Server (NTRS)

    Liu, Tian-Shu; Finley, Tom D.

    2001-01-01

    This paper formulates the general methodology for estimating the bias error distribution of a device in a measuring domain from less accurate measurements when a minimal number of standard values (typically two values) are available. A new perspective is that the bias error distribution can be found as a solution of an intrinsic functional equation in a domain. Based on this theory, the scaling- and translation-based methods for determining the bias error distribution are developed. These methods are virtually applicable to any device as long as the bias error distribution of the device can be sufficiently described by a power series (a polynomial) or a Fourier series in a domain. These methods have been validated through computational simulations and laboratory calibration experiments for a number of different devices.

  16. DOLIB: Distributed Object Library

    SciTech Connect

    D'Azevedo, E.F.; Romine, C.H.

    1994-10-01

    This report describes the use and implementation of DOLIB (Distributed Object Library), a library of routines that emulates global or virtual shared memory on Intel multiprocessor systems. Access to a distributed global array is through explicit calls to gather and scatter. Advantages of using DOLIB include: dynamic allocation and freeing of huge (gigabyte) distributed arrays, both C and FORTRAN callable interfaces, and the ability to mix shared-memory and message-passing programming models for ease of use and optimal performance. DOLIB is independent of language and compiler extensions and requires no special operating system support. DOLIB also supports automatic caching of read-only data for high performance. The virtual shared memory support provided in DOLIB is well suited for implementing Lagrangian particle tracking techniques. We have also used DOLIB to create DONIO (Distributed Object Network I/O Library), which obtains over a 10-fold improvement in disk I/O performance on the Intel Paragon.

  17. Parton Distributions Working Group

    SciTech Connect

    de Barbaro, L.; Keller, S. A.; Kuhlmann, S.; Schellman, H.; Tung, W.-K.

    2000-07-20

    This report summarizes the activities of the Parton Distributions Working Group of the QCD and Weak Boson Physics workshop held in preparation for Run II at the Fermilab Tevatron. The main focus of this working group was to investigate the different issues associated with the development of quantitative tools to estimate parton distribution functions uncertainties. In the conclusion, the authors introduce a Manifesto that describes an optimal method for reporting data.

  18. Polygamy of distributed entanglement

    NASA Astrophysics Data System (ADS)

    Buscemi, Francesco; Gour, Gilad; Kim, Jeong San

    2009-07-01

    While quantum entanglement is known to be monogamous (i.e., shared entanglement is restricted in multipartite settings), here we show that distributed entanglement (or the potential for entanglement) is by nature polygamous. By establishing the concept of one-way unlocalizable entanglement (UE) and investigating its properties, we provide a polygamy inequality of distributed entanglement in tripartite quantum systems of arbitrary dimension. We also provide a polygamy inequality in multiqubit systems and several trade-offs between UE and other correlation measures.

  19. Sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Kanerva, Pentti

    1988-01-01

    Theoretical models of the human brain and proposed neural-network computers are developed analytically. Chapters are devoted to the mathematical foundations, background material from computer science, the theory of idealized neurons, neurons as address decoders, and the search of memory for the best match. Consideration is given to sparse memory, distributed storage, the storage and retrieval of sequences, the construction of distributed memory, and the organization of an autonomous learning system.
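
    A minimal sketch of Kanerva's sparse distributed memory with illustrative sizes: a word is written into counters at every hard location within a Hamming radius of its address and read back by summing and thresholding those counters, which tolerates a noisy retrieval cue. The dimensions and radius below are assumptions chosen only to make the demonstration run quickly.

```python
import numpy as np

rng = np.random.default_rng(5)
N, M, RADIUS = 256, 2000, 111        # word length, hard locations, Hamming radius

hard_addresses = rng.integers(0, 2, size=(M, N))   # fixed random addresses
counters = np.zeros((M, N), dtype=int)             # one counter per location/bit

def _activated(address):
    """Boolean mask of hard locations within the Hamming radius of an address."""
    return np.count_nonzero(hard_addresses != address, axis=1) <= RADIUS

def write(address, word):
    sel = _activated(address)
    counters[sel] += np.where(word == 1, 1, -1)    # +1 for 1-bits, -1 for 0-bits

def read(address):
    sel = _activated(address)
    return (counters[sel].sum(axis=0) > 0).astype(int)

pattern = rng.integers(0, 2, size=N)
write(pattern, pattern)                            # autoassociative storage

noisy = pattern.copy()
flip = rng.choice(N, size=20, replace=False)       # corrupt 20 of 256 bits
noisy[flip] ^= 1

recovered = read(noisy)
print("bits recovered correctly:", int((recovered == pattern).sum()), "/", N)
```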

  20. Distribution system simulator

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Kirkham, H.; Rahman, S.

    1986-01-01

    In a series of tests performed under the auspices of the Department of Energy, power line carrier propagation was observed to be anomalous under certain circumstances. To investigate the cause, a distribution system simulator was constructed. The simulator was a physical simulator that accurately represented the distribution system from below power frequency to above 50 kHz. Effects such as phase-to-phase coupling and skin effect were modeled. Construction details of the simulator and experimental results from its use are presented.

  1. Sparse distributed memory

    SciTech Connect

    Kanerva, P.

    1988-01-01

    Theoretical models of the human brain and proposed neural-network computers are developed analytically. Chapters are devoted to the mathematical foundations, background material from computer science, the theory of idealized neurons, neurons as address decoders, and the search of memory for the best match. Consideration is given to sparse memory, distributed storage, the storage and retrieval of sequences, the construction of distributed memory, and the organization of an autonomous learning system. 63 refs.

  2. Polygamy of distributed entanglement

    SciTech Connect

    Buscemi, Francesco; Gour, Gilad; Kim, Jeong San

    2009-07-15

    While quantum entanglement is known to be monogamous (i.e., shared entanglement is restricted in multipartite settings), here we show that distributed entanglement (or the potential for entanglement) is by nature polygamous. By establishing the concept of one-way unlocalizable entanglement (UE) and investigating its properties, we provide a polygamy inequality of distributed entanglement in tripartite quantum systems of arbitrary dimension. We also provide a polygamy inequality in multiqubit systems and several trade-offs between UE and other correlation measures.

  3. NUV MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2009-07-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as COS SMOV proposal 13555 {visit 5}.

  4. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2010-09-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Analysis {11863} during Cycle 17.

  5. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2009-07-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Analysis {10035} during Cycle 12.

  6. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2011-10-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Analysis, Proposal 12416, during Cycle 18.

  7. Study of 500 patients with limb joint osteoarthritis. I. Analysis by age, sex, and distribution of symptomatic joint sites.

    PubMed Central

    Cushnaghan, J; Dieppe, P

    1991-01-01

    Five hundred subjects with symptomatic limb joint osteoarthritis, who had been referred to a rheumatologist, were enrolled into a continuing study. They comprised 342 women (mean age 65.3) and 158 men (mean age 59.7), with a mean symptom duration of 15.4 years at entry. Only 31 patients (6%) had symptomatic osteoarthritis of one joint alone; however, in a further 205 (41%) the disease was limited to one site. One hundred and eighty two (36.4%) had two sites affected and 82 (16.4%) three or more sites of symptomatic osteoarthritis. Of 847 affected joints the most commonly involved were 349 (41.2%) knees, 254 (30%) hands, and 161 (19%) hips. Hip disease stood out as a separate entity, often occurring alone, and having a stronger male preponderance and different associations than osteoarthritis at other joint sites. Knee and hand disease were significantly associated in women. Obesity, hypertension, and Heberden's nodes were common. The number of sites affected, as well as the distribution, was strongly related to age as well as sex, suggesting that polyarticular osteoarthritis arises from slow acquisition of new joint sites in a non-random distribution. 'Generalised' osteoarthritis did not emerge as a distinct entity. PMID:1994877

  8. The isotopic distribution conundrum.

    PubMed

    Valkenborg, Dirk; Mertens, Inge; Lemière, Filip; Witters, Erwin; Burzykowski, Tomasz

    2012-01-01

    Although access to high-resolution mass spectrometry (MS), especially in the field of biomolecular MS, is becoming readily available due to recent advances in MS technology, the accompanied information on isotopic distribution in high-resolution spectra is not used at its full potential, mainly because of lack of knowledge and/or awareness. In this review, we give an insight into the practical problems related to calculating the isotopic distribution for large biomolecules, and present an overview of methods for the calculation of the isotopic distribution. We discuss the key events that triggered the development of various algorithms and explain the rationale of how and why the various isotopic-distribution calculations were performed. The review is focused around the developmental stages as briefly outlined below, starting with the first observation of an isotopic distribution. The observations of Beynon in the field of organic MS that chlorine appeared in a mass spectrum as two variants with odds 3:1 lie at the basis of the first wave of algorithms for the calculation of the isotopic distribution, based on the atomic composition of a molecule. From here on, we explain why more complex biomolecules such as peptides exhibit a highly complex isotope pattern when assayed by MS, and we discuss how combinatorial difficulties complicate the calculation of the isotopic distribution on computers. For this purpose, we highlight three methods, which were introduced in the 1980s. These are the stepwise procedure introduced by Kubinyi, the polynomial expansion from Brownawell and Fillippo, and the multinomial expansion from Yergey. The next development was instigated by Rockwood, who suggested to decompose the isotopic distribution in terms of their nucleon count instead of the exact mass. In this respect, we could claim that the term "aggregated" isotopic distribution is more appropriate. Due to the simplification of the isotopic distribution to its aggregated counterpart
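
    The polynomial-expansion view of the aggregated isotopic distribution amounts to convolving per-atom isotope distributions indexed by extra nucleon count. A minimal sketch follows, using standard natural abundances and a hypothetical small-peptide composition chosen purely for illustration.

```python
import numpy as np

# Isotope abundances ordered by extra nucleons (+0, +1, +2) relative to the
# lightest isotope; values are standard natural abundances.
ISOTOPES = {
    "C": [0.9893, 0.0107],
    "H": [0.999885, 0.000115],
    "N": [0.99636, 0.00364],
    "O": [0.99757, 0.00038, 0.00205],
}

def aggregated_isotope_distribution(formula):
    """Probability of each +n nucleon variant, obtained by repeatedly
    convolving per-atom isotope distributions (each atom contributes one
    factor of the polynomial; convolution multiplies the factors out)."""
    dist = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):
            dist = np.convolve(dist, ISOTOPES[element])
    return dist

# Hypothetical peptide composition used purely as an illustration.
peptide = {"C": 34, "H": 53, "N": 7, "O": 15}
dist = aggregated_isotope_distribution(peptide)
for k, p in enumerate(dist[:5]):
    print(f"M+{k}: {p:.4f}")
```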

  9. Distributed replica dynamics

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Chill, Samuel T.; Henkelman, Graeme

    2015-11-01

    A distributed replica dynamics (DRD) method is proposed to calculate rare-event molecular dynamics using distributed computational resources. Similar to Voter's parallel replica dynamics (PRD) method, the dynamics of independent replicas of the system are calculated on different computational clients. In DRD, each replica runs molecular dynamics from an initial state for a fixed simulation time and then reports information about the trajectory back to the server. A simulation clock on the server accumulates the simulation time of each replica until one reports a transition to a new state. Subsequent calculations are initiated from within this new state and the process is repeated to follow the state-to-state evolution of the system. DRD is designed to work with asynchronous and distributed computing resources in which the clients may not be able to communicate with each other. Additionally, clients can be added or removed from the simulation at any point in the calculation. Even with heterogeneous computing clients, we prove that the DRD method reproduces the correct probability distribution of escape times. We also show this correspondence numerically; molecular dynamics simulations of Al(100) adatom diffusion using PRD and DRD give consistent exponential distributions of escape times. Finally, we discuss guidelines for choosing the optimal number of replicas and replica trajectory length for the DRD method.
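
    The following is a minimal sketch of the server-side clock bookkeeping described above, with the client-side molecular dynamics replaced by a memoryless escape process (the regime in which replica methods apply). The rate constant, segment length, and replica count are illustrative assumptions; this is not the authors' implementation.

```python
import random

# Minimal sketch of the simulation-clock bookkeeping used in replica approaches:
# each replica runs for a fixed segment and reports back; the server accumulates
# simulation time until one replica reports a transition.  Real usage runs
# molecular dynamics on each client; here escapes are drawn from a memoryless
# (exponential) process, for which the accumulated escape times remain exponential.
RATE = 0.02          # assumed escape rate per unit simulation time
SEGMENT = 5.0        # fixed trajectory length each replica runs before reporting

def run_segment(rng):
    """Return the time of a transition within the segment, or None."""
    t = rng.expovariate(RATE)
    return t if t <= SEGMENT else None

def escape_time(n_replicas, rng):
    """Accumulate simulation time across replicas until one reports a transition."""
    clock = 0.0
    while True:
        for _ in range(n_replicas):
            t = run_segment(rng)
            if t is not None:
                return clock + t
            clock += SEGMENT

if __name__ == "__main__":
    rng = random.Random(1)
    times = [escape_time(8, rng) for _ in range(20000)]
    mean = sum(times) / len(times)
    print(f"mean escape time: {mean:.1f} (expected ~{1.0 / RATE:.1f})")
```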

  10. Sparse distributed memory overview

    NASA Technical Reports Server (NTRS)

    Raugh, Mike

    1990-01-01

    The Sparse Distributed Memory (SDM) project is investigating the theory and applications of a massively parallel computing architecture, called sparse distributed memory, that will support the storage and retrieval of sensory and motor patterns characteristic of autonomous systems. The immediate objectives of the project are centered in studies of the memory itself and in the use of the memory to solve problems in speech, vision, and robotics. Investigation of methods for encoding sensory data is an important part of the research. Examples of NASA missions that may benefit from this work are Space Station, planetary rovers, and solar exploration. Sparse distributed memory offers promising technology for systems that must learn through experience and be capable of adapting to new circumstances, and for operating any large complex system requiring automatic monitoring and control. Sparse distributed memory is a massively parallel architecture motivated by efforts to understand how the human brain works. Sparse distributed memory is an associative memory, able to retrieve information from cues that only partially match patterns stored in the memory. It is able to store long temporal sequences derived from the behavior of a complex system, such as progressive records of the system's sensory data and correlated records of the system's motor controls.
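
    A minimal sketch of a Kanerva-style sparse distributed memory of the kind described above: random hard locations, activation by Hamming distance, and bipolar counters read out by majority vote. The dimensions, activation radius, and noise level are illustrative assumptions, and the sketch uses numpy.

```python
import numpy as np

# Minimal sketch of a Kanerva-style sparse distributed memory: random hard
# locations, activation by Hamming distance, and bipolar counters.
# Dimensions, radius, and location count are illustrative choices only.
rng = np.random.default_rng(0)
N_BITS, N_LOCATIONS, RADIUS = 256, 2000, 120

addresses = rng.integers(0, 2, size=(N_LOCATIONS, N_BITS), dtype=np.int8)
counters = np.zeros((N_LOCATIONS, N_BITS), dtype=np.int32)

def activated(address):
    """Indices of hard locations within Hamming RADIUS of the address."""
    dist = np.count_nonzero(addresses != address, axis=1)
    return np.flatnonzero(dist <= RADIUS)

def write(address, data):
    """Add the bipolar (+1/-1) data word to every activated location."""
    counters[activated(address)] += 2 * data.astype(np.int32) - 1

def read(cue):
    """Majority vote over the counters of the activated locations."""
    sums = counters[activated(cue)].sum(axis=0)
    return (sums > 0).astype(np.int8)

if __name__ == "__main__":
    pattern = rng.integers(0, 2, N_BITS, dtype=np.int8)
    write(pattern, pattern)                      # autoassociative storage
    noisy = pattern.copy()
    flips = rng.choice(N_BITS, size=20, replace=False)
    noisy[flips] ^= 1                            # corrupt 20 bits of the cue
    recovered = read(noisy)
    print("bits recovered correctly:", int((recovered == pattern).sum()), "/", N_BITS)
```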

  11. Break Point Distribution on Chromosome 3 of Human Epithelial Cells exposed to Gamma Rays, Neutrons and Fe Ions

    NASA Technical Reports Server (NTRS)

    Hada, M.; Saganti, P. B.; Gersey, B.; Wilkins, R.; Cucinotta, F. A.; Wu, H.

    2007-01-01

    Most of the reported studies of break point distribution on damaged chromosomes from radiation exposure were carried out with the G-banding technique or determined based on the relative length of the broken chromosomal fragments. However, these techniques lack accuracy in comparison with the later developed multicolor banding in situ hybridization (mBAND) technique that is generally used for analysis of intrachromosomal aberrations such as inversions. Using mBAND, we studied chromosome aberrations in human epithelial cells exposed in vitro to either low or high dose rate gamma rays in Houston, low dose rate secondary neutrons at Los Alamos National Laboratory, and high dose rate 600 MeV/u Fe ions at the NASA Space Radiation Laboratory. Detailed analysis of the inversion type revealed that all three radiation types induced a low incidence of simple inversions. Half of the inversions observed after neutron or Fe ion exposure, and the majority of inversions in gamma-irradiated samples, were accompanied by other types of intrachromosomal aberrations. In addition, neutrons and Fe ions induced a significant fraction of inversions that involved complex rearrangements of both inter- and intrachromosome exchanges. We further compared the distribution of break points on chromosome 3 for the three radiation types. The break points were found to be randomly distributed on chromosome 3 after neutron or Fe ion exposure, whereas a non-random distribution with clustered break points was observed for gamma rays. The break point distribution may serve as a potential fingerprint of high-LET radiation exposure.

  12. Distributed Wind Market Applications

    SciTech Connect

    Forsyth, T.; Baring-Gould, I.

    2007-11-01

    Distributed wind energy systems provide clean, renewable power for on-site use and help relieve pressure on the power grid while providing jobs and contributing to energy security for homes, farms, schools, factories, private and public facilities, distribution utilities, and remote locations. America pioneered small wind technology in the 1920s, and it is the only renewable energy industry segment that the United States still dominates in technology, manufacturing, and world market share. The series of analyses covered by this report was conducted to assess some of the most likely ways that advanced wind turbines could be utilized apart from large, central station power systems. Each chapter represents a final report on specific market segments written by leading experts in this field. As such, this document does not speak with one voice but is rather a compendium of different perspectives documented by a variety of people in the U.S. distributed wind field.

  13. Partonic Transverse Momentum Distributions

    SciTech Connect

    Rossi, Patrizia

    2010-08-04

    In recent years parton distributions have been generalized to also account for transverse degrees of freedom, and new sets of more general distributions, Transverse Momentum Dependent (TMD) parton distributions and fragmentation functions, were introduced. Different experiments worldwide (HERMES, COMPASS, CLAS, JLab-Hall A) have measurements of TMDs in semi-inclusive DIS processes as one of their main research focuses. TMD studies are also an important part of the present and future Drell-Yan experiments at RHIC, J-PARC, and GSI. Studies of TMDs are also one of the main driving forces of the Jefferson Lab (JLab) 12 GeV upgrade project. Progress in phenomenology and theory is flourishing as well. In this talk an overview of the latest developments in studies of TMDs will be given, and newly released results, ongoing activities, as well as planned near-term and future measurements will be discussed.

  14. Discrete Pearson distributions

    SciTech Connect

    Bowman, K.O.; Shenton, L.R.; Kastenbaum, M.A.

    1991-11-01

    These distributions are generated by a first order recursive scheme which equates the ratio of successive probabilities to the ratio of two corresponding quadratics. The use of a linearized form of this model will produce equations in the unknowns matched by an appropriate set of moments (assumed to exist). Given the moments we may find valid solutions. There are two cases: (1) distributions defined on the non-negative integers (finite or infinite) and (2) distributions defined on negative integers as well. For (1), given the first four moments, it is possible to set this up as equations of finite or infinite degree in the probability of a zero occurrence, the sth component being a product of s ratios of linear forms in this probability in general. For (2) the equation for the zero probability is purely linear but may involve slowly converging series; here a particular case is the discrete normal. Regions of validity are being studied. 11 refs.
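
    A minimal sketch of the recursive scheme described above: successive probabilities are generated from the ratio of two quadratics in s and then normalized. The coefficients are illustrative and not taken from the paper; the degenerate choice in the example recovers the Poisson law as a check.

```python
import math

# Sketch of the first-order recursive scheme described: successive probabilities
# satisfy p(s+1)/p(s) = N(s)/D(s) with N and D quadratic in s.  The degenerate
# choice N(s) = lam, D(s) = s + 1 recovers the Poisson law, used here as a check.
def pearson_discrete(num, den, n_max):
    """num, den: (a, b, c) giving a*s^2 + b*s + c.  Returns normalized p(0..n_max)."""
    q = lambda coef, s: coef[0] * s * s + coef[1] * s + coef[2]
    p = [1.0]
    for s in range(n_max):
        p.append(p[-1] * q(num, s) / q(den, s))
    total = sum(p)
    return [x / total for x in p]

if __name__ == "__main__":
    lam = 3.0
    probs = pearson_discrete(num=(0.0, 0.0, lam), den=(0.0, 1.0, 1.0), n_max=25)
    for k in range(5):
        exact = math.exp(-lam) * lam**k / math.factorial(k)
        print(f"k={k}: recursion {probs[k]:.5f}  Poisson {exact:.5f}")
```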

  15. Distributed data transmitter

    DOEpatents

    Brown, Kenneth Dewayne; Dunson, David

    2008-06-03

    A distributed data transmitter (DTXR) which is an adaptive data communication microwave transmitter having a distributable architecture of modular components, and which incorporates both digital and microwave technology to provide substantial improvements in physical and operational flexibility. The DTXR has application in, for example, remote data acquisition involving the transmission of telemetry data across a wireless link, wherein the DTXR is integrated into and utilizes available space within a system (e.g., a flight vehicle). In a preferred embodiment, the DTXR broadly comprises a plurality of input interfaces; a data modulator; a power amplifier; and a power converter, all of which are modularly separate and distinct so as to be substantially independently physically distributable and positionable throughout the system wherever sufficient space is available.

  16. Distributed data transmitter

    DOEpatents

    Brown, Kenneth Dewayne; Dunson, David

    2006-08-08

    A distributed data transmitter (DTXR) which is an adaptive data communication microwave transmitter having a distributable architecture of modular components, and which incorporates both digital and microwave technology to provide substantial improvements in physical and operational flexibility. The DTXR has application in, for example, remote data acquisition involving the transmission of telemetry data across a wireless link, wherein the DTXR is integrated into and utilizes available space within a system (e.g., a flight vehicle). In a preferred embodiment, the DTXR broadly comprises a plurality of input interfaces; a data modulator; a power amplifier; and a power converter, all of which are modularly separate and distinct so as to be substantially independently physically distributable and positionable throughout the system wherever sufficient space is available.

  17. Distributed Sensors Simulator

    2003-08-30

    The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for distributed sensor networks without the commitment inherent in using hardware. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness, and scaling issues; explore arbitrary algorithms for DSNs; and is particularly useful as a proof-of-concept tool. The user provides data on node location and specifications, defines event phenomena, and plugs in the application(s) to run. DSS in turn provides the virtual environmental embedding, but one that is exposed to the user as no true embedding could ever be.

  18. Program for standard statistical distributions

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1972-01-01

    Development of a procedure to describe frequency distributions involved in statistical theory is discussed. Representation of frequency distributions by a first-order differential equation is presented. Classification of various types of distributions based on the Pearson parameters is analyzed.
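
    As a hedged illustration of classification by Pearson parameters, the sketch below applies the standard textbook criterion kappa, computed from beta1 (squared skewness) and beta2 (kurtosis), to assign a Pearson type. This is the general Pearson-system criterion, not necessarily the exact procedure implemented in the program described above; tolerances and the worked example are illustrative.

```python
# Sketch of classifying a distribution within the Pearson system from its moments,
# using the standard criterion kappa computed from beta1 (squared skewness) and
# beta2 (kurtosis).  Tolerances and the worked example are illustrative only.
def pearson_type(beta1, beta2, tol=1e-9):
    denom_iii = 2.0 * beta2 - 3.0 * beta1 - 6.0
    if abs(denom_iii) < tol:
        return "Type III (gamma-like)"
    kappa = beta1 * (beta2 + 3.0) ** 2 / (4.0 * (4.0 * beta2 - 3.0 * beta1) * denom_iii)
    if abs(beta1) < tol:
        if abs(beta2 - 3.0) < tol:
            return "Normal"
        return "Type II" if beta2 < 3.0 else "Type VII (Student-like)"
    if kappa < 0.0:
        return "Type I (beta-like)"
    if abs(kappa - 1.0) < tol:
        return "Type V"
    return "Type IV" if kappa < 1.0 else "Type VI"

if __name__ == "__main__":
    # Exponential distribution: skewness 2 (beta1 = 4), kurtosis 9 (beta2 = 9),
    # which falls on the Type III (gamma) line.
    print(pearson_type(beta1=4.0, beta2=9.0))
```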

  19. 76 FR 42768 - Capital Distribution

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-19

    ... Office of Thrift Supervision Capital Distribution AGENCY: Office of Thrift Supervision (OTS), Treasury... concerning the following information collection. Title of Proposal: Capital Distribution. OMB Number: 1550..., the information provides the OTS with a mechanism for monitoring capital distributions since...

  20. Distribution of Childrearing Demands.

    ERIC Educational Resources Information Center

    Zimmerman, Judith D.; And Others

    The tools of economic analysis were applied to demographic data in order to develop a social indicator measuring the extent of inequality in the distribution of childrearing responsibility in households from 1940 to 1980. With data drawn from the Current Population Survey of the Bureau of the Census, a "demand intensity" measure was developed.…

  1. Density distributions in nuclei

    NASA Astrophysics Data System (ADS)

    Strutinsky, V. M.; Magner, A. G.; Denisov, V. Yu.

    1985-03-01

    Density distribution across the nuclear surface is obtained in the approximation of a relatively sharp nuclear edge. It is used to determine dynamical parts of the density relevant to density vibration resonances. Results of the simple calculations are in close agreement with detailed microscopic theories.

  2. Prototyping distributed simulation networks

    NASA Technical Reports Server (NTRS)

    Doubleday, Dennis L.

    1990-01-01

    Durra is a declarative language designed to support application-level programming. The use of Durra is illustrated to describe a simple distributed application: a simulation of a collection of networked vehicle simulators. It is shown how the language is used to describe the application, its components and structure, and how the runtime executive provides for the execution of the application.

  3. Momentum distributions: An overview

    NASA Astrophysics Data System (ADS)

    Sokol, P. E.; Silver, R. N.; Clark, J. W.

    There have been several excellent reviews of momentum-distribution research in particular subject areas of physics such as electronic systems and nuclear systems. However, it is the commonality of interests, difficulties, and prospects across all of physics, along with certain pivotal advances, which led to the organization of an interdisciplinary Workshop on Momentum Distributions held at Argonne National Laboratory on 24 and 26 October 1988. The purpose of this overview is to explain why scientists with such diverse backgrounds were brought together at this meeting, to introduce and discuss the common elements of momentum-distribution studies, and to establish a common language. We hope to facilitate an appreciation of the more specialized articles which follow in these proceedings. We begin by summarizing the general properties of momentum distributions. Differences and similarities of atomic, electronic, and nuclear many-body systems are examined, in terms of characteristic lengths and energies, relative importance of exchange, and the nature of the two-particle interactions. We continue with a brief commentary on the microscopic methods used to calculate n(p) from first principles.

  4. Aerosol distribution apparatus

    DOEpatents

    Hanson, W.D.

    An apparatus for uniformly distributing an aerosol to a plurality of filters mounted in a plenum, wherein the aerosol and air are forced through a manifold system by means of a jet pump and released into the plenum through orifices in the manifold. The apparatus allows for the simultaneous aerosol-testing of all the filters in the plenum.

  5. Age Distribution of Groundwater

    NASA Astrophysics Data System (ADS)

    Morgenstern, U.; Daughney, C. J.

    2012-04-01

    Groundwater at the discharge point comprises a mixture of water from different flow lines with different travel times and therefore has no discrete age but an age distribution. The age distribution can be assessed by measuring how a pulse-shaped tracer moves through the groundwater system. Detection of the time delay and the dispersion of the peak in the groundwater compared to the tracer input reveals the mean residence time and the mixing parameter. Tritium from nuclear weapons testing in the early 1960s resulted in a peak-shaped tritium input to the whole hydrologic system on Earth. Tritium is the ideal tracer for groundwater because it is an isotope of hydrogen and therefore is part of the water molecule. Tritium time series data that encompass the passage of the bomb tritium pulse through the groundwater system in all common hydrogeologic situations in New Zealand demonstrate a semi-systematic pattern between age distribution parameters and hydrologic situation. The data in general indicate a high fraction of mixing, but in some cases also indicate high piston flow. We will show that still, 45 years after the peak of the bomb tritium, it is possible to assess accurately the parameters of age distributions by measuring the tail of the bomb tritium.
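
    A minimal sketch of the lumped-parameter convolution commonly used to interpret tritium time series of this kind: the output concentration is the historical input convolved with an assumed transit-time (age) distribution and corrected for radioactive decay. The exponential age model and the synthetic bomb-pulse input below are illustrative assumptions, not the authors' data.

```python
import math

# Sketch of a lumped-parameter model for groundwater age: output concentration is
# the tritium input convolved with a transit-time (age) distribution, with decay.
# The exponential age distribution and the synthetic input pulse are illustrative.
HALF_LIFE = 12.32                     # tritium half-life in years
LAMBDA = math.log(2.0) / HALF_LIFE

def exponential_age_dist(tau, mean_age):
    """Exponential-model transit-time distribution g(tau)."""
    return math.exp(-tau / mean_age) / mean_age

def output_concentration(year, tritium_input, mean_age, max_age=200):
    """Convolve the historical input with the age distribution, with decay."""
    total = 0.0
    for tau in range(max_age):
        c_in = tritium_input.get(year - tau, 0.0)
        total += c_in * exponential_age_dist(tau + 0.5, mean_age) * math.exp(-LAMBDA * (tau + 0.5))
    return total

if __name__ == "__main__":
    # Crude synthetic input: low natural background plus a bomb peak in the 1960s.
    tritium_input = {y: 5.0 for y in range(1900, 2021)}
    for y, peak in [(1962, 200.0), (1963, 600.0), (1964, 400.0), (1965, 250.0)]:
        tritium_input[y] = peak
    for mean_age in (5, 20, 50):
        print(mean_age, round(output_concentration(2010, tritium_input, mean_age), 2))
```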

  6. Industrial power distribution

    SciTech Connect

    Sorrells, M.A.

    1990-01-01

    This paper is a broad overview of industrial power distribution. Primary focus will be on selection of the various low voltage components to achieve the end product. Emphasis will be on the use of national standards to ensure a safe and well designed installation.

  7. Distributed fuzzy system modeling

    SciTech Connect

    Pedrycz, W.; Chi Fung Lam, P.; Rocha, A.F.

    1995-05-01

    The paper introduces and studies an idea of distributed modeling, treating it as a new paradigm of fuzzy system modeling and analysis. This form of modeling is oriented towards developing individual (local) fuzzy models for specific modeling landmarks (expressed as fuzzy sets) and determining the essential logical relationships between these local models. The models themselves are implemented in the form of logic processors regarded as specialized fuzzy neural networks. The interaction between the processors is developed in either an inhibitory or an excitatory way. More descriptively, the distributed model can be viewed as a collection of fuzzy finite state machines with their individual local first or higher order memories. It is also clarified how the concept of distributed modeling narrows down a gap between purely numerical (quantitative) models and the qualitative ones originating within the realm of Artificial Intelligence. The overall architecture of distributed modeling is discussed along with the detailed learning schemes. The results of extensive simulation experiments are provided as well. 17 refs.

  8. Distributive Education. Selling. Curriculum.

    ERIC Educational Resources Information Center

    Lankford, Dave; Comte, Don

    Nineteen lesson plans on selling are presented in this performance-based curriculum unit for distributive education. This unit is self-contained and consists of the following components: introduction (provides overview of unit content and describes why mastery of the objectives is important); performance objectives; pre-assessment instrument…

  9. Distributed analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Dewhurst, A.; Legger, F.

    2015-12-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.

  10. Multiagent distributed watershed management

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Castelletti, A.; Amigoni, F.; Cai, X.

    2012-04-01

    Deregulation and democratization of water along with increasing environmental awareness are challenging integrated water resources planning and management worldwide. The traditional centralized approach to water management, as described in much of water resources literature, is often unfeasible in most of the modern social and institutional contexts. Thus it should be reconsidered from a more realistic and distributed perspective, in order to account for the presence of multiple and often independent Decision Makers (DMs) and many conflicting stakeholders. Game theory based approaches are often used to study these situations of conflict (Madani, 2010), but they are limited to a descriptive perspective. Multiagent systems (see Wooldridge, 2009), instead, seem to be a more suitable paradigm because they naturally allow the representation of a set of self-interested agents (DMs and/or stakeholders) acting in a distributed decision process at the agent level, resulting in a promising compromise alternative between the ideal centralized solution and the actual uncoordinated practices. Casting a water management problem in a multiagent framework allows to exploit the techniques and methods that are already available in this field for solving distributed optimization problems. In particular, in Distributed Constraint Satisfaction Problems (DCSP, see Yokoo et al., 2000), each agent controls some variables according to his own utility function but has to satisfy inter-agent constraints; while in Distributed Constraint Optimization Problems (DCOP, see Modi et al., 2005), the problem is generalized by introducing a global objective function to be optimized that requires a coordination mechanism between the agents. In this work, we apply a DCSP-DCOP based approach to model a steady state hypothetical watershed management problem (Yang et al., 2009), involving several active human agents (i.e. agents who make decisions) and reactive ecological agents (i.e. agents representing
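
    As a loose illustration of distributed coordination among self-interested water users, the sketch below lets each agent maximize its own benefit given a price on water while a coordinator adjusts the price until total abstraction matches the available supply (a dual-decomposition heuristic). The benefit functions and supply are invented for illustration; this is not the DCSP/DCOP formulation used in the cited work.

```python
# Minimal sketch of distributed coordination among self-interested water users:
# each agent maximizes its own concave benefit minus a price charged per unit of
# water, while a coordinator adjusts the price until total abstraction meets the
# available supply (dual decomposition).  Benefits, supply, and step size are
# illustrative; this is not the DCSP/DCOP formulation used in the cited work.
AGENTS = {"city": 10.0, "farm": 6.0, "hydro": 4.0}   # marginal benefit at zero use
SUPPLY = 12.0

def best_response(a_coeff, price):
    """Agent maximizes a*x - 0.5*x^2 - price*x  ->  x = max(a - price, 0)."""
    return max(a_coeff - price, 0.0)

def coordinate(step=0.05, iterations=2000):
    price = 0.0
    for _ in range(iterations):
        demand = sum(best_response(a, price) for a in AGENTS.values())
        price = max(0.0, price + step * (demand - SUPPLY))   # raise price if over-subscribed
    return price, {name: best_response(a, price) for name, a in AGENTS.items()}

if __name__ == "__main__":
    price, allocation = coordinate()
    print("price:", round(price, 2),
          "allocation:", {k: round(v, 2) for k, v in allocation.items()})
```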

  11. Proximity Within Interphase Chromosome Contributes to the Breakpoint Distribution in Radiation-Induced Intrachromosomal Exchanges

    NASA Technical Reports Server (NTRS)

    Zhang, Ye; Uhlemeyer, Jimmy; Hada, Megumi; Asaithamby, A.; Chen, David J.; Wu, Honglu

    2015-01-01

    Previously, we reported that breaks involved in chromosome aberrations were clustered in several regions of chromosome 3 in human mammary epithelial cells after exposures to either low- or high-LET radiation. In particular, breaks in certain regions of the chromosome tended to rejoin with each other to form an intrachromosome exchange event. This study tests the hypothesis that proximity within a single chromosome in interphase cell nuclei contributes to the distribution of radiation-induced chromosome breaks. Chromosome 3 in G1 human mammary epithelial cells was hybridized with the multicolor banding in situ hybridization (mBAND) probes that distinguish the chromosome into six differently colored regions, and the location of these regions was measured with a laser confocal microscope. Results of the study indicated that, on a multi-mega base pair scale of the DNA, the arrangement of chromatin was non-random. Both telomere regions tended to be located towards the exterior of the chromosome domain, whereas the centromere region tended towards the interior. In addition, the interior of the chromosome domain was preferentially occupied by the p-arm of the chromatin, which is consistent with our previous finding of intrachromosome exchanges involving breaks on the p-arm and in the centromere region of chromosome 3. Other factors, such as the fragile sites in the 3p21 band and gene regulation, may also contribute to the breakpoint distribution in radiation-induced chromosome aberrations. Further investigations suggest that the 3D chromosome folding is cell type and culture condition dependent.

  12. Spatial Distribution of Reef Fish Species along the Southeast US Atlantic Coast Inferred from Underwater Video Survey Data

    PubMed Central

    Bacheler, Nathan M.; Schobernd, Zebulon H.; Berrane, David J.; Schobernd, Christina M.; Mitchell, Warren A.; Teer, Bradford Z.; Gregalis, Kevan C.; Glasgow, Dawn M.

    2016-01-01

    Marine fish abundance and distribution often varies across spatial scales for a variety of reasons, and this variability has significant ecological and management consequences. We quantified the distribution of reef-associated fish species along the southeast United States Atlantic coast using underwater video survey samples (N = 4,855 in 2011–2014) to elucidate variability within species across space, depths, and habitats, as well as describe broad-scale patterns in species richness. Thirty-two species were seen at least 10 times on video, and the most commonly observed species were red porgy (Pagrus pagrus; 41.4% of videos), gray triggerfish (Balistes capriscus; 31.0%), black sea bass (Centropristis striata; 29.1%), vermilion snapper (Rhomboplites aurorubens; 27.7%), and red snapper (Lutjanus campechanus; 22.6%). Using generalized additive models, we found that most species were non-randomly distributed across space, depths, and habitats. Most rare species were observed along the continental shelf break, except for goliath grouper (Epinephelus itajara), which was found on the continental shelf in Florida and Georgia. We also observed higher numbers of species in shelf-break habitats from southern North Carolina to Georgia, and fewer in shallower water and at the northern and southern ends of the southeast United States Atlantic coast. Our study provides the first broad-scale description of the spatial distribution of reef fish in the region to be based on fishery-independent data, reinforces the utility of underwater video to survey reef fish, and can help improve the management of reef fish in the SEUS, for example, by improving indices of abundance. PMID:27655268

  13. Bounding species distribution models

    USGS Publications Warehouse

    Stohlgren, T.J.; Jarnevich, C.S.; Esaias, W.E.; Morisette, J.T.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used. © 2011 Current Zoology.

  14. Bounding Species Distribution Models

    NASA Technical Reports Server (NTRS)

    Stohlgren, Thomas J.; Jarnevich, Catherine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].

  15. Distributed Visualization Project

    NASA Technical Reports Server (NTRS)

    Craig, Douglas; Conroy, Michael; Kickbusch, Tracey; Mazone, Rebecca

    2016-01-01

    Distributed Visualization allows anyone, anywhere to see any simulation at any time. Development focuses on algorithms, software, data formats, data systems and processes to enable sharing simulation-based information across temporal and spatial boundaries without requiring stakeholders to possess highly-specialized and very expensive display systems. It also introduces abstraction between the native and shared data, which allows teams to share results without giving away proprietary or sensitive data. The initial implementation of this capability is the Distributed Observer Network (DON) version 3.1. DON 3.1 is available for public release in the NASA Software Store (https://software.nasa.gov/software/KSC-13775) and works with version 3.0 of the Model Process Control specification (an XML Simulation Data Representation and Communication Language) to display complex graphical information and associated Meta-Data.

  16. Radio frequency distribution assembly

    NASA Astrophysics Data System (ADS)

    Culley, K. M.

    The Naval Research Laboratory (NRL) Radio Frequency Distribution Assembly (RFDA) is an interface between the Sperry four-channel, fast-switching synthesizer and the EF-111 jamming system antenna ports. The RFDA is a sophisticated, high-speed RF interface designed to convert the banded outputs of the four-channel synthesizer (16 ports) to 36 ports which represent six ordinal directions of arrival (DOA) for the EF-111 jamming system. The RFDA will distribute the RF signals while providing controlled RF amplitudes to simulate the antenna patterns of the EF-111 Electronic Warfare (EW) system. The simulation of the arrival angles which appear between the ordinal directions is performed by controlling the amplitude of the RF signal from the DOA channels. The RFDA is capable of operating over the frequency range of 500 MHz to 18 GHz, and can rapidly switch between varying frequencies and attenuation levels.

  17. Distributed environmental control

    NASA Technical Reports Server (NTRS)

    Cleveland, Gary A.

    1992-01-01

    We present an architecture of distributed, independent control agents designed to work with the Computer Aided System Engineering and Analysis (CASE/A) simulation tool. CASE/A simulates behavior of Environmental Control and Life Support Systems (ECLSS). We describe a lattice of agents capable of distributed sensing and overcoming certain sensor and effector failures. We address how the architecture can achieve the coordinating functions of a hierarchical command structure while maintaining the robustness and flexibility of independent agents. These agents work between the time steps of the CASE/A simulation tool to arrive at command decisions based on the state variables maintained by CASE/A. Control is evaluated according to both effectiveness (e.g., how well temperature was maintained) and resource utilization (the amount of power and materials used).

  18. Properly Understanding the Impacts of Distributed Resources on Distribution Systems

    SciTech Connect

    Rizy, D Tom; Li, Fangxing; Li, Huijuan; Adhikari, Sarina; Kueck, John D

    2010-01-01

    The subject paper discusses important impacts of distributed resources on distribution networks and feeders. These include capacity, line losses, voltage regulation, and central system support (such as volt/var via central generators and substation) as the number, placement and penetration levels of distributed resources are varied. Typically, the impacts of distributed resources on the distribution system are studied by using steady-state rather than dynamic analysis tools. However, the response time and transient impacts of both system equipment (such as substation/feeder capacitors) and distributed resources need to be taken into account, and only dynamic analysis will provide the full impact results. ORNL is wrapping up a study of distributed resources interconnected to a large distribution system considering the above variables. A report of the study and its results will be condensed into a paper for this panel session. The impact of distributed resources will vary as the penetration level reaches the capacity of the distribution feeder/system. The question is how high a penetration of distributed resources can be accommodated on the distribution feeder/system without any major changes to system operation, design and protection. The impacts most surely will vary depending upon load composition, distribution and level. Also, it is expected that various placements of distributed resources will impact the distribution system differently.

  19. Fiber distributed feedback laser

    NASA Technical Reports Server (NTRS)

    Elachi, C.; Evans, G. A.; Yeh, C. (Inventor)

    1976-01-01

    Utilizing round optical fibers as communication channels in optical communication networks presents the problem of obtaining a high efficiency coupling between the optical fiber and the laser. A laser is made an integral part of the optical fiber channel by either diffusing active material into the optical fiber or surrounding the optical fiber with the active material. Oscillation within the active medium to produce lasing action is established by grating the optical fiber so that distributed feedback occurs.

  20. Symmetric generalized binomial distributions

    SciTech Connect

    Bergeron, H.; Curado, E. M. F.; Gazeau, J. P.; Rodrigues, Ligia M. C. S.

    2013-12-15

    In two recent articles, we have examined a generalization of the binomial distribution associated with a sequence of positive numbers, involving asymmetric expressions of probabilities that break the win-loss symmetry. We present in this article another generalization (always associated with a sequence of positive numbers) that preserves the win-loss symmetry. This approach is also based on generating functions and presents non-negativity constraints similar to those encountered in our previous articles.

  1. Structure functions and parton distributions

    SciTech Connect

    Olness, F.; Tung, Wu-Ki

    1991-04-01

    Activities of the structure functions and parton distributions group are summarized. The impact of scheme-dependence of parton distributions (especially sea-quarks and gluons) on the quantitative formulation of the QCD parton model is highlighted. Recent progress on the global analysis of parton distributions is summarized. Issues concerning the proper use of next-to-leading-order parton distributions are stressed.

  2. Distributed Education, Expertise, and Cognition.

    ERIC Educational Resources Information Center

    Saba, Farhad

    2000-01-01

    Discusses the term distributed education and its relationship to distance education. Topics include distributed training on corporate intranets; distributed expertise, which refers to the use of mediated communication for education and training by experts in different locations; and distributed, or situated, cognition, which would include…

  3. A distributable APSE

    NASA Technical Reports Server (NTRS)

    Taft, Tucker S.

    1986-01-01

    A distributed Ada program library is a key element in a distributed Ada Program Support Environment (APSE). To implement this successfully, the program library universe as defined by the Ada Reference Manual must be broken up into independently manageable pieces. This in turn requires the support of a distributed database system, as well as a mechanism for identifying compilation units, linkable subprograms, and Ada types in a decentralized way, to avoid falling victim to the bottlenecks of a global database and/or global unique-identifier manager. It was found that the ability to decentralize Ada program library activity is a major advantage in the management of large Ada programs. Currently, there are 18 resource-catalog revision sets, each in its own Host Interface (HIF) partition, plus 18 partitions for testing each of these, plus 11 partitions for the top-level compiler/linker/program library manager components. Compiling and other development work can proceed in parallel in each of these partitions, without suffering the performance bottlenecks of global locks or global unique-identifier generation.

  4. The Distributed Auditory Cortex

    PubMed Central

    Winer, Jeffery A.; Lee, Charles C.

    2009-01-01

    A synthesis of cat auditory cortex (AC) organization is presented in which the extrinsic and intrinsic connections interact to derive a unified profile of the auditory stream and use it to direct and modify cortical and subcortical information flow. Thus, the thalamocortical input provides essential sensory information about peripheral stimulus events, which AC redirects locally for feature extraction, and then conveys to parallel auditory, multisensory, premotor, limbic, and cognitive centers for further analysis. The corticofugal output influences areas as remote as the pons and the cochlear nucleus, structures whose effects upon AC are entirely indirect, and has diverse roles in the transmission of information through the medial geniculate body and inferior colliculus. The distributed AC is thus construed as a functional network in which the auditory percept is assembled for subsequent redistribution in sensory, premotor, and cognitive streams contingent on the derived interpretation of the acoustic events. The confluence of auditory and multisensory streams likely precedes cognitive processing of sound. The distributed AC constitutes the largest and arguably the most complete representation of the auditory world. Many facets of this scheme may apply in rodent and primate AC as well. We propose that the distributed auditory cortex contributes to local processing regimes in regions as disparate as the frontal pole and the cochlear nucleus to construct the acoustic percept. PMID:17329049

  5. Periodicity in the spatial-temporal earthquake distributions for the Pacific region: observation and modeling.

    NASA Astrophysics Data System (ADS)

    Sasorova, Elena; Levin, Boris

    2014-05-01

    Over the course of the last century a cyclic increase and decrease of the Earth's seismic activity (SA) was observed. The variations of the SA for events with M>=7.0 from 1900 to the present were studied. Two subsets of the worldwide NEIC (USGS) catalog were used: USGS/NEIC from 1973 to 2012 and the catalog of significant worldwide earthquakes (2150 B.C. - 1994 A.D.) compiled by USGS/NEIC from the NOAA agency. Preliminary standardization of magnitudes and elimination of aftershocks from the list of events was performed. The entire period of observations was subdivided into 5-year intervals. The temporal distributions of earthquake (EQ) density and released energy density were calculated separately for the Southern Hemisphere (SH), for the Northern Hemisphere (NH), and for eighteen latitudinal belts: 90°-80°N, 80°-70°N, 70°-60°N, 60°-50°N, and so on (each belt spanning 10°). The periods of the SA were compared for the different latitudinal belts of the Earth. The peaks and decays of seismicity do not coincide in time for different latitudinal belts, especially between belts located in the NH and the SH. Peaks and decays of the SA for events with M>=8 were evident in the temporal EQ distributions for all studied latitudinal belts. The two-dimensional distributions (over latitude and time) of EQ density and released energy density showed that the periods of amplification of the SA are approximately 30-35 years. Next, we checked for a non-random component in EQ occurrence between the NH and the SH. All events were placed on the time axis according to their origin time. We treated the set of EQs in the catalog as a sequence of events in which each event has only one of two possible outcomes (occurrence in the NH or in the SH). A nonparametric runs test was used to test the hypothesis that a nonrandom component exists in the examined sequence of
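
    A minimal sketch of the nonparametric (Wald-Wolfowitz) runs test mentioned above, applied to a binary sequence of hemisphere labels ordered by origin time. The sequence shown is synthetic, and the normal approximation is used for the p-value, which is adequate for long catalogs.

```python
import math

# Sketch of the Wald-Wolfowitz runs test applied to a binary sequence of
# hemisphere labels ('N'/'S') ordered by origin time.  The example sequence is
# synthetic; the z statistic uses the usual normal approximation.
def runs_test(sequence):
    n1 = sequence.count("N")
    n2 = sequence.count("S")
    runs = 1 + sum(1 for a, b in zip(sequence, sequence[1:]) if a != b)
    n = n1 + n2
    mu = 2.0 * n1 * n2 / n + 1.0
    var = 2.0 * n1 * n2 * (2.0 * n1 * n2 - n) / (n * n * (n - 1.0))
    z = (runs - mu) / math.sqrt(var)
    p = math.erfc(abs(z) / math.sqrt(2.0))        # two-sided p-value
    return runs, z, p

if __name__ == "__main__":
    catalog = list("NNSSNNNSSNSNNSSSNNSNNSSNNNSSSN")    # synthetic hemisphere labels
    runs, z, p = runs_test(catalog)
    print(f"runs={runs}  z={z:.2f}  p={p:.3f}")
```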

  6. GASIFICATION FOR DISTRIBUTED GENERATION

    SciTech Connect

    Ronald C. Timpe; Michael D. Mann; Darren D. Schmidt

    2000-05-01

    A recent emphasis in gasification technology development has been directed toward reduced-scale gasifier systems for distributed generation at remote sites. The domestic distributed power generation market over the next decade is expected to be 5-6 gigawatts per year. The global increase is expected at 20 gigawatts over the next decade. The economics of gasification for distributed power generation are significantly improved when fuel transport is minimized. Until recently, gasification technology has been synonymous with coal conversion. Presently, however, interest centers on providing clean-burning fuel to remote sites that are not necessarily near coal supplies but have sufficient alternative carbonaceous material to feed a small gasifier. Gasifiers up to 50 MW are of current interest, with emphasis on those of 5-MW generating capacity. Internal combustion engines offer a more robust system for utilizing the fuel gas, while fuel cells and microturbines offer higher electric conversion efficiencies. The initial focus of this multiyear effort was on internal combustion engines and microturbines as more realistic near-term options for distributed generation. In this project, we studied emerging gasification technologies that can provide gas from regionally available feedstock as fuel to power generators under 30 MW in a distributed generation setting. Larger-scale gasification, primarily coal-fed, has been used commercially for more than 50 years to produce clean synthesis gas for the refining, chemical, and power industries. Commercial-scale gasification activities are under way at 113 sites in 22 countries in North and South America, Europe, Asia, Africa, and Australia, according to the Gasification Technologies Council. Gasification studies were carried out on alfalfa, black liquor (a high-sodium waste from the pulp industry), cow manure, and willow on the laboratory scale and on alfalfa, black liquor, and willow on the bench scale. Initial parametric tests

  7. Planning Systems for Distributed Operations

    NASA Technical Reports Server (NTRS)

    Maxwell, Theresa G.

    2002-01-01

    This viewgraph presentation provides an overview of the mission planning process for distributed operations (such as the International Space Station (ISS)) and the computer hardware and software systems needed to support such an effort. Topics considered include: the evolution of distributed planning systems, ISS distributed planning, the Payload Planning System (PPS), future developments in distributed planning systems, the Request Oriented Scheduling Engine (ROSE), and Next Generation distributed planning systems.

  8. Factors affecting distributed system security

    SciTech Connect

    Nessett, D.M.

    1985-11-13

    Recent work examining distributed system security requirements is critiqued. A notion of trust based on distributed system topology and distributed system node evaluation levels proposed in that work is shown to be deficient. The notion fails to make allowances for the distributed system physical security environment, security factors related to the management of distributed systems by more than one jurisdictive authority and interactions that can occur between nodes supporting different mandatory and discretionary security mechanisms.

  9. Representation of orientation distributions

    SciTech Connect

    Wenk, H.R.; Kocks, U.F.

    1985-01-01

    This paper illustrates the principles presented with a particular experimental texture: from the surface layer of a copper polycrystal cold-rolled to 60% reduction in thickness. Four incomplete pole figures (200, 220, 222, and 113) were determined by x-ray diffraction in reflection geometry. The measured pole figures nearly exhibited orthorhombic symmetry (as expected), which was then strictly enforced by averaging the four quadrants of the pole figure. The orientation distribution function was obtained using the expansion in spherical harmonics (with only even-order coefficients up to l = 18).

  10. Distributed Optimization System

    DOEpatents

    Hurtado, John E.; Dohrmann, Clark R.; Robinett, III, Rush D.

    2004-11-30

    A search system and method for controlling multiple agents to optimize an objective using distributed sensing and cooperative control. The search agent can be one or more physical agents, such as a robot, and can be software agents for searching cyberspace. The objective can be: chemical sources, temperature sources, radiation sources, light sources, evaders, trespassers, explosive sources, time dependent sources, time independent sources, function surfaces, maximization points, minimization points, and optimal control of a system such as a communication system, an economy, a crane, and a multi-processor computer.
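
    As a hedged illustration of multiple agents cooperatively searching for a maximization point with only local sensing, the sketch below has each agent sample an unknown scalar field at its own position and drift toward the best sample found so far, with some random exploration. The field, step sizes, and agent count are invented for illustration; this is not the patented control method.

```python
import math
import random

# Minimal sketch of multiple agents cooperatively searching for the maximum of a
# scalar field (e.g. a chemical plume): each agent samples the field at its own
# location, and all agents drift toward the best sample found so far while adding
# a little random exploration.  The field and parameters are illustrative only.
def field(x, y):
    """Unknown source located at (3, -2); agents can only sample it pointwise."""
    return math.exp(-((x - 3.0) ** 2 + (y + 2.0) ** 2) / 4.0)

def search(n_agents=8, steps=400, seed=0):
    rng = random.Random(seed)
    agents = [(rng.uniform(-10, 10), rng.uniform(-10, 10)) for _ in range(n_agents)]
    best_pos, best_val = agents[0], field(*agents[0])
    for _ in range(steps):
        for i, (x, y) in enumerate(agents):
            val = field(x, y)
            if val > best_val:
                best_val, best_pos = val, (x, y)
            # Drift toward the group's best sample, plus random exploration.
            x += 0.1 * (best_pos[0] - x) + rng.gauss(0.0, 0.2)
            y += 0.1 * (best_pos[1] - y) + rng.gauss(0.0, 0.2)
            agents[i] = (x, y)
    return best_pos, best_val

if __name__ == "__main__":
    pos, val = search()
    print(f"best position found: ({pos[0]:.2f}, {pos[1]:.2f}), field value {val:.3f}")
```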

  11. Distributed computing systems programme

    SciTech Connect

    Duce, D.

    1984-01-01

    Publication of this volume coincides with the completion of the U.K. Science and Engineering Research Council's coordinated programme of research in Distributed Computing Systems (DCS) which ran from 1977 to 1984. The volume is based on presentations made at the programme's final conference. The first chapter explains the origins and history of DCS and gives an overview of the programme and its achievements. The remaining sixteen chapters review particular research themes (including imperative and declarative languages, and performance modelling), and describe particular research projects in technical areas including local area networks, design, development and analysis of concurrent systems, parallel algorithm design, functional programming and non-von Neumann computer architectures.

  12. Coping with distributed computing

    SciTech Connect

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  13. Vascular Distribution of Nanomaterials

    PubMed Central

    Stapleton, Phoebe A.; Nurkiewicz, Timothy R.

    2014-01-01

    Once considered primarily occupational, novel nanotechnology innovation and application has led to widespread domestic use and intentional biomedical exposures. With these exciting advances, the breadth and depth of toxicological considerations must also be expanded. The vascular system interacts with every tissue in the body, striving for homeostasis. Engineered nanomaterials (ENM) have been reported to distribute in many different organs and tissues. However, these observations have tended to use approaches requiring tissue homogenization and/or gross organ analyses. These techniques, while effective in establishing presence, preclude an exact determination of where ENM are deposited within a tissue. It is necessary to identify this exact distribution and deposition of ENM throughout the cardiovascular system, with vascular hemodynamics and in vivo/in vitro ENM modifications taken into account, if nanotechnology is to achieve its full potential. Distinct levels of the vasculature will first be described as individual compartments. Then the vasculature will be considered as a whole. These unique compartments and biophysical conditions will be discussed in terms of their propensity to favor ENM deposition. Understanding levels of the vasculature will also be discussed. Ultimately, future studies must verify the mechanisms speculated on and presented herein. PMID:24777845

  14. Models of distributive justice.

    PubMed

    Wolff, Jonathan

    2007-01-01

    Philosophical disagreement about justice rages over at least two questions. The most immediate is a substantial question, concerning the conditions under which particular distributive arrangements can be said to be just or unjust. The second, deeper, question concerns the nature of justice itself. What is justice? Here we can distinguish three views. First, justice as mutual advantage sees justice as essentially a matter of the outcome of a bargain. There are times when two parties can both be better off by making some sort of agreement. Justice, on this view, concerns the distribution of the benefits and burdens of the agreement. Second, justice as reciprocity takes a different approach, looking not at bargaining but at the idea of a fair return or just price, attempting to capture the idea of justice as equal exchange. Finally justice as impartiality sees justice as 'taking the other person's point of view' asking 'how would you like it if it happened to you?' Each model has significantly different consequences for the question of when issues of justice arise and how they should be settled. It is interesting to consider whether any of these models of justice could regulate behaviour between non-human animals.

  15. Distributed Operations Planning

    NASA Technical Reports Server (NTRS)

    Fox, Jason; Norris, Jeffrey; Powell, Mark; Rabe, Kenneth; Shams, Khawaja

    2007-01-01

    Maestro software provides a secure and distributed mission planning system for long-term missions in general, and the Mars Exploration Rover Mission (MER) specifically. Maestro, the successor to the Science Activity Planner, has a heavy emphasis on portability and distributed operations, and requires no data replication or expensive hardware, instead relying on a set of services functioning on JPL institutional servers. Maestro works on most current computers with network connections, including laptops. When browsing down-link data from a spacecraft, Maestro functions similarly to being on a Web browser. After authenticating the user, it connects to a database server to query an index of data products. It then contacts a Web server to download and display the actual data products. The software also includes collaboration support based upon a highly reliable messaging system. Modifications made to targets in one instance are quickly and securely transmitted to other instances of Maestro. The back end that has been developed for Maestro could benefit many future missions by reducing the cost of centralized operations system architecture.

  16. PULSE AMPLITUDE DISTRIBUTION RECORDER

    DOEpatents

    Cowper, G.

    1958-08-12

    A device is described for automatically recording pulse amplitude distribution received from a counter. The novelty of the device consists of the over-all arrangement of conventional circuit elements to provide an easy to read permanent record of the pulse amplitude distribution during a certain time period. In the device a pulse analyzer separates the pulses according to amplitude into several channels. A scaler in each channel counts the pulses and operates a pen marker positioned over a drivable recorder sheet. Since the scalers in each channel have the same capacity, the control circuitry permits counting of the incoming pulses until one scaler reaches capacity, whereupon the input is removed and an internal oscillator supplies the necessary pulses to fill up the other scalers. Movement of the chart sheet is initiated when the first scaler reaches capacity to thereby give a series of marks at spacings proportional to the time required to fill the remaining scalers, and accessory equipment marks calibration points on the recorder sheet to facilitate direct reading of the number of external pulses supplied to each scaler.

  17. Automated Gas Distribution System

    NASA Astrophysics Data System (ADS)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few and prized cyclotrons in the country. Behind the scenes of the cyclotron is a confusing and dangerous setup of ion sources that supplies the cyclotron with particles for acceleration. Using this machine involves a time-consuming and even wasteful step-by-step process of switching gases, purging, and other important operations that must be performed manually to keep the system functioning properly while maintaining the safety of the working environment. Developing a new gas distribution system for the ion source prevents many of the problems generated by the older manual setup process. The developed system can be controlled manually more easily than before, but, like most of the technology and machines in the cyclotron now, is operated mainly through software developed in the graphical coding environment LabVIEW. The automated gas distribution system provides multiple ports for a selection of different gases, to decrease the amount of gas wasted when switching gases, and a port for the vacuum, to decrease the amount of time spent purging the manifold. The LabVIEW software makes the operation of the cyclotron and ion sources easier and safer for anyone to use.

  18. Distributed System Design Checklist

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin

    2014-01-01

    This report describes a design checklist targeted to fault-tolerant distributed electronic systems. Many of the questions and discussions in this checklist may be generally applicable to the development of any safety-critical system. However, the primary focus of this report covers the issues relating to distributed electronic system design. The questions that comprise this design checklist were created with the intent to stimulate system designers' thought processes in a way that hopefully helps them to establish a broader perspective from which they can assess the system's dependability and fault-tolerance mechanisms. While best effort was expended to make this checklist as comprehensive as possible, it is not (and cannot be) complete. Instead, we expect that this list of questions and the associated rationale for the questions will continue to evolve as lessons are learned and further knowledge is established. In this regard, it is our intent to post the questions of this checklist on a suitable public web-forum, such as the NASA DASHLink AFCS repository. From there, we hope that it can be updated, extended, and maintained after our initial research has been completed.

  19. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours were essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  20. Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Bodden, Lee; Pease, Phil; Bedet, Jean-Jacques; Rosen, Wayne

    1993-01-01

    The Goddard Space Flight Center Version 0 Distributed Active Archive Center (GSFC V0 DAAC) is being developed to enhance and improve scientific research and productivity by consolidating access to remote sensor earth science data in the pre-EOS time frame. In cooperation with scientists from the science labs at GSFC, other NASA facilities, universities, and other government agencies, the DAAC will support data acquisition, validation, archive and distribution. The DAAC is being developed in response to EOSDIS Project Functional Requirements as well as from requirements originating from individual science projects such as SeaWiFS, Meteor3/TOMS2, AVHRR Pathfinder, TOVS Pathfinder, and UARS. The GSFC V0 DAAC has begun operational support for the AVHRR Pathfinder (as of April, 1993), TOVS Pathfinder (as of July, 1993) and the UARS (September, 1993) Projects, and is preparing to provide operational support for SeaWiFS (August, 1994) data. The GSFC V0 DAAC has also incorporated the existing data, services, and functionality of the DAAC/Climate, DAAC/Land, and the Coastal Zone Color Scanner (CZCS) Systems.

  1. Distribution and moments of radial error. [Rayleigh distribution - random variables

    NASA Technical Reports Server (NTRS)

    White, R. G.

    1975-01-01

    An investigation of the moments and probability distribution of the resultant of two normally distributed random variables is presented. This is the so-called generalized Rayleigh distribution which has many applications in the study of wind shear, random noise, and radar. The most general formula was derived, and two special cases were considered for which tables of the moments and probability distribution functions are included as an appendix. One of the special cases was generalized to n-dimensions.
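    As a quick illustration of the special case underlying this family of distributions, the sketch below (a minimal example with assumed parameters, not taken from the report) draws the resultant of two independent zero-mean normal components and checks its sample moments against the classical Rayleigh formulas.

```python
# Minimal sketch (not the report's general derivation): for two independent
# zero-mean normal components with common standard deviation sigma, the
# resultant R = sqrt(X^2 + Y^2) follows the classical Rayleigh distribution.
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0.0, sigma, 100_000)
y = rng.normal(0.0, sigma, 100_000)
r = np.hypot(x, y)

# Theoretical Rayleigh moments for the zero-mean, equal-variance case.
mean_theory = sigma * np.sqrt(np.pi / 2)
var_theory = (2 - np.pi / 2) * sigma**2

print(f"sample mean {r.mean():.3f} vs theory {mean_theory:.3f}")
print(f"sample var  {r.var():.3f} vs theory {var_theory:.3f}")
```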

  2. Distributed Wind Energy in Idaho

    SciTech Connect

    Gardner, John; Johnson, Kathryn; Haynes, Todd; Seifert, Gary

    2009-01-31

    This project is a research and development program aimed at furthering distributed wind technology. In particular, this project addresses some of the barriers to distributed wind energy utilization in Idaho.

  3. Distributed charging of electrical assets

    DOEpatents

    Ghosh, Soumyadip; Phan, Dung; Sharma, Mayank; Wu, Chai Wah; Xiong, Jinjun

    2016-02-16

    The present disclosure relates generally to the field of distributed charging of electrical assets. In various examples, distributed charging of electrical assets may be implemented in the form of systems, methods and/or algorithms.

  4. Distribution of autumn-staging Lesser Snow Geese on the northeast coastal plain of Alaska [Distribution of Chen caerulescens during its autumn staging]

    USGS Publications Warehouse

    Robertson, Donna G.; Brackney, Alan W.; Spindler, Michael A.; Hupp, Jerry W.

    1997-01-01

    We conducted aerial surveys of Lesser Snow Geese (Chen caerulescens caerulescens) during autumn staging on the coastal plain of the Arctic National Wildlife Refuge (ANWR) in northeast Alaska from late August through September, 1982 - 1993. We evaluated numbers and distribution of Snow Geese that staged on the ANWR, compared abundance of birds among 5 x 5-km cells used frequently (5 - 8 yr), periodically (3 - 4 yr), or infrequently (1 - 2 yr), and examined distribution changes within years. Maximum numbers of Snow Geese observed annually were highly variable (range 12,828 - 309,225). Snow Goose flocks occurred across 605,000 ha of the coastal plain, but used some areas more frequently than others. Frequently used cells (38 of 363 cells in the study area) were non-randomly distributed and primarily occurred on the central coastal plain between the wet coastal and steep foothills regions. Abundance of geese was greatest in frequently used, intermediate in periodically used, and lowest in infrequently used cells. Within years, Snow Goose numbers and flock locations varied between surveys, possibly because geese moved to different foraging areas during staging. The widespread distribution and annual variability in numbers of Snow Geese on the coastal plain was likely because birds used foraging habitats that were spatially and temporally heterogeneous. The ANWR coastal plain is an important component of the fall-staging area used by Snow Geese that nest in the western Canadian Arctic. Management decisions that affect the region should reflect its value to migrating Snow Geese.

  5. Distributed feedback lasers

    NASA Technical Reports Server (NTRS)

    Ladany, I.; Andrews, J. T.; Evans, G. A.

    1988-01-01

    A ridge waveguide distributed feedback laser was developed in InGaAsP. These devices have demonstrated CW output powers over 7 mW with threshold currents as low as 60 mA at 25 C. Measurements of the frequency response of these devices show a 3 dB bandwidth of about 2 GHz, which may be limited by the mount. The best devices show a single-mode spectrum over the entire temperature range tested, with side-mode suppression of about 20 dB in both CW and pulsed modes. The design of this device, including detailed modeling of the ridge guide structure, effective index calculations, and a discussion of the grating configuration, is presented. Also, the fabrication of the devices is presented in some detail, especially the fabrication of and subsequent growth over the grating. In addition, a high-frequency fiber-pigtailed package was designed and tested, which is a suitable prototype for a commercial package.

  6. DISTRIBUTED AMPLIFIER INCORPORATING FEEDBACK

    DOEpatents

    Bell, P.R. Jr.

    1958-10-21

    An improved distributed amplifier system employing feedback for stabilization is presented. In accordance with the disclosed invention, a signal to be amplified is applied to one end of a suitably terminated grid transmission line. At intervals along the transmission line, the signal is fed to stable, resistance-capacitance coupled amplifiers incorporating feedback loops therein. The output current from each amplifier is passed through an additional tube to minimize the electrostatic capacitance between the tube elements of the last stage of the amplifier, and fed to appropriate points on an output transmission line, similar to the grid line, but terminated at the opposite (input) end. The output taken from the unterminated end of the plate transmission line is proportional to the input voltage impressed upon the grid line.

  7. Nuclear Parton Distribution Functions

    SciTech Connect

    I. Schienbein, J.Y. Yu, C. Keppel, J.G. Morfin, F. Olness, J.F. Owens

    2009-06-01

    We study nuclear effects of charged current deep inelastic neutrino-iron scattering in the framework of a χ² analysis of parton distribution functions (PDFs). We extract a set of iron PDFs which are used to compute x_Bj-dependent and Q²-dependent nuclear correction factors for iron structure functions which are required in global analyses of free nucleon PDFs. We compare our results with nuclear correction factors from neutrino-nucleus scattering models and correction factors for charged-lepton-iron scattering. We find that, except for very high x_Bj, our correction factors differ in both shape and magnitude from the correction factors of the models and charged-lepton scattering.

  8. Carotenoid Distribution in Nature.

    PubMed

    Alcaíno, Jennifer; Baeza, Marcelo; Cifuentes, Víctor

    2016-01-01

    Carotenoids are naturally occurring red, orange and yellow pigments that are synthesized by plants and some microorganisms and fulfill many important physiological functions. This chapter describes the distribution of carotenoids in microorganisms, including bacteria, archaea, microalgae, filamentous fungi and yeasts. We will also focus on their functional aspects and applications, such as their nutritional value, their benefits for human and animal health and their potential protection against free radicals. The central metabolic pathway leading to the synthesis of carotenoids is described in the following three principal steps: (i) the synthesis of isopentenyl pyrophosphate and the formation of dimethylallyl pyrophosphate, (ii) the synthesis of geranylgeranyl pyrophosphate and (iii) the synthesis of carotenoids per se, highlighting the differences that have been found in several carotenogenic organisms and providing an evolutionary perspective. Finally, as an example, the synthesis of the xanthophyll astaxanthin is discussed. PMID:27485217

  9. Protocols for distributive scheduling

    NASA Technical Reports Server (NTRS)

    Richards, Stephen F.; Fox, Barry

    1993-01-01

    The increasing complexity of space operations and the inclusion of interorganizational and international groups in the planning and control of space missions lead to requirements for greater communication, coordination, and cooperation among mission schedulers. These schedulers must jointly allocate scarce shared resources among the various operational and mission oriented activities while adhering to all constraints. This scheduling environment is complicated by such factors as the presence of varying perspectives and conflicting objectives among the schedulers, the need for different schedulers to work in parallel, and limited communication among schedulers. Smooth interaction among schedulers requires the use of protocols that govern such issues as resource sharing, authority to update the schedule, and communication of updates. This paper addresses the development and characteristics of such protocols and their use in a distributed scheduling environment that incorporates computer-aided scheduling tools. An example problem is drawn from the domain of space shuttle mission planning.

  10. Distributed road assessment system

    DOEpatents

    Beer, N. Reginald; Paglieroni, David W

    2014-03-25

    A system that detects damage on or below the surface of a paved structure or pavement is provided. A distributed road assessment system includes road assessment pods and a road assessment server. Each road assessment pod includes a ground-penetrating radar antenna array and a detection system that detects road damage from the return signals as the vehicle on which the pod is mounted travels down a road. Each road assessment pod transmits to the road assessment server occurrence information describing each occurrence of road damage that is newly detected on a current scan of a road. The road assessment server maintains a road damage database of occurrence information describing the previously detected occurrences of road damage. After the road assessment server receives occurrence information for newly detected occurrences of road damage for a portion of a road, the road assessment server determines which newly detected occurrences correspond to which previously detected occurrences of road damage.

  11. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). The data display module, in communication with the database server, includes a website for viewing collected process data in a desired metrics form and also provides for editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.

  12. Spatially Distributed Cell Signalling

    PubMed Central

    Kholodenko, Boris N.

    2009-01-01

    Emerging evidence indicates that complex spatial gradients and (micro)domains of signalling activities arise from distinct cellular localization of opposing enzymes, such as a kinase and phosphatase, in signal transduction cascades. Often, an interacting, active form of a target protein has a lower diffusivity than an inactive form, and this leads to spatial gradients of the protein abundance in the cytoplasm. A spatially distributed signalling cascade can create step-like activation profiles, which decay at successive distances from the cell surface, assigning digital positional information to different regions in the cell. Feedback and feedforward network motifs control activity patterns, allowing signalling networks to serve as cellular devices for spatial computations. PMID:19800332
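    For a rough sense of the scale of such gradients, the short calculation below evaluates the textbook steady-state decay length L = sqrt(D/k) for a protein phosphorylated at the membrane and dephosphorylated uniformly in the cytosol; the diffusion coefficient and phosphatase rate are assumed values chosen purely for illustration, not figures from the review.

```python
# Illustrative calculation (assumed parameters, not from the review): the
# steady-state profile of membrane-activated protein decays as exp(-x / L)
# with L = sqrt(D / k), the solution of D * p'' = k * p on a half line.
import math

D = 10.0   # cytosolic diffusion coefficient, um^2/s (assumed)
k = 1.0    # first-order phosphatase rate, 1/s (assumed)
L = math.sqrt(D / k)
print(f"decay length L = {L:.2f} um")

for x in [0, 2, 5, 10, 20]:   # distance from the membrane, um
    print(f"x = {x:5.1f} um   relative activity {math.exp(-x / L):.3f}")
```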

  13. Sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    Sparse distributed memory was proposed by Pentti Kanerva as a realizable architecture that could store large patterns and retrieve them based on partial matches with patterns representing current sensory inputs. This memory exhibits behaviors, both in theory and in experiment, that resemble those previously unapproached by machines - e.g., rapid recognition of faces or odors, discovery of new connections between seemingly unrelated ideas, continuation of a sequence of events when given a cue from the middle, knowing that one doesn't know, or getting stuck with an answer on the tip of one's tongue. These behaviors are now within reach of machines that can be incorporated into the computing systems of robots capable of seeing, talking, and manipulating. Kanerva's theory is a break with the Western rationalistic tradition, allowing a new interpretation of learning and cognition that respects biology and the mysteries of individual human beings.
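    The sketch below is a toy, assumption-laden rendering of the general read/write idea (random binary hard locations, counter updates within a Hamming radius); it is not Kanerva's full architecture, and all sizes are arbitrary.

```python
# Toy sketch of a Kanerva-style sparse distributed memory (illustrative only):
# binary addresses, randomly placed hard locations, counter updates within a
# fixed Hamming radius, and majority-vote readout.
import numpy as np

rng = np.random.default_rng(1)
n, n_hard, radius = 256, 2000, 112            # dimension, hard locations, activation radius

hard_addrs = rng.integers(0, 2, size=(n_hard, n))   # fixed random addresses
counters = np.zeros((n_hard, n), dtype=int)          # one counter vector per hard location

def active(addr):
    """Indices of hard locations within the Hamming radius of addr."""
    return np.flatnonzero((hard_addrs != addr).sum(axis=1) <= radius)

def write(addr, data):
    counters[active(addr)] += 2 * data - 1            # +1 for 1-bits, -1 for 0-bits

def read(addr):
    return (counters[active(addr)].sum(axis=0) > 0).astype(int)

pattern = rng.integers(0, 2, n)
write(pattern, pattern)                               # autoassociative store
noisy = pattern.copy()
noisy[:20] ^= 1                                       # corrupt 20 bits of the cue
print("fraction of bits recovered:", (read(noisy) == pattern).mean())
```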

  14. Mediated semiquantum key distribution

    NASA Astrophysics Data System (ADS)

    Krawec, Walter O.

    2015-03-01

    In this work, we design a quantum key distribution protocol, allowing two limited semiquantum or "classical" users to establish a shared secret key with the help of a fully quantum server. A semiquantum user can prepare and measure qubits only in the computational basis and so must rely on this quantum server to produce qubits in alternative bases and also to perform alternative measurements. However, we assume that the server is untrusted and we prove the unconditional security of our protocol even in the worst case: when this quantum server is an all-powerful adversary. We also compute a lower bound of the key rate of our protocol, in the asymptotic scenario, as a function of the observed error rate in the channel, allowing us to compute the maximally tolerated error of our protocol. Our results show that a semiquantum protocol may hold similar security to a fully quantum one.
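    As a generic illustration of the kind of key-rate calculation described (not the bound derived in this paper), the snippet below evaluates the standard BB84-style asymptotic rate 1 - 2h(Q) and shows roughly where it crosses zero.

```python
# Generic illustration only: the Shor-Preskill-style BB84 asymptotic key rate
# r(Q) = 1 - 2*h(Q), evaluated to locate the error rate where it drops to zero.
# This is NOT the bound proved for the mediated semiquantum protocol above.
import math

def h(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def rate(q):
    return 1.0 - 2.0 * h(q)

for q in [0.00, 0.02, 0.05, 0.08, 0.11, 0.12]:
    print(f"Q = {q:.2f}   key rate >= {rate(q):+.3f}")
```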

  15. CMCC Data Distribution Centre

    NASA Astrophysics Data System (ADS)

    Aloisio, Giovanni; Fiore, Sandro; Negro, A.

    2010-05-01

    The CMCC Data Distribution Centre (DDC) is the primary entry point (web gateway) to the CMCC. It is a Data Grid Portal providing a ubiquitous and pervasive way to ease data publishing, climate metadata search, dataset discovery, metadata annotation, data access, data aggregation, sub-setting, etc. The grid portal security model includes the use of the HTTPS protocol for secure communication with the client (based on X509v3 certificates that must be loaded into the browser) and secure cookies to establish and maintain user sessions. The CMCC DDC is now in a pre-production phase and is currently used only by internal users (CMCC researchers and climate scientists). The most important component already available in the CMCC DDC is the Search Engine, which allows users to perform, through web interfaces, distributed search and discovery activities by introducing one or more of the following search criteria: horizontal extent (which can be specified by interacting with a geographic map), vertical extent, temporal extent, keywords, topics, creation date, etc. By means of this page the user submits the first step of the query process on the metadata DB; then she can choose one or more datasets, retrieving and displaying the complete XML metadata description (from the browser). This way, the second step of the query process is carried out by accessing a specific XML document of the metadata DB. Finally, through the web interface, the user can access and download (partially or totally) the data stored on the storage device, accessing OPeNDAP servers and other available grid storage interfaces. Requests concerning datasets stored in deep storage will be served asynchronously.

  16. Hail Size Distribution Mapping

    NASA Technical Reports Server (NTRS)

    2008-01-01

    A 3-D weather radar visualization software program was developed and implemented as part of an experimental Launch Pad 39 Hail Monitor System. 3DRadPlot, a radar plotting program, is one of several software modules that form building blocks of the hail data processing and analysis system (the complete software processing system under development). The spatial and temporal mapping algorithms were originally developed through research at the University of Central Florida, funded by NASA's Tropical Rainfall Measuring Mission (TRMM), where the goal was to merge National Weather Service (NWS) Next-Generation Weather Radar (NEXRAD) volume reflectivity data with drop size distribution data acquired from a cluster of raindrop disdrometers. In this current work, we adapted these algorithms to process data from a cluster of hail disdrometers positioned around Launch Pads 39A or 39B, along with the corresponding NWS radar data. Radar data from all NWS NEXRAD sites are archived at the National Climatic Data Center (NCDC), where they can be readily accessed. 3DRadPlot plots Level III reflectivity data at four scan elevations (this software is available from Open Channel Software). By using spatial and temporal interpolation/extrapolation based on hydrometeor fall dynamics, we can merge the hail disdrometer array data coupled with local Weather Surveillance Radar-1988 Doppler (WSR-88D) radial velocity and reflectivity data into a 4-D (3-D space and time) picture of hail size distributions. Hail flux maps can then be generated and used for damage prediction and assessment over specific surfaces corresponding to structures within the disdrometer array volume. Immediately following a hail storm, specific damage areas and degree of damage can be identified for inspection crews.

  17. The Saguaro distributed operating system

    NASA Astrophysics Data System (ADS)

    Andrews, Gregory R.; Schlichting, Richard D.

    1989-05-01

    The progress achieved over the final year of the Saguaro distributed operating system project is presented. The primary achievements were in related research, including the SR distributed programming language, the MLP system for constructing distributed mixed-language programs, the Psync interprocess communication mechanism, a configurable operating system kernel called the x-kernel, and the development of language mechanisms for performing failure handling in distributed programming languages.

  18. Proteins linked to autosomal dominant and autosomal recessive disorders harbor characteristic rare missense mutation distribution patterns.

    PubMed

    Turner, Tychele N; Douville, Christopher; Kim, Dewey; Stenson, Peter D; Cooper, David N; Chakravarti, Aravinda; Karchin, Rachel

    2015-11-01

    The role of rare missense variants in disease causation remains difficult to interpret. We explore whether the clustering pattern of rare missense variants (MAF < 0.01) in a protein is associated with mode of inheritance. Mutations in genes associated with autosomal dominant (AD) conditions are known to result in either loss or gain of function, whereas mutations in genes associated with autosomal recessive (AR) conditions invariably result in loss-of-function. Loss-of-function mutations tend to be distributed uniformly along protein sequence, whereas gain-of-function mutations tend to localize to key regions. It has not previously been ascertained whether these patterns hold in general for rare missense mutations. We consider the extent to which rare missense variants are located within annotated protein domains and whether they form clusters, using a new unbiased method called CLUstering by Mutation Position. These approaches quantified a significant difference in clustering between AD and AR diseases. Proteins linked to AD diseases exhibited more clustering of rare missense mutations than those linked to AR diseases (Wilcoxon P = 5.7 × 10^-4, permutation P = 8.4 × 10^-4). Rare missense mutation in proteins linked to either AD or AR diseases was more clustered than controls (1000G) (Wilcoxon P = 2.8 × 10^-15 for AD and P = 4.5 × 10^-4 for AR, permutation P = 3.1 × 10^-12 for AD and P = 0.03 for AR). The differences in clustering patterns persisted even after removal of the most prominent genes. Testing for such non-random patterns may reveal novel aspects of disease etiology in large sample studies. PMID:26246501
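    A minimal Monte Carlo analogue of such a clustering test (not the CLUMP method itself; the positions and protein length below are hypothetical) might look like the following.

```python
# Hedged illustration (not CLUMP): a simple Monte Carlo test of whether
# rare-variant positions along a protein are more clustered than the same
# number of positions placed uniformly at random.
import numpy as np

rng = np.random.default_rng(2)

def clustering_score(positions):
    """Mean absolute pairwise distance; smaller means more clustered."""
    p = np.asarray(positions, dtype=float)
    return np.abs(p[:, None] - p[None, :]).sum() / (len(p) * (len(p) - 1))

protein_length = 500
observed = [102, 110, 118, 119, 131, 140, 402]   # hypothetical mutated residues
obs_score = clustering_score(observed)

null = np.array([clustering_score(rng.integers(1, protein_length + 1, len(observed)))
                 for _ in range(10_000)])
p_value = (np.sum(null <= obs_score) + 1) / (len(null) + 1)
print(f"observed score {obs_score:.1f}, Monte Carlo p = {p_value:.4f}")
```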

  19. The distribution and genetic structure of Escherichia coli in Australian vertebrates: host and geographic effects.

    PubMed

    Gordon, David M; Cowling, Ann

    2003-12-01

    Escherichia coli was isolated from more than 2300 non-domesticated vertebrate hosts living in Australia. E. coli was most prevalent in mammals, less prevalent in birds and uncommon in fish, frogs and reptiles. Mammals were unlikely to harbour E. coli if they lived in regions with a desert climate and less likely to have E. coli if they lived in the tropics than if they lived in semi-arid or temperate regions. In mammals, the likelihood of isolating E. coli from an individual depended on the diet of the host and E. coli was less prevalent in carnivores than in herbivores or omnivores. In both birds and mammals, the probability of isolating E. coli increased with the body mass of the host. Hosts living in close proximity to human habitation were more likely to harbour E. coli than hosts living away from people. The relative abundance of E. coli groups A, B1, B2 and D strains in mammals depended on climate, host diet and body mass. Group A strains were uncommon, but were isolated from both ectothermic and endothermic vertebrates. Group B1 strains could also be isolated from any vertebrate group, but were predominant in ectothermic vertebrates, birds and carnivorous mammals. Group B2 strains were unlikely to be isolated from ectotherms and were most abundant in omnivorous and herbivorous mammals. Group D strains were rare in ectotherms and uncommon in endotherms, but were equally abundant in birds and mammals. The results of this study suggest that, at the species level, the ecological niche of E. coli is mammals with hindgut modifications to enable microbial fermentation, or in the absence of a modified hindgut, E. coli can only establish a population in 'large-bodied' hosts. The non-random distribution of E. coli genotypes among the different host groups indicates that strains of the four E. coli groups may differ in their ecological niches and life-history characteristics.

  20. Distributed Simulation for Space Exploration

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.

    2006-01-01

    This viewgraph presentation reviews the use of simulation and modeling in preparation for the planned exploration initiatives. The Exploration Systems Mission Directorate (ESMD) Integrated Modeling and Simulation (IM&S) team strategy encompasses a wide spectrum of simulation and modeling policies and technologies. One prominent technology is distributed simulation. The DIstributed Simulation (DIS), a collaborative simulation project with international participation (US and Japan), is reviewed as an example of distributed simulation development. The Distributed Space Exploration Simulation (DSES) is described as another example of distributed simulation.

  1. Correction of Distributed Optical Aberrations

    SciTech Connect

    Baker, K; Olivier, S; Carrano, C; Phillion, D

    2006-02-12

    The objective of this project was to demonstrate the use of multiple distributed deformable mirrors (DMs) to improve the performance of optical systems with distributed aberrations. This concept is expected to provide dramatic improvement in the optical performance of systems in applications where the aberrations are distributed along the optical path or within the instrument itself. Our approach used multiple actuated DMs distributed to match the aberration distribution. The project developed the algorithms necessary to determine the required corrections and simulate the performance of these multiple DM systems.

  2. Distributed visualization framework architecture

    NASA Astrophysics Data System (ADS)

    Mishchenko, Oleg; Raman, Sundaresan; Crawfis, Roger

    2010-01-01

    An architecture for distributed and collaborative visualization is presented. The design goals of the system are to create a lightweight, easy-to-use and extensible framework for research in scientific visualization. The system provides both a single-user and a collaborative distributed environment. The system architecture employs a client-server model. Visualization projects can be synchronously accessed and modified from different client machines. We present a set of visualization use cases that illustrate the flexibility of our system. The framework provides a rich set of reusable components for creating new applications. These components make heavy use of leading design patterns. All components are based on the functionality of a small set of interfaces. This allows new components to be integrated seamlessly with little to no effort. All user input and higher-level control functionality interface with proxy objects supporting a concrete implementation of these interfaces. These light-weight objects can be easily streamed across the web and even integrated with smart clients running on a user's cell phone. The back-end is supported by concrete implementations wherever needed (for instance for rendering). A middle-tier manages any communication and synchronization with the proxy objects. In addition to the data components, we have developed several first-class GUI components for visualization. These include a layer compositor editor, a programmable shader editor, a material editor and various drawable editors. These GUI components interact strictly with the interfaces. Access to the various entities in the system is provided by an AssetManager. The asset manager keeps track of all of the registered proxies and responds to queries on the overall system. This allows all user components to be populated automatically. Hence if a new component is added that supports the IMaterial interface, any instances of this can be used in the various GUI components that work with this

  3. Vacillation Made Easy: Distribution, Re-distribution, and Un-distribution of DOPL-based Processing

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott

    1993-01-01

    Distributed Objects Protocol Layer, or DOPL, provides a simple and general data communication abstraction that can support the distribution of C++ application software functionality among an arbitrary collection of processors. The purpose of the abstraction is to minimize the cost of revising processing distribution decisions throughout the software development cycle, including after software has been delivered to users.

  4. Distributed ultrafast fibre laser

    PubMed Central

    Liu, Xueming; Cui, Yudong; Han, Dongdong; Yao, Xiankun; Sun, Zhipei

    2015-01-01

    A traditional ultrafast fibre laser has a constant cavity length that is independent of the pulse wavelength. The investigation of distributed ultrafast (DUF) lasers is conceptually and technically challenging and of great interest because the laser cavity length and fundamental cavity frequency are changeable based on the wavelength. Here, we propose and demonstrate a DUF fibre laser based on a linearly chirped fibre Bragg grating, where the total cavity length is linearly changeable as a function of the pulse wavelength. The spectral sidebands in DUF lasers are enhanced greatly, including the continuous-wave (CW) and pulse components. We observe that all sidebands of the pulse experience the same round-trip time although they have different round-trip distances and refractive indices. The pulse-shaping of the DUF laser is dominated by the dissipative processes in addition to the phase modulations, which makes our ultrafast laser simple and stable. This laser provides a simple, stable, low-cost, ultrafast-pulsed source with controllable and changeable cavity frequency. PMID:25765454

  5. Intraplacental retinol distribution.

    PubMed

    Saunders, Cláudia; Leal, Maria Do Carmo; Flores, Hernando; Soares, Alexandre Gonçalves; De Lima, Ana Paula Pereira Thiapó; Leite, Paula Costa; Gomes, Mirian Martins; De Souza Júnior, Paulo Roberto Borges; Ramalho, Rejane Andréa

    2005-12-01

    With the objective of evaluating intraplacental vitamin A distribution, 234 placental samples were collected, corresponding to six samples from each of the placentas analyzed: two from the lateral maternal portion, one from the central maternal portion, two from the lateral fetal portion, and one from the central fetal portion. Samples were obtained from 39 adult puerperal mothers with low-risk pregnancies, without vitamin A deficiency or night blindness. Retinol content determination was achieved through spectrophotometry. Retinol values obtained for each region were correlated with the most probable value for each placenta (P < 0.001). Despite differences in retinol content between samples, statistical data analysis showed that intra-tissue variation had no influence on the conversion of data into information. Consequently, any portion of the placenta may be used for retinol level determination purposes, due to the correlation between all portions and the most probable value. The findings of the present study represent an advance for surveys intending to incorporate the collection and dosage of placental vitamin A levels into their analyses, thus increasing the arsenal of pre-pathological or subclinical vitamin A deficiency markers, which can allow for earlier intervention on the maternal-infant group. PMID:16638665

  6. Distributed Deliberative Recommender Systems

    NASA Astrophysics Data System (ADS)

    Recio-García, Juan A.; Díaz-Agudo, Belén; González-Sanz, Sergio; Sanchez, Lara Quijano

    Case-Based Reasoning (CBR) is one of the most successful applied AI technologies of recent years. Although many CBR systems reason locally on a previous experience base to solve new problems, in this paper we focus on distributed retrieval processes working on a network of collaborating CBR systems. In such systems, each node in a network of CBR agents collaborates with other nodes, arguing and counter-arguing its local results, to improve the performance of the system's global response. We describe D2ISCO: a framework to design and implement deliberative and collaborative CBR systems that is integrated as part of jCOLIBRI 2, an established framework in the CBR community. We apply D2ISCO to one particular simplified type of CBR system: recommender systems. We perform a first case study for a collaborative music recommender system and present the results of an experiment on the accuracy of the system's results, using a fuzzy version of the argumentation system AMAL and a network topology based on a social network. Besides individual recommendation, we also discuss how D2ISCO can be used to improve recommendations to groups, and we present a second case study based on the movie recommendation domain, with heterogeneous groups according to group personality composition and a group topology based on a social network.

  7. Data distribution satellite

    NASA Technical Reports Server (NTRS)

    Price, Kent M.; Jorasch, Ronald E.; Wiskerchen, Michael J.

    1991-01-01

    A description is given of a data distribution satellite (DDS) system. The DDS would operate in conjunction with the tracking and data relay satellite system to give ground-based users real time, two-way access to instruments in space and space-gathered data. The scope of work includes the following: (1) user requirements are derived; (2) communication scenarios are synthesized; (3) system design constraints and projected technology availability are identified; (4) DDS communications payload configuration is derived, and the satellite is designed; (5) requirements for earth terminals and network control are given; (6) system costs are estimated, both life cycle costs and user fees; and (7) technology developments are recommended, and a technology development plan is given. The most important results obtained are as follows: (1) a satellite designed for launch in 2007 is feasible and has 10 Gb/s capacity, 5.5 kW power, and 2000 kg mass; (2) DDS features include on-board baseband switching, use of Ku- and Ka-bands, multiple optical intersatellite links; and (3) system user costs are competitive with projected terrestrial communication costs.

  8. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.

  9. Distributed Merge Trees

    SciTech Connect

    Morozov, Dmitriy; Weber, Gunther

    2013-01-08

    Improved simulations and sensors are producing datasets whose increasing complexity exhausts our ability to visualize and comprehend them directly. To cope with this problem, we can detect and extract significant features in the data and use them as the basis for subsequent analysis. Topological methods are valuable in this context because they provide robust and general feature definitions. As the growth of serial computational power has stalled, data analysis is becoming increasingly dependent on massively parallel machines. To satisfy the computational demand created by complex datasets, algorithms need to effectively utilize these computer architectures. The main strength of topological methods, their emphasis on global information, turns into an obstacle during parallelization. We present two approaches to alleviate this problem. We develop a distributed representation of the merge tree that avoids computing the global tree on a single processor and lets us parallelize subsequent queries. To account for the increasing number of cores per processor, we develop a new data structure that lets us take advantage of multiple shared-memory cores to parallelize the work on a single node. Finally, we present experiments that illustrate the strengths of our approach as well as help identify future challenges.
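    For intuition about what a merge tree records, the sketch below builds the join tree of a one-dimensional scalar field with a sequential union-find sweep; it is a serial toy under simplifying assumptions, not the distributed representation developed in the paper.

```python
# Serial toy sketch (not the distributed algorithm): sweep the vertices of a
# 1-D scalar field in ascending order of value and record, via union-find,
# where sublevel-set components join; these join events form the merge tree.
def merge_tree_1d(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent = {}

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path halving
            i = parent[i]
        return i

    events = []
    for v in order:
        parent[v] = v                          # new component born at v
        for u in (v - 1, v + 1):               # 1-D neighbourhood
            if u in parent:
                ru, rv = find(u), find(v)
                if ru != rv:
                    events.append((values[v], ru, rv))   # components join at v
                    parent[ru] = rv
    return events

field = [3.0, 1.0, 4.0, 0.5, 2.0, 5.0, 1.5, 6.0]
for height, a, b in merge_tree_1d(field):
    print(f"components rooted at {a} and {b} merge at height {height}")
```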

  10. Data distribution satellite

    NASA Technical Reports Server (NTRS)

    Stevens, Grady H.

    1992-01-01

    The Data Distribution Satellite (DDS), operating in conjunction with the planned space network, the National Research and Education Network and its commercial derivatives, would play a key role in networking the emerging supercomputing facilities, national archives, academic, industrial, and government institutions. Centrally located over the United States in geostationary orbit, DDS would carry sophisticated on-board switching and make use of advanced antennas to provide an array of special services. Institutions needing continuous high data rate service would be networked together by use of a microwave switching matrix and electronically steered hopping beams. Simultaneously, DDS would use other beams and on-board processing to interconnect other institutions with lesser, low rate, intermittent needs. Dedicated links to White Sands and other facilities would enable direct access to space payloads and sensor data. Intersatellite links to a second generation ATDRS, called Advanced Space Data Acquisition and Communications System (ASDACS), would eliminate one satellite hop and enhance controllability of experimental payloads by reducing path delay. Similarly, direct access would be available to the supercomputing facilities and national data archives. Economies with DDS would be derived from its ability to switch high rate facilities among users as needed. At the same time, having a CONUS view, DDS would interconnect with any institution regardless of how remote. Whether one needed high rate service or low rate service would be immaterial. With the capability to assign resources on demand, DDS would need to carry only a fraction of the resources that would be required if dedicated facilities were used. Efficiently switching resources to users as needed, DDS would become a very feasible spacecraft, even though it would tie together the space network, the terrestrial network, remote sites, thousands of small users, and those few who need very large data links intermittently.

  11. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    Conroy, Michael; Mazzone, Rebecca; Little, William; Elfrey, Priscilla; Mann, David; Mabie, Kevin; Cuddy, Thomas; Loundermon, Mario; Spiker, Stephen; McArthur, Frank; Srey, Tate; Bonilla, Dennis

    2010-01-01

    The Distributed Observer network (DON) is a NASA-collaborative environment that leverages game technology to bring three-dimensional simulations to conventional desktop and laptop computers in order to allow teams of engineers working on design and operations, either individually or in groups, to view and collaborate on 3D representations of data generated by authoritative tools such as Delmia Envision, Pro/Engineer, or Maya. The DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3D visual environment. DON has been designed to enhance accessibility and user ability to observe and analyze visual simulations in real time. A variety of NASA mission segment simulations [Synergistic Engineering Environment (SEE) data, NASA Enterprise Visualization Analysis (NEVA) ground processing simulations, the DSS simulation for lunar operations, and the Johnson Space Center (JSC) TRICK tool for guidance, navigation, and control analysis] were experimented with. Desired functionalities, [i.e. Tivo-like functions, the capability to communicate textually or via Voice-over-Internet Protocol (VoIP) among team members, and the ability to write and save notes to be accessed later] were targeted. The resulting DON application was slated for early 2008 release to support simulation use for the Constellation Program and its teams. Those using the DON connect through a client that runs on their PC or Mac. This enables them to observe and analyze the simulation data as their schedule allows, and to review it as frequently as desired. DON team members can move freely within the virtual world. Preset camera points can be established, enabling team members to jump to specific views. This improves opportunities for shared analysis of options, design reviews, tests, operations, training, and evaluations, and improves prospects for verification of requirements, issues, and approaches among dispersed teams.

  12. Distribution of tsunami interevent times

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2008-01-01

    The distribution of tsunami interevent times is analyzed using global and site-specific (Hilo, Hawaii) tsunami catalogs. An empirical probability density distribution is determined by binning the observed interevent times during a period in which the observation rate is approximately constant. The empirical distributions for both catalogs exhibit non-Poissonian behavior in which there is an abundance of short interevent times compared to an exponential distribution. Two types of statistical distributions are used to model this clustering behavior: (1) long-term clustering described by a universal scaling law, and (2) Omori law decay of aftershocks and triggered sources. The empirical and theoretical distributions all imply an increased hazard rate after a tsunami, followed by a gradual decrease with time approaching a constant hazard rate. Examination of tsunami sources suggests that many of the short interevent times are caused by triggered earthquakes, though the triggered events are not necessarily on the same fault.
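    A bare-bones version of the empirical binning step (with made-up event times standing in for the tsunami catalog) could be sketched as follows.

```python
# Sketch of the empirical comparison described above, on placeholder data:
# bin observed interevent times and compare the empirical density with the
# exponential density implied by a Poisson process of the same mean rate.
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical catalog of event times in days (placeholder, not tsunami data).
event_times = np.sort(rng.uniform(0, 10_000, 120))
dt = np.diff(event_times)                       # interevent times

edges = np.linspace(0, dt.max(), 20)
density, edges = np.histogram(dt, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

rate = 1.0 / dt.mean()                          # Poisson-equivalent rate
exponential = rate * np.exp(-rate * centers)

for c, emp, expn in zip(centers, density, exponential):
    print(f"dt ~ {c:7.1f} d   empirical {emp:.5f}   exponential {expn:.5f}")
```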

  13. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  14. Off-forward parton distribution

    SciTech Connect

    Ji, X.

    1998-12-01

    Recent developments in studying off-forward parton distributions are discussed. In this talk, the author discusses the recent developments in studying the off-forward parton distributions (OFPDs). He has written a topical review article on the subject, which will soon be published in Journal of Physics G. The interested audience can consult that article for details. His talk consists of three parts: definition of the new distributions, their physical significance, and experimental measurements.

  15. Distribution of Clokey's Eggvetch

    SciTech Connect

    David C. Anderson

    1998-12-01

    monophylla), Utah juniper (Juniperus osteosperma), and big sagebrush (Artemisia tridentata ssp. tridentata). Overall, the populations of Clokey's eggvetch on the NTS appear to be vigorous and do not appear threatened. It is estimated that there are approximately 2300 plants on the NTS. It should be considered as a species of concern because of its localized distribution, but it does not appear to warrant protection under the ESA.

  16. Space platform utilities distribution study

    NASA Technical Reports Server (NTRS)

    Lefever, A. E.

    1980-01-01

    Generic concepts for the installation of power, data, and thermal-fluid distribution lines on large space platforms were discussed. Connections with central utility subsystem modules and pallet interfaces were also considered. Three system concept study platforms were used as base points for the detailed development. The tradeoff between high-voltage and low-voltage power distribution and the impact of fiber optics as a data distribution mechanism were analyzed. Thermal expansion and temperature control of utility lines and ducts were considered. Technology developments required for implementation of the generic distribution concepts were identified.

  17. A simple model for pollen-parent fecundity distributions in bee-pollinated forage legume polycrosses

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Random mating or panmixis is a fundamental assumption in quantitative genetic theory. Random mating is sometimes thought to occur in actual fact although a large body of empirical work shows that this is often not the case in nature. Models have been developed to model many non-random mating phenome...

  18. Reduplication and Distributivity in Kannada

    ERIC Educational Resources Information Center

    Anderson, Janet Katherine

    2012-01-01

    Reduplication of numerals and pronouns in Kannada is shown to be subject to locality conditions similar to those constraining binding. This dissertation explores an account of distributivity which exploits the similarity to binding, arguing that the source of the distributive reading in Numeral Reduplication is a bound element. [The dissertation…

  19. Algorithm Calculates Cumulative Poisson Distribution

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Nolty, Robert C.; Scheuer, Ernest M.

    1992-01-01

    Algorithm calculates accurate values of cumulative Poisson distribution under conditions where other algorithms fail because numbers are so small (underflow) or so large (overflow) that computer cannot process them. Factors inserted temporarily to prevent underflow and overflow. Implemented in CUMPOIS computer program described in "Cumulative Poisson Distribution Program" (NPO-17714).
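    One common way to achieve the same effect (shown here only as a generic sketch, not the CUMPOIS algorithm itself) is to accumulate the Poisson terms in log space and recombine them with a log-sum-exp, so individual terms that would underflow or overflow stay representable.

```python
# Generic sketch (not CUMPOIS): evaluate the cumulative Poisson probability
# in log space so that terms which would underflow or overflow in direct
# evaluation remain representable.
import math

def log_cumulative_poisson(k, lam):
    """log P(X <= k) for X ~ Poisson(lam), accumulated stably in log space."""
    # log of the i-th term: -lam + i*log(lam) - log(i!)
    log_terms = []
    log_fact = 0.0
    for i in range(k + 1):
        if i > 0:
            log_fact += math.log(i)
        log_terms.append(-lam + i * math.log(lam) - log_fact)
    m = max(log_terms)                          # log-sum-exp trick
    return m + math.log(sum(math.exp(t - m) for t in log_terms))

lam, k = 2000.0, 1900     # direct term-by-term evaluation would underflow here
print("P(X <= k) =", math.exp(log_cumulative_poisson(k, lam)))
```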

  20. Water Treatment Technology - Distribution Systems.

    ERIC Educational Resources Information Center

    Ross-Harrington, Melinda; Kincaid, G. David

    One of twelve water treatment technology units, this student manual on distribution systems provides instructional materials for six competencies. (The twelve units are designed for a continuing education training course for public water supply operators.) The competencies focus on the following areas: types of pipe for distribution systems, types…

  1. Quality monitored distributed voting system

    DOEpatents

    Skogmo, David

    1997-01-01

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.

  2. ANN - based distribution system reconfiguration

    SciTech Connect

    Momoh, J.A.; Wang, Yanchun; Rizy, D.T.

    1997-08-01

    This paper describes an Artificial Neural Network (ANN) - based distribution system reconfiguration scheme to reduce system loss. The ANN is trained for different load levels and different network topologies. The proposed scheme has been tested using a 38 - bus distribution system. The results are very promising.

  3. Metric-Free Distributional Comparisons.

    ERIC Educational Resources Information Center

    Haertel, Edward H.; And Others

    Two methods are presented for comparing distributions, such as achievement test score distributions, for distinctly different groups of persons in such a way that the comparison will not be influenced by the particular metric of the test being used. Both methods use percentile scores. One method, attributed to Flanagan, fits a straight line to the…

  4. Current Perspectives in Distributive Education.

    ERIC Educational Resources Information Center

    Klaurens, Mary K., Ed.; Trapnell, Gail, Ed.

    The volume on current perspectives in distributive education contains 29 individually authored articles organized into three sections. The first section on program conceptualization deals with the following subjects: the evolution of distributive education, program planning, advisory committees, placement services, postsecondary distributive…

  5. Informetric Distributions: A Tutorial Review.

    ERIC Educational Resources Information Center

    Rousseau, Ronald; Rousseau, Sandra

    1993-01-01

    Reviews informetric, or bibliometric, distributions, including Lotka's, rank frequency distributions, Zipf functions, Mandelbrot functions, Leimkuhler functions, and Bradford's formulations. An example of the use of these techniques to analyze song texts of Thomas Dolby is given, and results are reported that show a fit with a Leimkuhler function.…

  6. Workload Distribution among Agriculture Teachers

    ERIC Educational Resources Information Center

    Torres, Robert M.; Ulmer, Jonathan D.; Aschenbrener, Mollie S.

    2008-01-01

    Teachers distribute their time in many ways. The study sought to determine how agriculture teachers distribute their time among 11 selected teacher activities (i.e., preparation for instruction; classroom/laboratory teaching; laboratory preparation and/or maintenance; grading/scoring students' work; administrative duties-program management;…

  7. Distributed Interactive Intelligent Tutoring Simulation.

    ERIC Educational Resources Information Center

    Leddo, John; Kolodziej, James

    A Distributed Interactive Intelligent Tutoring Simulation (DIITS) has been developed to train Army Infantry squad and fire team leaders skills to perform military operations cooperatively in urban terrain. It integrates distributed interactive simulation (DIS) and intelligent tutoring systems (ITSs) and thus capitalizes on the strengths of both:…

  8. Flow Distribution in Hydraulic Systems

    NASA Technical Reports Server (NTRS)

    Nguyen, S. N.

    1983-01-01

    General Flow Distribution Program analyzes pressure drops and flow distribution in closed and open hydraulic systems. Analyzes system on basis of incompressible flow, although system may contain either compressible or incompressible fluid. Program solves fixed or variable flow problems for series, parallel, or series/parallel systems.
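    A toy version of the parallel-branch case (assuming a simple quadratic loss model dp = k*Q^2, which is not necessarily the program's formulation) is sketched below; with a common pressure drop across branches, each branch flow scales as 1/sqrt(k).

```python
# Toy sketch (assumed quadratic loss model, not the General Flow Distribution
# Program): split a prescribed total flow among parallel branches with
# dp = k * Q**2, so branch flows scale as 1/sqrt(k).
import math

def split_parallel(q_total, k_branches):
    weights = [1.0 / math.sqrt(k) for k in k_branches]
    flows = [q_total * w / sum(weights) for w in weights]
    dp = k_branches[0] * flows[0] ** 2          # identical for every branch
    return flows, dp

# Hypothetical branch resistance coefficients (pressure units per (flow unit)^2).
flows, dp = split_parallel(q_total=10.0, k_branches=[4.0, 1.0, 2.25])
print("branch flows:", [round(q, 3) for q in flows], " common dp:", round(dp, 3))
```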

  9. Quality monitored distributed voting system

    DOEpatents

    Skogmo, D.

    1997-03-18

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system. 6 figs.

  10. Distributed Leadership: Friend or Foe?

    ERIC Educational Resources Information Center

    Harris, Alma

    2013-01-01

    Distributed leadership is now widely known and variously enacted in schools and school systems. Distributed leadership implies a fundamental re-conceptualisation of leadership as practice and challenges conventional wisdom about the relationship between formal leadership and organisational performance. There has been much debate, speculation and…

  11. Radial distribution function in polymers

    NASA Astrophysics Data System (ADS)

    Przygocki, Wladyslaw

    1997-02-01

    The radial distribution function (RDF) is a very useful tool for determining polymer structure. The connection between the scattered X-ray intensity and the radial distribution function is presented. Examples of RDFs are given for polyethylene and for poly(ethylene terephthalate).
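    For readers unfamiliar with the quantity, the sketch below shows how g(r) is typically estimated from particle coordinates in a periodic box; it is a generic illustration, not the X-ray-scattering-based procedure discussed in the article.

```python
# Generic g(r) estimate from particle coordinates (illustrative only, not the
# X-ray route used in the article): histogram pair distances in a periodic box
# and normalise by the ideal-gas expectation for each spherical shell.
import numpy as np

rng = np.random.default_rng(4)
n, box = 500, 10.0                              # particles in a periodic cubic box
pos = rng.uniform(0, box, size=(n, 3))

r_max, nbins = box / 2, 50
edges = np.linspace(0, r_max, nbins + 1)
counts = np.zeros(nbins)

for i in range(n - 1):
    d = pos[i + 1:] - pos[i]
    d -= box * np.round(d / box)                # minimum-image convention
    r = np.linalg.norm(d, axis=1)
    counts += np.histogram(r[r < r_max], bins=edges)[0]

shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
density = n / box ** 3
g = 2.0 * counts / (n * density * shell_vol)    # factor 2: each pair counted once
print(np.round(g[:10], 2))                      # fluctuates around 1 for uncorrelated points
```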

  12. Leadership in Partially Distributed Teams

    ERIC Educational Resources Information Center

    Plotnick, Linda

    2009-01-01

    Inter-organizational collaboration is becoming more common. When organizations collaborate they often do so in partially distributed teams (PDTs). A PDT is a hybrid team that has at least one collocated subteam and at least two subteams that are geographically distributed and communicate primarily through electronic media. While PDTs share many…

  13. The Future of Distributed Leadership

    ERIC Educational Resources Information Center

    Gronn, Peter

    2008-01-01

    Purpose: This paper aims to assess the empirical utility and conceptual significance of distributed leadership. Design/methodology/approach: Three main sources of evidence are drawn on. The paper reviews some neglected commentary of an early generation of distributed leadership theorists. It also discusses a strand of social science writings on…

  14. COS NUV MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2013-10-01

    The performance of the MAMA microchannel plate can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as Cycle 20 proposal 13128.

  15. 2014 Distributed Wind Market Report

    SciTech Connect

    Orell, A.; Foster, N.

    2015-08-01

    According to the 2014 Distributed Wind Market Report, distributed wind reached a cumulative capacity of almost 1 GW (906 MW) in the United States in 2014, reflecting nearly 74,000 wind turbines deployed across all 50 states, Puerto Rico, and the U.S. Virgin Islands. In total, 63.6 MW of new distributed wind capacity was added in 2014, representing nearly 1,700 units and $170 million in investment across 24 states. In 2014, America's distributed wind energy industry supported a growing domestic industrial base as exports from United States-based small wind turbine manufacturers accounted for nearly 80% of United States-based manufacturers' sales.

  16. Size distribution of ring polymers

    NASA Astrophysics Data System (ADS)

    Medalion, Shlomi; Aghion, Erez; Meirovitch, Hagai; Barkai, Eli; Kessler, David A.

    2016-06-01

    We present an exact solution for the distribution of sample averaged monomer to monomer distance of ring polymers. For non-interacting and local-interaction models these distributions correspond to the distribution of the area under the reflected Bessel bridge and the Bessel excursion respectively, and are shown to be identical in dimension d ≥ 2, albeit with pronounced finite size effects at the critical dimension, d = 2. A symmetry of the problem reveals that dimension d and 4 - d are equivalent, thus the celebrated Airy distribution describing the areal distribution of the d = 1 Brownian excursion describes also a polymer in three dimensions. For a self-avoiding polymer in dimension d we find numerically that the fluctuations of the scaled averaged distance are nearly identical in dimension d = 2, 3 and are well described to a first approximation by the non-interacting excursion model in dimension 5.

  17. Distribution System Voltage Regulation by Distributed Energy Resources

    SciTech Connect

    Ceylan, Oguzhan; Liu, Guodong; Xu, Yan; Tomsovic, Kevin

    2014-01-01

    This paper proposes a control method to regulate voltages in three-phase unbalanced electrical distribution systems. A constrained optimization problem to minimize voltage deviations and maximize distributed energy resource (DER) active power output is solved by a harmony search algorithm. The IEEE 13-Bus Distribution Test System was modified to test three different cases: (a) a system controlled only by voltage regulators, (b) a system controlled only by DERs, and (c) a system controlled by both voltage regulators and DERs. The simulation results show that systems with both voltage regulator and DER control provide a better voltage profile.
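    A stripped-down illustration of the optimization idea (a basic harmony search over a made-up linear voltage-sensitivity model, not the paper's unbalanced three-phase formulation) is given below.

```python
# Hedged toy sketch: basic harmony search choosing DER injections to minimise
# squared voltage deviation from 1.0 p.u. on a hypothetical linear
# voltage-sensitivity model (all numbers are assumptions, not from the paper).
import numpy as np

rng = np.random.default_rng(5)
base_v = np.array([0.96, 0.95, 0.94])           # hypothetical node voltages (p.u.)
sens = np.array([[0.010, 0.004, 0.002],         # assumed dV/dP sensitivities
                 [0.004, 0.012, 0.005],
                 [0.002, 0.005, 0.015]])
p_max = 5.0                                     # per-DER injection limit

def cost(p):
    v = base_v + sens @ p
    return np.sum((v - 1.0) ** 2)

hm_size, iters, hmcr, par, bw = 20, 2000, 0.9, 0.3, 0.5
memory = rng.uniform(0, p_max, size=(hm_size, 3))
scores = np.array([cost(p) for p in memory])

for _ in range(iters):
    new = np.empty(3)
    for j in range(3):
        if rng.random() < hmcr:                 # draw from harmony memory
            new[j] = memory[rng.integers(hm_size), j]
            if rng.random() < par:              # pitch adjustment
                new[j] += rng.uniform(-bw, bw)
        else:                                   # random improvisation
            new[j] = rng.uniform(0, p_max)
    new = np.clip(new, 0, p_max)
    c = cost(new)
    worst = np.argmax(scores)
    if c < scores[worst]:                       # replace the worst harmony
        memory[worst], scores[worst] = new, c

best = memory[np.argmin(scores)]
print("best injections:", np.round(best, 2), " residual deviation:", round(scores.min(), 6))
```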

  18. Distributed-current-feed and distributed-energy-store railguns

    NASA Astrophysics Data System (ADS)

    Holland, L. D.

    1984-03-01

    As railgun technology evolves toward systems developed for specific applications, a wide variety of power supply and railgun configurations are being investigated. The present study is concerned with the development of distributed railguns and introduces a new type of railgun system specifically designed for applications requiring long accelerators. Distributed railguns are found to offer a solution to the performance limits of breech-fed railguns as the rail length becomes large. Attention is given to the pulse-forming network and breech-fed railgun, the breech-fed railgun with a parallel pulse-forming network, a distributed-energy-store railgun, a distributed-current-feed (DCF) railgun, and a DCF railgun launcher.

  19. Genomic patterns associated with paternal/maternal distribution of transposable elements

    NASA Astrophysics Data System (ADS)

    Jurka, Jerzy

    2003-03-01

    Transposable elements (TEs) are specialized DNA or RNA fragments capable of surviving in intragenomic niches. They are commonly, perhaps unjustifiably, referred to as "selfish" or "parasitic" elements. TEs can be divided into two major classes: retroelements and DNA transposons. The former include non-LTR retrotransposons and retrovirus-like elements, which use reverse transcriptase for their reproduction prior to integration into host DNA. The latter depend mostly on host DNA replication, with the possible exception of rolling-circle transposons recently discovered by our team. I will review basic information on TEs, with emphasis on human Alu and L1 retroelements discussed in the context of genomic organization. TEs are non-randomly distributed in chromosomal DNA. In particular, human Alu elements tend to prefer GC-rich regions, whereas L1 elements accumulate in AT-rich regions. Current explanations of this phenomenon focus on so-called "target effects" and post-insertional selection. However, the proposed models appear to be unsatisfactory, and alternative explanations invoking "channeling" to different chromosomal regions will be a major focus of my presentation. TEs can be expressed and integrated into host DNA in the male or female germlines, or both. Different models of expression and integration imply different proportions of TEs on sex chromosomes and autosomes. The density of recently retroposed human Alu elements is around three times higher on chromosome Y than on chromosome X, and over two times higher than the average density for all human autosomes. This implies Alu activity in paternal germlines. Analogous inter-chromosomal proportions for other repeat families should determine their compatibility with one of the three basic models describing the inheritance of TEs. Published evidence indicates that maternally and paternally imprinted genes roughly correspond to GC-rich and AT-rich DNA. This may explain the observed chromosomal distribution of

  20. Exploiting replication in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, T. A.

    1989-01-01

    Techniques are examined for replicating data and execution in directly distributed systems: systems in which multiple processes interact directly with one another while continuously respecting constraints on their joint behavior. Directly distributed systems are often required to solve difficult problems, ranging from management of replicated data to dynamic reconfiguration in response to failures. It is shown that these problems reduce to more primitive, order-based consistency problems, which can be solved using primitives such as the reliable broadcast protocols. Moreover, given a system that implements reliable broadcast primitives, a flexible set of high-level tools can be provided for building a wide variety of directly distributed application programs.

  1. The alignment-distribution graph

    NASA Technical Reports Server (NTRS)

    Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert

    1993-01-01

    Implementing a data-parallel language such as Fortran 90 on a distributed-memory parallel computer requires distributing aggregate data objects (such as arrays) among the memory modules attached to the processors. The mapping of objects to the machine determines the amount of residual communication needed to bring operands of parallel operations into alignment with each other. We present a program representation called the alignment-distribution graph that makes these communication requirements explicit. We describe the details of the representation, show how to model communication cost in this framework, and outline several algorithms for determining object mappings that approximately minimize residual communication.

  2. The alignment-distribution graph

    NASA Technical Reports Server (NTRS)

    Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert

    1993-01-01

    Implementing a data-parallel language such as Fortran 90 on a distributed-memory parallel computer requires distributing aggregate data objects (such as arrays) among the memory modules attached to the processors. The mapping of objects to the machine determines the amount of residual communication needed to bring operands of parallel operations into alignment with each other. We present a program representation called the alignment distribution graph that makes these communication requirements explicit. We describe the details of the representation, show how to model communication cost in this framework, and outline several algorithms for determining object mappings that approximately minimize residual communication.

  3. Dose distributions in regions containing beta sources: Irregularly shaped source distributions in homogeneous media

    SciTech Connect

    Werner, B.L. )

    1991-11-01

    Methods are introduced by which dose rate distributions due to nonuniform, irregularly shaped distributions of beta emitters can be calculated using dose rate distributions for uniform, spherical source distributions. The dose rate distributions can be written in the MIRD formalism.

  4. Patterns of white matter damage are non-random and associated with cognitive function in secondary progressive multiple sclerosis.

    PubMed

    Meijer, K A; Cercignani, M; Muhlert, N; Sethi, V; Chard, D; Geurts, J J G; Ciccarelli, O

    2016-01-01

    In multiple sclerosis (MS), white matter damage is thought to contribute to cognitive dysfunction, which is especially prominent in secondary progressive MS (SPMS). While studies in healthy subjects have revealed patterns of correlated fractional anisotropy (FA) across white matter tracts, little is known about the underlying patterns of white matter damage in MS. In the present study, we aimed to map the SPMS-related covariance patterns of microstructural white matter changes, and investigated whether or not these patterns were associated with cognitive dysfunction. Diffusion MRI was acquired from 30 SPMS patients and 32 healthy controls (HC). A tensor model was fitted and FA maps were processed using tract-based spatial statistics (TBSS) in order to obtain a skeletonised map for each subject. The skeletonised FA maps of patients only were decomposed into 18 spatially independent components (ICs) using independent component analysis. Comprehensive cognitive assessment was conducted to evaluate five cognitive domains. Correlations between cognitive performance and (1) severity of FA abnormalities of the extracted ICs (i.e. z-scores relative to FA values of HC) and (2) IC load (i.e. FA covariance of a particular IC) were examined. SPMS patients showed lower FA values of all examined patterns of correlated FA (i.e. spatially independent components) than HC (p < 0.01). Tracts visually assigned to the supratentorial commissural class were most severely damaged (z = - 3.54; p < 0.001). Reduced FA was significantly correlated with reduced IC load (i.e. FA covariance) (r = 0.441; p < 0.05). Lower mean FA and component load of the supratentorial projection tracts and limbic association tracts classes were associated with worse cognitive function, including executive function, working memory and verbal memory. Despite the presence of white matter damage, it was possible to reveal patterns of FA covariance across SPMS patients. This could indicate that white matter tracts belonging to the same cluster, and thus with similar characteristics, tend to follow similar trends during neurodegeneration. Furthermore, these underlying FA patterns might help to explain cognitive dysfunction in SPMS. PMID:27408797

  5. Issues Relating to Selective Reporting When Including Non-Randomized Studies in Systematic Reviews on the Effects of Healthcare Interventions

    ERIC Educational Resources Information Center

    Norris, Susan L.; Moher, David; Reeves, Barnaby C.; Shea, Beverley; Loke, Yoon; Garner, Sarah; Anderson, Laurie; Tugwell, Peter; Wells, George

    2013-01-01

    Background: Selective outcome and analysis reporting (SOR and SAR) occur when only a subset of outcomes measured and analyzed in a study is fully reported, and are an important source of potential bias. Key methodological issues: We describe what is known about the prevalence and effects of SOR and SAR in both randomized controlled trials (RCTs)…

  6. Patterns of white matter damage are non-random and associated with cognitive function in secondary progressive multiple sclerosis.

    PubMed

    Meijer, K A; Cercignani, M; Muhlert, N; Sethi, V; Chard, D; Geurts, J J G; Ciccarelli, O

    2016-01-01

    In multiple sclerosis (MS), white matter damage is thought to contribute to cognitive dysfunction, which is especially prominent in secondary progressive MS (SPMS). While studies in healthy subjects have revealed patterns of correlated fractional anisotropy (FA) across white matter tracts, little is known about the underlying patterns of white matter damage in MS. In the present study, we aimed to map the SPMS-related covariance patterns of microstructural white matter changes, and investigated whether or not these patterns were associated with cognitive dysfunction. Diffusion MRI was acquired from 30 SPMS patients and 32 healthy controls (HC). A tensor model was fitted and FA maps were processed using tract-based spatial statistics (TBSS) in order to obtain a skeletonised map for each subject. The skeletonised FA maps of patients only were decomposed into 18 spatially independent components (ICs) using independent component analysis. Comprehensive cognitive assessment was conducted to evaluate five cognitive domains. Correlations between cognitive performance and (1) severity of FA abnormalities of the extracted ICs (i.e. z-scores relative to FA values of HC) and (2) IC load (i.e. FA covariance of a particular IC) were examined. SPMS patients showed lower FA values of all examined patterns of correlated FA (i.e. spatially independent components) than HC (p < 0.01). Tracts visually assigned to the supratentorial commissural class were most severely damaged (z = - 3.54; p < 0.001). Reduced FA was significantly correlated with reduced IC load (i.e. FA covariance) (r = 0.441; p < 0.05). Lower mean FA and component load of the supratentorial projection tracts and limbic association tracts classes were associated with worse cognitive function, including executive function, working memory and verbal memory. Despite the presence of white matter damage, it was possible to reveal patterns of FA covariance across SPMS patients. This could indicate that white matter tracts belonging to the same cluster, and thus with similar characteristics, tend to follow similar trends during neurodegeneration. Furthermore, these underlying FA patterns might help to explain cognitive dysfunction in SPMS.

  7. Short-term results after STARR versus internal Delorme for obstructed defecation: a non-randomized prospective study.

    PubMed

    Ohazuruike, N L; Martellucci, J; Menconi, C; Panicucci, S; Toniolo, G; Naldini, G

    2014-06-01

    Obstructed defecation syndrome due to internal intussusception and rectocele is a common disease, and various transanal surgical techniques have been proposed. The aim of the present study was to compare the results of the internal Delorme (ID) and stapled transanal rectal resection (STARR) procedures in the treatment of patients with obstructed defecation syndrome. From September 2011 to May 2012, 23 patients were operated on with the STARR procedure and 12 patients with Delorme's procedure for obstructed defecation syndrome. All patients underwent preoperative assessment: clinical evaluation (Altomare ODS score, Wexner constipation scoring system), proctoscopy, defecography, anorectal manometry and endoanal ultrasonography. Surgery was proposed in cases of failure of medical therapy, incomplete defecation, unsuccessful attempts with long periods spent in the bathroom, defecation requiring digital assistance, use of enemas, and defecography findings of rectoanal intussusception and rectocele. The average operative time was 28 min (range 15-65) for the STARR group and 56 min (range 28-96) for the ID group, with a mean hospital stay of 2 days for both procedures. The Wexner score fell significantly postoperatively, from 17 to 4.7 in the STARR group and from 15.3 to 3.3 in the ID group. The Altomare score fell postoperatively from 18.2 to 5.5 in the STARR group and from 16.5 to 5.3 in the ID group. No statistically significant differences were observed between the two procedures considering the outcome parameters and the complications. Both the ID and STARR procedures seem to be effective in the treatment of ODS.

  8. The non-random walk of stock prices: the long-term correlation between signs and sizes

    NASA Astrophysics Data System (ADS)

    La Spada, G.; Farmer, J. D.; Lillo, F.

    2008-08-01

    We investigate the random walk of prices by developing a simple model relating the properties of the signs and absolute values of individual price changes to the diffusion rate (volatility) of prices at longer time scales. We show that this benchmark model is unable to reproduce the diffusion properties of real prices. Specifically, we find that for one hour intervals this model consistently over-predicts the volatility of real price series by about 70%, and that this effect becomes stronger as the length of the intervals increases. By selectively shuffling some components of the data while preserving others we are able to show that this discrepancy is caused by a subtle but long-range non-contemporaneous correlation between the signs and sizes of individual returns. We conjecture that this is related to the long-memory of transaction signs and the need to enforce market efficiency.
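    The selective-shuffling idea can be illustrated with a toy experiment: keep the sign series (and hence its long memory) fixed, shuffle the sizes so that the sign-size cross-correlation is destroyed while both marginals are preserved, and compare the variance of aggregated returns. The synthetic series below is a hypothetical construction, not the authors' data or exact procedure.

```python
# Toy illustration (not the authors' data or exact procedure): keep the sign
# series fixed, shuffle the sizes to destroy the sign-size cross-correlation
# while preserving both marginals, and compare the diffusion (variance of
# aggregated returns) of the real and shuffled series.
import random
import statistics

random.seed(0)

def aggregated_variance(returns, window):
    """Variance of non-overlapping sums of `window` consecutive returns."""
    sums = [sum(returns[i:i + window]) for i in range(0, len(returns) - window + 1, window)]
    return statistics.pvariance(sums)

# Hypothetical tick returns with long runs of signs and larger sizes at reversals.
n = 20000
signs, sizes = [], []
s = 1
for _ in range(n):
    if random.random() < 0.05:
        s = -s                                   # occasional sign reversal -> long runs
    signs.append(s)
    reversal = len(signs) > 1 and signs[-1] != signs[-2]
    sizes.append(abs(random.gauss(0.0, 1.0)) * (1.5 if reversal else 1.0))
returns = [sg * sz for sg, sz in zip(signs, sizes)]

shuffled_sizes = sizes[:]
random.shuffle(shuffled_sizes)                   # break the sign-size pairing only
surrogate = [sg * sz for sg, sz in zip(signs, shuffled_sizes)]

window = 60
print("real series       :", round(aggregated_variance(returns, window), 3))
print("shuffled surrogate:", round(aggregated_variance(surrogate, window), 3))
```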

  9. Effect of an analgo-sedation protocol for neurointensive patients: a two-phase interventional non-randomized pilot study

    PubMed Central

    2010-01-01

    Introduction Sedation protocols are needed for neurointensive patients. The aim of this pilot study was to describe sedation practice at a neurointensive care unit and to assess the feasibility and efficacy of a new sedation protocol. The primary outcomes were a shift from sedation-based to analgesia-based sedation and improved pain management. The secondary outcomes were a reduction in unplanned extubations and duration of sedation. Methods This was a two-phase (before-after), prospective controlled study at a university-affiliated, 14-bed neurointensive care unit in Denmark. The sample included patients requiring mechanical ventilation for at least 48 hours treated with continuous sedative and analgesic infusions or both. During the observation phase the participants (n = 106) were sedated as usual (non-protocolized), and during the intervention phase the participants (n = 109) were managed according to a new sedation protocol. Results Our study showed a shift toward analgo-sedation, suggesting feasibility of the protocol. We found a significant reduction in the use of propofol (P < .001) and midazolam (P = .001) and an increase in fentanyl (P < .001) and remifentanil (P = .003). Patients selected for daily sedation interruption woke up faster, and estimates of pain free patients increased from 56.8% to 82.7% (P < .001), suggesting efficacy of the protocol. The duration of sedation and unplanned extubations were unchanged. Conclusions Our pilot study showed feasibility and partial efficacy of our protocol. Some neurointensive patients might not benefit from protocolized practice. We recommend an interdisciplinary effort to target patients requiring less sedation, as issues of oversedation and inadequate pain management still need more attention. Trial registration ISRCTN80999859. PMID:20403186

  10. A fast, easy circumcision procedure combining a CO2 laser and cyanoacrylate adhesive: a non-randomized comparative trial

    PubMed Central

    Gorgulu, Tahsin; Olgun, Abdulkerim; Torun, Merve; Kargi, Eksal

    2016-01-01

    ABSTRACT Background Circumcision is performed as a routine operation in many countries, more commonly for religious and cultural reasons than for indicated conditions, such as phimosis and balanitis. There are many techniques available, and recently electrocautery and both Nd:YAG and CO2 lasers, instead of blades, have been used for skin and mucosal incisions. However, the infection risk in circumcisions performed using a CO2 laser was 10% higher. There are also reports of sutureless procedures using cyanoacrylate, but these have higher risks of hematoma and hemorrhage. We combined a CO2 laser and cyanoacrylate to shorten the operation time and to decrease bleeding complications. Materials and Methods Circumcisions were performed under general anesthesia with the combined CO2 laser and cyanoacrylate technique in 75 boys aged 6–9 years between May 2013 and August 2014, solely for religious reasons. As a control, we compared them retrospectively with 75 age-matched patients who were circumcised using the conventional guillotine method in our clinic. Results No hematomas, bleeding, or wound infections were observed. One wound dehiscence (1.33%) occurred during the early postoperative period and healed without any additional procedures. The median operating time was 7 (range 6–9) minutes. The conventional guillotine group had one hematoma (1.3%), two wound dehiscences (2.6%), and two hemorrhages (2.6%), and its median operating time was 22 (range 20–26) minutes. The difference in surgical time was significant (p<0.001), with no significant difference in the rate of complications between the two groups. Conclusion The combined CO2 laser and cyanoacrylate procedure not only decreased the operating time markedly, but also eliminated the disadvantages associated with each individual procedure alone. PMID:27136476

  11. Performance of distributed multiscale simulations

    PubMed Central

    Borgdorff, J.; Ben Belgacem, M.; Bona-Casas, C.; Fazendeiro, L.; Groen, D.; Hoenen, O.; Mizeranschi, A.; Suter, J. L.; Coster, D.; Coveney, P. V.; Dubitzky, W.; Hoekstra, A. G.; Strand, P.; Chopard, B.

    2014-01-01

    Multiscale simulations model phenomena across natural scales using monolithic or component-based code, running on local or distributed resources. In this work, we investigate the performance of distributed multiscale computing of component-based models, guided by six multiscale applications with different characteristics and from several disciplines. Three modes of distributed multiscale computing are identified: supplementing local dependencies with large-scale resources, load distribution over multiple resources, and load balancing of small- and large-scale resources. We find that the first mode has the apparent benefit of increasing simulation speed, and the second mode can increase simulation speed if local resources are limited. Depending on resource reservation and model coupling topology, the third mode may result in a reduction of resource consumption. PMID:24982258

  12. Visualizing Spatially Varying Distribution Data

    NASA Technical Reports Server (NTRS)

    Kao, David; Luo, Alison; Dungan, Jennifer L.; Pang, Alex; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    A box plot is a compact representation that encodes the minimum, maximum, mean, median, and quartile information of a distribution. In practice, a single box plot is drawn for each variable of interest. With the advent of more accessible computing power, we are now facing the problem of visualizing data where there is a distribution at each 2D spatial location. Simply extending the box plot technique to distributions over a 2D domain is not straightforward. One challenge is reducing the visual clutter if a box plot is drawn over each grid location in the 2D domain. This paper presents and discusses two general approaches, using parametric statistics and shape descriptors, to present 2D distribution data sets. Both approaches provide additional insights compared to the traditional box plot technique.
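    A minimal sketch of the underlying computation, under the assumption that the data are simply a sample of values at each grid cell: compute the five-number summary that a box plot would encode for every 2D location. The grid dimensions and the synthetic data are hypothetical; a real visualization layer (glyphs or shape descriptors) would sit on top of this.

```python
# Minimal sketch (hypothetical data): for every 2D grid location holding a
# sample of values, compute the five-number summary that a box plot encodes.
# A real visualization layer (glyphs or shape descriptors) would sit on top.
import random
import statistics

random.seed(42)
NX, NY, NSAMPLES = 4, 3, 200

def five_number_summary(values):
    q1, median, q3 = statistics.quantiles(values, n=4)
    return min(values), q1, median, q3, max(values)

# One distribution per (x, y) cell; the mean drifts across the grid.
grid = {(x, y): [random.gauss(x + 0.5 * y, 1.0) for _ in range(NSAMPLES)]
        for x in range(NX) for y in range(NY)}

for (x, y), values in sorted(grid.items()):
    lo, q1, med, q3, hi = five_number_summary(values)
    print(f"cell ({x},{y}): min={lo:6.2f} Q1={q1:6.2f} median={med:6.2f} Q3={q3:6.2f} max={hi:6.2f}")
```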

  13. Multiple complementary gas distribution assemblies

    DOEpatents

    Ng, Tuoh-Bin; Melnik, Yuriy; Pang, Lily L; Tuncel, Eda; Nguyen, Son T; Chen, Lu

    2016-04-05

    In one embodiment, an apparatus includes a first gas distribution assembly that includes a first gas passage for introducing a first process gas into a second gas passage that introduces the first process gas into a processing chamber and a second gas distribution assembly that includes a third gas passage for introducing a second process gas into a fourth gas passage that introduces the second process gas into the processing chamber. The first and second gas distribution assemblies are each adapted to be coupled to at least one chamber wall of the processing chamber. The first gas passage is shaped as a first ring positioned within the processing chamber above the second gas passage that is shaped as a second ring positioned within the processing chamber. The gas distribution assemblies may be designed to have complementary characteristic radial film growth rate profiles.

  14. The Binomial Distribution in Shooting

    ERIC Educational Resources Information Center

    Chalikias, Miltiadis S.

    2009-01-01

    The binomial distribution is used to predict the winner of the 49th International Shooting Sport Federation World Championship in double trap shooting held in 2006 in Zagreb, Croatia. The outcome of the competition was definitely unexpected.
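    As a purely illustrative example of the kind of binomial calculation involved (the numbers below are hypothetical, not the actual championship scores): for a shooter whose per-target hit probability is p = 0.9 over n = 50 double-trap targets, the probability of breaking at least 48 targets is

```latex
% Hypothetical numbers, for illustration only (not the actual championship data).
\[
  P(X \ge 48)
  \;=\; \sum_{k=48}^{50} \binom{50}{k} (0.9)^{k} (0.1)^{50-k}
  \;\approx\; 0.112 .
\]
```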

  15. 2013 Distributed Wind Market Report

    SciTech Connect

    Orrell, Alice C.; Rhoads-Weaver, H. E.; Flowers, Larry T.; Gagne, Matthew N.; Pro, Boyd H.; Foster, Nikolas AF

    2014-08-20

    The purpose of this report is to quantify and summarize the 2013 U.S. distributed wind market to help plan and guide future investments and decisions by industry stakeholders, utilities, state and federal agencies, and other interested parties.

  16. BESIII production with distributed computing

    NASA Astrophysics Data System (ADS)

    Zhang, X. M.; Yan, T.; Zhao, X. H.; Ma, Z. T.; Yan, X. F.; Lin, T.; Deng, Z. Y.; Li, W. D.; Belov, S.; Pelevanyuk, I.; Zhemchugov, A.; Cai, H.

    2015-12-01

    Distributed computing is necessary nowadays for high energy physics experiments to organize heterogeneous computing resources all over the world to process enormous amounts of data. The BESIII experiment in China has established its own distributed computing system, based on DIRAC, as a supplement to local clusters, collecting cluster, grid, desktop and cloud resources from collaborating member institutes around the world. The system consists of workload management and data management to deal with the BESIII Monte Carlo production workflow in a distributed environment. A dataset-based data transfer system has been developed to support data movements among sites. File and metadata management tools and a job submission frontend have been developed to provide a virtual layer for BESIII physicists to use distributed resources. Moreover, the paper describes the experience of coping with a lack of grid expertise and limited manpower within the BESIII community.

  17. Distributed teaming on JPL projects

    NASA Technical Reports Server (NTRS)

    Baroff, L. E.

    2002-01-01

    This paper addresses structures, actions and technologies that contribute to real team development of a distributed team, and the leadership skills and tools that are used to implement that team development.

  18. Reactive Power from Distributed Energy

    SciTech Connect

    Kueck, John; Kirby, Brendan; Rizy, Tom; Li, Fangxing; Fall, Ndeye

    2006-12-15

    Distributed energy is an attractive option for solving reactive power and distribution system voltage problems because of its proximity to load. But the cost of retrofitting DE devices to absorb or produce reactive power needs to be reduced. There also needs to be a market mechanism in place for ISOs, RTOs, and transmission operators to procure reactive power from the customer side of the meter where DE usually resides. (author)

  19. Generalized parton distributions in nuclei

    SciTech Connect

    Vadim Guzey

    2009-12-01

    Generalized parton distributions (GPDs) of nuclei describe the distribution of quarks and gluons in nuclei probed in hard exclusive reactions, such as e.g. deeply virtual Compton scattering (DVCS). Nuclear GPDs and nuclear DVCS allow us to study new aspects of many traditional nuclear effects (nuclear shadowing, EMC effect, medium modifications of the bound nucleons) as well as to access novel nuclear effects. In my talk, I review recent theoretical progress in the area of nuclear GPDs.

  20. Distributed processing for speech understanding

    SciTech Connect

    Bronson, E.C.; Siegel, L.

    1983-01-01

    Continuous speech understanding is a highly complex artificial intelligence task requiring extensive computation. This complexity precludes real-time speech understanding on a conventional serial computer. Distributed processing techniques can be applied to the speech understanding task to improve processing speed. In this paper, the speech understanding task and several speech understanding systems are described. Parallel processing techniques are presented and a distributed processing architecture for speech understanding is outlined. 35 references.

  1. Structure functions and parton distributions

    SciTech Connect

    Martin, A.D.; Stirling, W.J.; Roberts, R.G.

    1995-07-01

    The MRS parton distribution analysis is described. The latest sets are shown to give an excellent description of a wide range of deep-inelastic and other hard scattering data. Two important theoretical issues, the behavior of the distributions at small x and the flavor structure of the quark sea, are discussed in detail. A comparison with the new structure function data from HERA is made, and the outlook for the future is discussed.

  2. Modeling Nucleon Generalized Parton Distributions

    SciTech Connect

    Radyushkin, Anatoly V.

    2013-05-01

    We discuss building models for nucleon generalized parton distributions (GPDs) H and E that are based on the formalism of double distributions (DDs). We find that the usual "DD + D-term" construction should be amended by an extra term, generated by the GPD E(x, ξ). Unlike the D-term, this function has support in the whole -1 < x < 1 region, and in general does not vanish at the border points |x| = ξ.
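    For orientation, a schematic of the standard "DD + D-term" construction referred to above (generic textbook form, shown for ξ > 0; conventions may differ from those of the paper):

```latex
% Generic "DD + D-term" construction (shown for xi > 0); conventions may differ
% from the paper, quoted here only to fix the notation.
\[
  H(x,\xi) \;=\; \int d\beta \, d\alpha \;
      \delta(x - \beta - \alpha\xi)\, f(\beta,\alpha)
  \;+\; \theta(\xi - |x|)\, D\!\left(\frac{x}{\xi}\right).
\]
```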

  3. COS NUV MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2010-09-01

    The performance of the MAMA microchannel plate can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the COS MAMA Fold Analysis {11891} during Cycle 17.

  4. COS NUV MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2012-10-01

    The performance of the MAMA microchannel plate can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the COS MAMA Fold Analysis {12723} during Cycle 19.

  5. Disruption of Transcriptional Coactivator Sub1 Leads to Genome-Wide Re-distribution of Clustered Mutations Induced by APOBEC in Active Yeast Genes.

    PubMed

    Lada, Artem G; Kliver, Sergei F; Dhar, Alok; Polev, Dmitrii E; Masharsky, Alexey E; Rogozin, Igor B; Pavlov, Youri I

    2015-05-01

    Mutations in genomes of species are frequently distributed non-randomly, resulting in mutation clusters, including recently discovered kataegis in tumors. DNA editing deaminases play the prominent role in the etiology of these mutations. To gain insight into the enigmatic mechanisms of localized hypermutagenesis that lead to cluster formation, we analyzed the mutational single nucleotide variations (SNV) data obtained by whole-genome sequencing of drug-resistant mutants induced in yeast diploids by AID/APOBEC deaminase and base analog 6-HAP. Deaminase from sea lamprey, PmCDA1, induced robust clusters, while 6-HAP induced a few weak ones. We found that PmCDA1, AID, and APOBEC1 deaminases preferentially mutate the beginning of the actively transcribed genes. Inactivation of transcription initiation factor Sub1 strongly reduced deaminase-induced can1 mutation frequency, but, surprisingly, did not decrease the total SNV load in genomes. However, the SNVs in the genomes of the sub1 clones were re-distributed, and the effect of mutation clustering in the regions of transcription initiation was even more pronounced. At the same time, the mutation density in the protein-coding regions was reduced, resulting in the decrease of phenotypically detected mutants. We propose that the induction of clustered mutations by deaminases involves: a) the exposure of ssDNA strands during transcription and loss of protection of ssDNA due to the depletion of ssDNA-binding proteins, such as Sub1, and b) attainment of conditions favorable for APOBEC action in subpopulation of cells, leading to enzymatic deamination within the currently expressed genes. This model is applicable to both the initial and the later stages of oncogenic transformation and explains variations in the distribution of mutations and kataegis events in different tumor cells. PMID:25941824

  6. Distribution and geological control of mud volcanoes and other fluid/free gas seepage features in the Mediterranean Sea and nearby Gulf of Cadiz

    NASA Astrophysics Data System (ADS)

    Mascle, Jean; Mary, Flore; Praeg, Daniel; Brosolo, Laetitia; Camera, Laurent; Ceramicola, Silvia; Dupré, Stéphanie

    2014-06-01

    Existing knowledge on the distribution of mud volcanoes (MVs) and other significant fluid/free gas-venting features (mud cones, mud pies, mud-brine pools, mud carbonate cones, gas chimneys and, in some cases, pockmark fields) discovered on the seafloor of the Mediterranean Sea and in the nearby Gulf of Cadiz has been compiled using regional geophysical information (including multibeam coverage of most deepwater areas). The resulting dataset comprises both features proven from geological sampling, or in situ observations, and many previously unrecognized MVs inferred from geophysical evidence. The synthesis reveals that MVs clearly have non-random distributions that correspond to two main geodynamic settings: (1) the vast majority occur along the various tectono-sedimentary accretionary wedges of the Africa-Eurasia subduction zone, particularly in the central and eastern Mediterranean basins (external Calabrian Arc, Mediterranean Ridge, Florence Rise) but also along its westernmost boundary in the Gulf of Cadiz; (2) other MVs characterize thick depocentres along parts of the Mesozoic passive continental margins that border Africa from eastern Tunisia to the Levantine coasts, particularly off Egypt and, locally, within some areas of the western Mediterranean back-arc basins. Meaningfully accounting for MV distribution necessitates evidence of overpressured fluids and mud-rich layers. In addition, cross-correlations between MVs and other GIS-based data, such as maps of the Messinian evaporite basins and/or active (or recently active) tectonic trends, stress the importance of assessing geological control in terms of the presence, or not, of thick seals and potential conduits. It is contended that new MV discoveries may be expected in the study region, particularly along the southern Ionian Sea continental margins.

  7. Disruption of Transcriptional Coactivator Sub1 Leads to Genome-Wide Re-distribution of Clustered Mutations Induced by APOBEC in Active Yeast Genes.

    PubMed

    Lada, Artem G; Kliver, Sergei F; Dhar, Alok; Polev, Dmitrii E; Masharsky, Alexey E; Rogozin, Igor B; Pavlov, Youri I

    2015-05-01

    Mutations in genomes of species are frequently distributed non-randomly, resulting in mutation clusters, including recently discovered kataegis in tumors. DNA editing deaminases play the prominent role in the etiology of these mutations. To gain insight into the enigmatic mechanisms of localized hypermutagenesis that lead to cluster formation, we analyzed the mutational single nucleotide variations (SNV) data obtained by whole-genome sequencing of drug-resistant mutants induced in yeast diploids by AID/APOBEC deaminase and base analog 6-HAP. Deaminase from sea lamprey, PmCDA1, induced robust clusters, while 6-HAP induced a few weak ones. We found that PmCDA1, AID, and APOBEC1 deaminases preferentially mutate the beginning of the actively transcribed genes. Inactivation of transcription initiation factor Sub1 strongly reduced deaminase-induced can1 mutation frequency, but, surprisingly, did not decrease the total SNV load in genomes. However, the SNVs in the genomes of the sub1 clones were re-distributed, and the effect of mutation clustering in the regions of transcription initiation was even more pronounced. At the same time, the mutation density in the protein-coding regions was reduced, resulting in the decrease of phenotypically detected mutants. We propose that the induction of clustered mutations by deaminases involves: a) the exposure of ssDNA strands during transcription and loss of protection of ssDNA due to the depletion of ssDNA-binding proteins, such as Sub1, and b) attainment of conditions favorable for APOBEC action in subpopulation of cells, leading to enzymatic deamination within the currently expressed genes. This model is applicable to both the initial and the later stages of oncogenic transformation and explains variations in the distribution of mutations and kataegis events in different tumor cells.

  8. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1, ..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{ w : X(w) < x }. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function only if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random compositions of these two systems.
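    A minimal sketch of the construction described above, under hypothetical transition and emission probabilities: a two-state probabilistic automaton over {0, 1} emits bits interpreted as the binary expansion of X(w), and F(x) = Prob_M{w : X(w) < x} is estimated by Monte Carlo.

```python
# Minimal sketch of the construction described above, with hypothetical
# transition/emission probabilities: a two-state probabilistic automaton over
# {0, 1} emits bits read as the binary expansion of X(w); F(x) = Prob_M{w : X(w) < x}
# is estimated by Monte Carlo on truncated expansions.
import random

random.seed(7)

EMIT_ONE = {0: 0.3, 1: 0.8}                  # P(emit 1 | state)
NEXT = {0: {0: 0, 1: 1}, 1: {0: 0, 1: 1}}    # next state given emitted bit

def sample_x(depth=40):
    """Sample X(w) from `depth` bits of the expansion generated by the automaton."""
    state, x, weight = 0, 0.0, 0.5
    for _ in range(depth):
        bit = 1 if random.random() < EMIT_ONE[state] else 0
        x += bit * weight
        weight /= 2
        state = NEXT[state][bit]
    return x

def distribution_function(x, n=20000):
    """Monte Carlo estimate of F(x) = Prob_M{w : X(w) < x}."""
    return sum(sample_x() < x for _ in range(n)) / n

for x in (0.25, 0.5, 0.75):
    print(f"F({x}) ≈ {distribution_function(x):.3f}")
```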

  9. Double distributions and evolution equations

    SciTech Connect

    A.V. Radyushkin

    1998-05-01

    Applications of perturbative QCD to deeply virtual Compton scattering and hard exclusive meson electroproduction processes require a generalization of the usual parton distributions for the case when long-distance information is accumulated in nonforward matrix elements ⟨p′|O(0,z)|p⟩ of quark and gluon light-cone operators. In their previous papers the authors used two types of nonperturbative functions parameterizing such matrix elements: double distributions F(x,y;t) and nonforward distribution functions F_ζ(X;t). Here they discuss in more detail the double distributions (DDs) and the evolution equations which they satisfy. They propose simple models for F(x,y;t=0) DDs with correct spectral and symmetry properties which also satisfy the reduction relations connecting them to the usual parton densities f(x). In this way, they obtain self-consistent models for the ζ-dependence of nonforward distributions. They show that, for small ζ, one can easily obtain nonforward distributions (in the X > ζ region) from the parton densities: F_ζ(X;t=0) ≈ f(X − ζ/2).
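    For orientation, the reduction relation and the DD-to-nonforward construction referred to above can be written schematically as follows (standard double-distribution support region assumed; details of conventions may differ from the paper):

```latex
% Schematic relations (standard double-distribution support region assumed):
\begin{align*}
  f(x) &= \int_{0}^{1-x} F(x, y; t=0)\, dy ,\\
  F_{\zeta}(X; t) &= \int_{0}^{\min\left(X/\zeta,\; (1-X)/(1-\zeta)\right)}
      F(X - y\zeta,\, y;\, t)\, dy ,
\end{align*}
% so that, for small zeta, F_zeta(X; t=0) is approximately f(X - zeta/2) in the
% X > zeta region, as quoted in the abstract.
```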

  10. Modeled ground water age distributions

    USGS Publications Warehouse

    Woolfenden, Linda R.; Ginn, Timothy R.

    2009-01-01

    The age of ground water in any given sample is a distributed quantity representing distributed provenance (in space and time) of the water. Conventional analysis of tracers such as unstable isotopes or anthropogenic chemical species gives discrete or binary measures of the presence of water of a given age. Modeled ground water age distributions provide a continuous measure of contributions from different recharge sources to aquifers. A numerical solution of the ground water age equation of Ginn (1999) was tested both on a hypothetical simplified one-dimensional flow system and under real world conditions. Results from these simulations yield the first continuous distributions of ground water age using this model. Complete age distributions as a function of one and two space dimensions were obtained from both numerical experiments. Simulations in the test problem produced mean ages that were consistent with the expected value at the end of the model domain for all dispersivity values tested, although the mean ages for the two highest dispersivity values deviated slightly from the expected value. Mean ages in the dispersionless case also were consistent with the expected mean ages throughout the physical model domain. Simulations under real world conditions for three dispersivity values resulted in decreasing mean age with increasing dispersivity. This likely is a consequence of an edge effect. However, simulations for all three dispersivity values tested were mass balanced and stable demonstrating that the solution of the ground water age equation can provide estimates of water mass density distributions over age under real world conditions.

  11. Where to nest? Ecological determinants of chimpanzee nest abundance and distribution at the habitat and tree species scale.

    PubMed

    Carvalho, Joana S; Meyer, Christoph F J; Vicente, Luis; Marques, Tiago A

    2015-02-01

    Conversion of forests to anthropogenic land-uses increasingly subjects chimpanzee populations to habitat changes and concomitant alterations in the plant resources available to them for nesting and feeding. Based on nest count surveys conducted during the dry season, we investigated nest tree species selection and the effect of vegetation attributes on nest abundance of the western chimpanzee, Pan troglodytes verus, at Lagoas de Cufada Natural Park (LCNP), Guinea-Bissau, a forest-savannah mosaic widely disturbed by humans. Further, we assessed patterns of nest height distribution to determine support for the anti-predator hypothesis. A zero-altered generalized linear mixed model showed that nest abundance was negatively related to floristic diversity (exponential form of the Shannon index) and positively related to the availability of smaller-sized trees, reflecting characteristics of dense-canopy forest. A positive correlation between nest abundance and floristic richness (number of plant species) and composition indicated that species-rich open habitats are also important in nest site selection. Restricting this analysis to feeding trees, nest abundance was again positively associated with the availability of smaller-sized trees, further supporting the preference for nesting in food tree species from dense forest. Nest tree species selection was non-random, and oil palms were used at a much lower proportion (10%) than previously reported from other study sites in forest-savannah mosaics. While this study suggests that human disturbance may underlie the exclusive arboreal nesting at LCNP, better quantitative data are needed to determine to what extent the construction of elevated nests is in fact a response to predators able to climb trees. Given the importance of LCNP as a refuge for Pan t. verus, our findings can improve conservation decisions for the management of this important umbrella species as well as its remaining suitable habitats. PMID:25224379

  12. Distribution System of the Future

    SciTech Connect

    Kueck, JD

    2003-04-23

    The distribution system of the future is going to be as much of a revolution to the electric energy industry as the wireless telephone has been to consumer communications. An electricity market transformation must occur before the changes can take place, but this evolution is already starting to occur in many parts of the country. In this paper, we discuss a vision for a future distribution system, areas that will be key for technology development, and the advantages of the new electricity market. Present-day distribution systems are, in a sense, unintelligent. Distribution systems respond to faults, or short circuits, by sensing the very high fault current and then opening circuit breakers to isolate the fault. Some newer automated systems determine fault location and then close other circuit breakers to provide an alternate path for power after the fault so that the number of customers left without power is minimized, but the extent of the reconfiguration is limited. Distribution systems also have some methods to regulate voltage, but there is little real-time local response to contingencies such as loss of a transmission line or a generator. In present-day distribution systems, there is very little control of load, or demand response, and Distributed Energy Resources (DER; distributed generation, storage, and responsive load) located in the distribution system are prohibited from even regulating voltage. In fact, industry standards and utility interconnection agreements typically require that when a contingency occurs on a distribution or transmission system that results in a voltage or frequency excursion, the DER is to disconnect rather than help. There is a pressing need to evolve the distribution system model to one that can respond to contingencies sensed locally, and has the local intelligence and autonomy to deal with contingencies such as unusual loading, transmission congestion, and line outages. Markets must be simple for customers to participate in the

  13. SCDM in a Distributed Environment

    NASA Technical Reports Server (NTRS)

    Crowley, Sandra L.; Housch, Helen I.; Madison, Heather L.

    2004-01-01

    The Software Configuration Management (SCM) of the Space Launch Initiative (SLI) Advanced Engineering Environment (AEE) products is performed in a distributed environment, meaning the activities performed during the project lifecycle are spread across numerous NASA Centers, facilities, organizations, colleges and industry. SCM is the glue that holds the project and products together, especially in a distributed environment. It identifies, controls, accounts for, and verifies the details of the products; the schedule of activities; the assigned responsibilities; and the required resources, including staff, tools, and computer facilities. Data/document management (DM) captures and conveys the SCM and project efforts. SCM and DM are integrally linked; hence, Software Configuration and Data Management (SCDM). This paper discusses one team's challenges in implementing SCDM in a distributed environment. The distributed nature of the project introduces new opportunities for moving SCDM to the next level of usefulness in today's high-tech development arena. The lessons learned from the implementation of distributed SCDM in support of the SLI AEE Project provide valuable information for future implementations of SCM and DM.

  14. Knowledge in a distributed environment

    SciTech Connect

    Moses, Y.O.

    1986-01-01

    The distributed nature of information in a distributed system is one of the major issues that protocols for cooperation and coordination between individual components in such a system must handle. Individual sites customarily have only partial knowledge about the general state of the system. Moreover, different information is available at the different sites of the system. Consequently, a central role of communication in such protocols is to inform particular sites about events that take place at other sites, and to transform the system's state of knowledge in a way that will guarantee the successful achievement of the goals of the protocol. This thesis is an initial attempt to study the role of knowledge in distributed systems. A general framework is presented for defining knowledge in a distributed system, and a variety of states of knowledge are identified that groups of processors may have. These states of knowledge seem to capture basic aspects of coordinated actions in a distributed environment. This machinery is applied to the analysis of a number of problems. Finally, this machinery is applied to the study of fault tolerance in systems of unreliable processors, providing considerable insight into the Byzantine agreement problem, and obtaining improved protocols for Byzantine agreement and many related problems.

  15. Distributed Wind Policy Comparison Tool

    SciTech Connect

    2011-12-01

    Power through Policy: 'Best Practices' for Cost-Effective Distributed Wind is a U.S. Department of Energy (DOE)-funded project to identify distributed wind technology policy best practices and to help policymakers, utilities, advocates, and consumers examine their effectiveness using a pro forma model. Incorporating a customized feed from the Database of State Incentives for Renewables and Efficiency (DSIRE), the Web-based Distributed Wind Policy Comparison Tool (Policy Tool) is designed to assist state, local, and utility officials in understanding the financial impacts of different policy options to help reduce the cost of distributed wind technologies. The Policy Tool can be used to evaluate the ways that a variety of federal and state policies and incentives impact the economics of distributed wind (and subsequently its expected market growth). It also allows policymakers to determine the impact of policy options, addressing market challenges identified in the U.S. DOE’s '20% Wind Energy by 2030' report and helping to meet COE targets.

  16. Size distribution of ring polymers

    PubMed Central

    Medalion, Shlomi; Aghion, Erez; Meirovitch, Hagai; Barkai, Eli; Kessler, David A.

    2016-01-01

    We present an exact solution for the distribution of sample averaged monomer to monomer distance of ring polymers. For non-interacting and local-interaction models these distributions correspond to the distribution of the area under the reflected Bessel bridge and the Bessel excursion respectively, and are shown to be identical in dimension d ≥ 2, albeit with pronounced finite size effects at the critical dimension, d = 2. A symmetry of the problem reveals that dimension d and 4 − d are equivalent, thus the celebrated Airy distribution describing the areal distribution of the d = 1 Brownian excursion describes also a polymer in three dimensions. For a self-avoiding polymer in dimension d we find numerically that the fluctuations of the scaled averaged distance are nearly identical in dimension d = 2, 3 and are well described to a first approximation by the non-interacting excursion model in dimension 5. PMID:27302596

  17. Distributed resource management: garbage collection

    SciTech Connect

    Bagherzadeh, N.

    1987-01-01

    In recent years, there has been great interest in designing high-performance distributed symbolic-processing computers. These architectures have special needs for resource management and dynamic reclamation of unused memory cells and objects. The memory management or garbage-collection aspects of these architectures are studied. Also introduced is a synchronous distributed algorithm for garbage collection. A special data structure is defined to handle the distributed nature of the problem. The author formally expresses the algorithm and shows the results of a synchronous garbage-collection simulation and its effect on the interconnection-network message traffic. He presents an asynchronous distributed garbage-collection algorithm to handle resource management for a system that does not require a global synchronization mechanism. The distributed data structure is modified to include the asynchronous aspects of the algorithm. This method is extended to a multiple-mutator scheme, and the problem of having several processors share a portion of a cyclic graph is discussed. Two models for the analytical study of the garbage-collection algorithms discussed are provided.

  18. Phenomenology of preequilibrium angular distributions

    SciTech Connect

    Kalbach, C.; Mann, F.M.

    1980-05-01

    The systematics of continuum angular distributions from a wide variety of light-ion nuclear reactions have been studied. To first order, the shape of the angular distributions has been found to depend only on the energy of the outgoing particle and on the division of the cross section into multi-step direct and multi-step compound parts. The angular distributions can be described in terms of Legendre polynomials, with the reduced polynomial coefficients exhibiting a simple dependence on the outgoing particle energy. Two integer and four continuous parameters with universal values are needed to describe the coefficients for outgoing energies of 2 to 60 MeV in all the reaction types studied. This parameterization, combined with a modified Griffin-model computer code, permits the calculation of double differential cross sections for light-ion continuum reactions where no data are available.
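    Schematically, the parameterization described above can be written as a normalized Legendre expansion, with the reduced coefficients carrying the dependence on outgoing energy and on the multi-step direct/compound split (generic form, not necessarily the paper's exact notation):

```latex
% Generic form of the Legendre-polynomial parameterization (Kalbach-Mann
% systematics); the reduced coefficients a_l carry the dependence on the
% outgoing energy and on the multi-step direct / multi-step compound split.
\[
  \frac{d^{2}\sigma}{d\varepsilon\, d\Omega}
  \;=\; \frac{1}{4\pi}\, \frac{d\sigma}{d\varepsilon}
        \sum_{l} (2l+1)\, a_{l}(\varepsilon)\, P_{l}(\cos\theta).
\]
```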

  19. Overdispersion: Notes on discrete distributions

    SciTech Connect

    Bowman, K.O. ); Shenton, L.R. ); Kastenbaum, M.A. ); Broman, K. )

    1992-09-01

    We introduce mixtures of binomial distributions derived by assuming that the probability parameter p varies according to some law. We use the transformation p = exp(−t) and consider various appropriate densities for the transformed variables. In the process, the Laplace transform becomes the fundamental entity. Large numbers of new binomial mixtures are generated in this way. Some transformations may involve several variates that lead to "multivariate" binomial mixtures. An extension of this to the logarithmic distribution, with parameter p, is possible. Frullani integrals and Laplace transforms are encountered. Graphical representations of some of the more significant distributions are given. These include probability functions, regions of validity, and three-dimensional representations of probability functions showing the response to variation of parameters when two parameters are involved.

  20. Overdispersion: Notes on discrete distributions

    SciTech Connect

    Bowman, K.O.; Shenton, L.R.; Kastenbaum, M.A.; Broman, K.

    1992-09-01

    We introduce mixtures of binomial distributions derived by assuming that the probability parameter p varies according to some law. We use the transformation p = exp(−t) and consider various appropriate densities for the transformed variables. In the process, the Laplace transform becomes the fundamental entity. Large numbers of new binomial mixtures are generated in this way. Some transformations may involve several variates that lead to "multivariate" binomial mixtures. An extension of this to the logarithmic distribution, with parameter p, is possible. Frullani integrals and Laplace transforms are encountered. Graphical representations of some of the more significant distributions are given. These include probability functions, regions of validity, and three-dimensional representations of probability functions showing the response to variation of parameters when two parameters are involved.
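    A minimal sketch of the mixture construction described above, with an illustrative gamma law for the transformed variable t (all parameter values hypothetical): draw t, set p = exp(−t), draw a binomial count, and compare the sample variance with that of a plain binomial having the same mean; the excess variance is the overdispersion in question.

```python
# Minimal sketch of the mixture construction described above: draw t from a
# gamma density (an illustrative choice), set p = exp(-t), draw a binomial
# count, and compare the sample variance with that of a plain binomial having
# the same mean. All parameter values are hypothetical.
import math
import random
import statistics

random.seed(3)
N_TRIALS, N_SAMPLES = 20, 50000
SHAPE, SCALE = 2.0, 0.25            # gamma law for the transformed variable t

def sample_mixture():
    t = random.gammavariate(SHAPE, SCALE)
    p = math.exp(-t)                # p = exp(-t), as in the abstract
    return sum(random.random() < p for _ in range(N_TRIALS))

draws = [sample_mixture() for _ in range(N_SAMPLES)]
mean = statistics.fmean(draws)
variance = statistics.pvariance(draws)
p_bar = mean / N_TRIALS
plain_binomial_variance = N_TRIALS * p_bar * (1 - p_bar)
print(f"mixture mean={mean:.3f}  variance={variance:.3f}  "
      f"plain-binomial variance={plain_binomial_variance:.3f}")
```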

  1. Distribution Workshop 3. Final report

    SciTech Connect

    Sullivan, S.

    1991-12-01

    Over 40 utilities participated in Distribution Workshop III sponsored by the Gas Research Institute's Municipal Gas System Advisory Committee (MGSAC) and held in New Orleans, November 30-December 1, 1990. The primary objective of DWIII was to assist GRI in determining the major needs and concerns the gas distribution industry is currently facing or anticipating in the future. Eight topical areas created the basic structure for the focus group sessions: New equipment decisions; the emergency situation; dealing with leaks; communications in the Information Age; technology transfer; main and service extension; repair, renovation, and maintenance; and customer service. Both topic-specific needs, and more general, cross-cutting 'macro' themes representing social, industry, technology, and process issues, were identified as challenges facing the industry. Two additional groups, comprised of GRI staff and GRI's Distribution Project Advisor Group (DPAG), were assembled to examine the findings in order to generate and prioritize potential research ideas.

  2. Maintaining consistency in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operations are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group-oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems, often within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.

  3. Integrated Transmission and Distribution Control

    SciTech Connect

    Kalsi, Karanjit; Fuller, Jason C.; Tuffner, Francis K.; Lian, Jianming; Zhang, Wei; Marinovici, Laurentiu D.; Fisher, Andrew R.; Chassin, Forrest S.; Hauer, Matthew L.

    2013-01-16

    Distributed generation, demand response, distributed storage, smart appliances, electric vehicles and renewable energy resources are expected to play a key part in the transformation of the American power system. Control, coordination and compensation of these smart grid assets are inherently interlinked. Advanced control strategies to warrant large-scale penetration of distributed smart grid assets do not currently exist. While many of the smart grid technologies proposed involve assets being deployed at the distribution level, most of the significant benefits accrue at the transmission level. The development of advanced smart grid simulation tools, such as GridLAB-D, has led to a dramatic improvement in the models of smart grid assets available for design and evaluation of smart grid technology. However, one of the main challenges to quantifying the benefits of smart grid assets at the transmission level is the lack of tools and frameworks for integrating transmission and distribution technologies into a single simulation environment. Furthermore, given the size and complexity of the distribution system, it is crucial to be able to represent the behavior of distributed smart grid assets using reduced-order controllable models and to analyze their impacts on the bulk power system in terms of stability and reliability. The objectives of the project were to: • develop a simulation environment for integrating transmission and distribution control, • construct reduced-order controllable models for smart grid assets at the distribution level, • design and validate closed-loop control strategies for distributed smart grid assets, and • demonstrate the impact of integrating thousands of smart grid assets under closed-loop demand response control strategies on the transmission system. More specifically, GridLAB-D, a distribution system tool, and PowerWorld, a transmission planning tool, are integrated into a single simulation environment. The integrated environment

  4. Kinetic narrowing of size distribution

    NASA Astrophysics Data System (ADS)

    Dubrovskii, V. G.

    2016-05-01

    We present a model that reveals an interesting possibility for narrowing the size distribution of nanostructures when the deterministic growth rate changes its sign from positive to negative at a certain stationary size. Such a behavior occurs in self-catalyzed one-dimensional III-V nanowires and more generally whenever a negative "adsorption-desorption" term in the growth rate is compensated by a positive "diffusion flux." By asymptotically solving the Fokker-Planck equation, we derive an explicit representation for the size distribution that describes either Poissonian broadening or self-regulated narrowing depending on the parameters. We show how the fluctuation-induced spreading of the size distribution can be completely suppressed in systems with size self-stabilization. These results can be used for obtaining size-uniform ensembles of different nanostructures.
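
    For orientation (a generic form, not the paper's specific model), the size distribution f(s,t) in such problems obeys a Fokker-Planck equation

        \frac{\partial f}{\partial t} = -\frac{\partial}{\partial s}\left[v(s)\,f\right] + \frac{1}{2}\,\frac{\partial^2}{\partial s^2}\left[D(s)\,f\right],

    where v(s) is the deterministic growth rate and D(s) the fluctuation strength. Narrowing of the kind described above is expected near a stationary size s* with v(s*) = 0 and dv/ds < 0, since the drift term then pulls fluctuations back toward s*, counteracting the Poissonian spreading produced by the diffusion term.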

  5. Distributed systems status and control

    NASA Technical Reports Server (NTRS)

    Kreidler, David; Vickers, David

    1990-01-01

    Concepts are investigated for an automated status and control system for a distributed processing environment. System characteristics, data requirements for health assessment, data acquisition methods, system diagnosis methods and control methods were investigated in an attempt to determine the high-level requirements for a system which can be used to assess the health of a distributed processing system and implement control procedures to maintain an accepted level of health for the system. A potential concept for automated status and control includes the use of expert system techniques to assess the health of the system, detect and diagnose faults, and initiate or recommend actions to correct the faults. Therefore, this research included the investigation of methods by which expert systems were developed for real-time environments and distributed systems. The focus is on the features required by real-time expert systems and the tools available to develop real-time expert systems.

  6. Distributed semantic networks and CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Rodriguez, Tony

    1991-01-01

    Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes, interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge based, cooperative decision making model utilizing both rule based and procedural experts.

  7. Distributed phased array architecture study

    NASA Technical Reports Server (NTRS)

    Bourgeois, Brian

    1987-01-01

    Variations in amplifiers and phase shifters can cause degraded antenna performance, depending also on the environmental conditions and antenna array architecture. The implementation of distributed phased array hardware was studied with the aid of the DISTAR computer program as a simulation tool. This simulation provides guidance in hardware simulation. Both hard and soft failures of the amplifiers in the T/R modules are modeled. Hard failures are catastrophic: no power is transmitted to the antenna elements. Noncatastrophic or soft failures are modeled as a modified Gaussian distribution. The resulting amplitude characteristics then determine the array excitation coefficients. The phase characteristics take on a uniform distribution. Pattern characteristics such as antenna gain, half power beamwidth, mainbeam phase errors, sidelobe levels, and beam pointing errors were studied as functions of amplifier and phase shifter variations. General specifications for amplifier and phase shifter tolerances in various architecture configurations for C band and S band were determined.
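
    A minimal numerical sketch of this kind of study is given below; the array size, error levels, and failure rate are assumed values for illustration only, not those used in DISTAR. Element amplitudes receive Gaussian "soft failure" perturbations, a few elements are zeroed as "hard failures", phase errors are drawn from a uniform distribution, and the resulting array factor is evaluated to read off the sidelobe level.

        # Monte Carlo sketch of amplifier/phase-shifter errors in a uniform linear array.
        # All parameter values are illustrative assumptions.
        import numpy as np

        def array_factor(amplitudes, phases, d=0.5, angles=np.linspace(-90, 90, 721)):
            """Normalized array factor (dB) of a linear array with spacing d in wavelengths."""
            n = np.arange(len(amplitudes))
            theta = np.radians(angles)
            steering = np.exp(1j * 2 * np.pi * d * np.outer(np.sin(theta), n))
            af = steering @ (amplitudes * np.exp(1j * phases))
            return angles, 20 * np.log10(np.abs(af) / np.abs(af).max())

        rng = np.random.default_rng(0)
        N = 32
        amps = 1.0 + 0.1 * rng.standard_normal(N)         # soft failures: Gaussian amplitude error
        amps[rng.random(N) < 0.05] = 0.0                   # hard failures: no power to the element
        phases = np.radians(5.0) * rng.uniform(-1, 1, N)   # uniform phase-shifter error

        angles, pattern = array_factor(amps, phases)
        print(f"peak sidelobe outside the main beam: {pattern[np.abs(angles) > 10].max():.1f} dB")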

  8. How robust are distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1989-01-01

    A distributed system is made up of large numbers of components operating asynchronously from one another and hence with incomplete and inaccurate views of one another's state. Load fluctuations are common as new tasks arrive and active tasks terminate. Jointly, these aspects make it nearly impossible to arrive at detailed predictions for a system's behavior. It is important to the successful use of distributed systems, in situations in which humans cannot provide the sort of predictable real-time responsiveness of a computer, that the system be robust. The technology of today can too easily be affected by worm programs or by seemingly trivial mechanisms that, for example, can trigger stock market disasters. Inventors of a technology have an obligation to overcome flaws that can exact a human cost. A set of principles for guiding solutions to distributed computing problems is presented.

  9. What makes distributed practice effective?

    PubMed Central

    Benjamin, Aaron S.; Tullis, Jonathan

    2010-01-01

    The advantages provided to memory by the distribution of multiple practice or study opportunities are among the most powerful effects in memory research. In this paper, we critically review the class of theories that presume contextual or encoding variability as the sole basis for the advantages of distributed practice, and recommend an alternative approach based on the idea that some study events remind learners of other study events. Encoding variability theory encounters serious challenges in two important phenomena that we review here: superadditivity and nonmonotonicity. The bottleneck in such theories lies in the assumption that mnemonic benefits arise from the increasing independence, rather than interdependence, of study opportunities. The reminding model accounts for many basic results in the literature on distributed practice, readily handles data that are problematic for encoding variability theories, including superadditivity and nonmonotonicity, and provides a unified theoretical framework for understanding the effects of repetition and the effects of associative relationships on memory. PMID:20580350

  10. Shape of Pion Distribution Amplitude

    SciTech Connect

    Radyushkin, Anatoly

    2009-11-01

    A scenario is investigated in which the leading-twist pion distribution amplitude $\varphi_\pi(x)$ is approximated by the pion decay constant $f_\pi$ for all essential values of the light-cone fraction $x$. A model for the light-front wave function $\Psi(x, k_\perp)$ is proposed that produces such a distribution amplitude and has a rapidly decreasing (exponential for definiteness) dependence on the light-front energy combination $k_\perp^2/x(1-x)$. It is shown that this model easily reproduces the fit of recent large-$Q^2$ BaBar data on the photon-pion transition form factor. Some aspects of the scenario with a flat pion distribution amplitude are discussed.

  11. Degree distributions of growing networks.

    PubMed

    Krapivsky, P L; Rodgers, G J; Redner, S

    2001-06-01

    The in-degree and out-degree distributions of a growing network model are determined. The in-degree is the number of incoming links to a given node (and vice versa for out-degree). The network is built by (i) creation of new nodes which each immediately attach to a preexisting node, and (ii) creation of new links between preexisting nodes. This process naturally generates correlated in-degree and out-degree distributions. When the node and link creation rates are linear functions of node degree, these distributions exhibit distinct power-law forms. By tuning the parameters in these rates to reasonable values, exponents which agree with those of the web graph are obtained.
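
    A short simulation in the spirit of this model shows how heavy-tailed in-degree distributions emerge; the attachment rates and the additive shift below are illustrative assumptions, not the paper's exact kernels.

        # Toy growing-network simulation with (shifted) linear preferential attachment.
        import random
        from collections import Counter

        def grow_network(steps=5000, p_new_node=0.6, shift=1.0, seed=1):
            rng = random.Random(seed)
            in_deg, out_deg = [1, 1], [1, 1]  # start from two mutually linked nodes

            def pick(weights):
                r = rng.uniform(0, sum(weights))
                acc = 0.0
                for i, w in enumerate(weights):
                    acc += w
                    if r <= acc:
                        return i
                return len(weights) - 1

            for _ in range(steps):
                if rng.random() < p_new_node:
                    # (i) new node attaches to an existing node chosen linearly in in-degree
                    target = pick([d + shift for d in in_deg])
                    in_deg.append(0)
                    out_deg.append(1)
                    in_deg[target] += 1
                else:
                    # (ii) new link between existing nodes, chosen by out- and in-degree
                    src = pick([d + shift for d in out_deg])
                    dst = pick([d + shift for d in in_deg])
                    out_deg[src] += 1
                    in_deg[dst] += 1
            return Counter(in_deg)

        hist = grow_network()
        for k in sorted(hist)[:10]:
            print(k, hist[k])  # the tail of the histogram falls off roughly as a power law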

  12. Distributive Marketing Education: Innovative Instructional Techniques in Distributive Marketing Education.

    ERIC Educational Resources Information Center

    Office of Education (DHEW), Washington, DC.

    The conference featured more than 40 presentations representing existing and planned innovative programs in all levels of distributive marketing education in six States. In addition to the presentations (not reproduced in their entirety in the report), there were sessions and workshops for secondary, post secondary, and adult levels and for city…

  13. Distributed Weighted Stable Marriage Problem

    NASA Astrophysics Data System (ADS)

    Amira, Nir; Giladi, Ran; Lotker, Zvi

    The Stable Matching problem was introduced by Gale and Shapley in 1962. The input for the stable matching problem is a complete bipartite K_{n,n} graph together with a ranking for each node. Its output is a matching that does not contain a blocking pair, where a blocking pair is a pair of elements that are not matched together but rank each other higher than they rank their current mates. In this work we study the Distributed Weighted Stable Matching problem. The input to the Weighted Stable Matching problem is a complete bipartite K_{n,n} graph and a weight function W. The ranking of each node is determined by W, i.e. node v prefers node u_1 over node u_2 if W((v,u_1)) > W((v,u_2)). Using this ranking we can solve the original Stable Matching problem. We consider two different communication models: the billboard model and the fully distributed model. In the billboard model, we assume that there is a public billboard and each participant can write one message on it in each time step. In the distributed model, we assume that each node can send O(log n) bits on each edge of the K_{n,n}. In the billboard model we prove a somewhat surprising tight bound: any algorithm that solves the Stable Matching problem requires at least n - 1 rounds. We provide an algorithm that meets this bound. In the distributed communication model we provide an algorithm named the intermediation agencies algorithm (IAA), which solves the Distributed Weighted Stable Marriage problem in O(√n) rounds. This is the first sub-linear distributed algorithm that solves a subcase of the general Stable Marriage problem.
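
    For reference, the centralised deferred-acceptance procedure that the weighted variant reduces to can be sketched in a few lines; the weight matrices below are an arbitrary illustrative example, and each side's preference order is simply derived from W before running Gale-Shapley.

        # Centralised Gale-Shapley deferred acceptance on a weighted complete bipartite graph.
        # The weight matrices are arbitrary illustrative values.
        def stable_matching(weights_left, weights_right):
            """weights_left[i][j]: how much left node i values right node j, and vice versa."""
            n = len(weights_left)
            prefs_left = [sorted(range(n), key=lambda j: -weights_left[i][j]) for i in range(n)]
            rank_right = [[0] * n for _ in range(n)]
            for j in range(n):
                for rank, i in enumerate(sorted(range(n), key=lambda i: -weights_right[j][i])):
                    rank_right[j][i] = rank

            next_prop = [0] * n       # next preference index each left node will propose to
            match_right = [None] * n  # current partner of each right node
            free = list(range(n))
            while free:
                i = free.pop()
                j = prefs_left[i][next_prop[i]]
                next_prop[i] += 1
                if match_right[j] is None:
                    match_right[j] = i
                elif rank_right[j][i] < rank_right[j][match_right[j]]:
                    free.append(match_right[j])  # j prefers i; its old partner becomes free
                    match_right[j] = i
                else:
                    free.append(i)               # j rejects i
            return {match_right[j]: j for j in range(n)}

        W_left = [[3, 1, 2], [2, 3, 1], [1, 2, 3]]
        W_right = [[1, 3, 2], [3, 2, 1], [2, 1, 3]]
        print(stable_matching(W_left, W_right))  # left-to-right matching with no blocking pair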

  14. Cloud Distribution Statistics from LITE

    NASA Technical Reports Server (NTRS)

    Winker, David M.

    1998-01-01

    The Lidar In-Space Technology Experiment (LITE) mission has demonstrated the utility of spaceborne lidar in observing multilayer clouds and has provided a dataset showing the distribution of tropospheric clouds and aerosols. These unambiguous observations of the vertical distribution of clouds will allow improved verification of current cloud climatologies and GCM cloud parameterizations. Although there is now great interest in cloud profiling radar, operating in the mm-wave region, for the space-based observation of cloud heights, the results of the LITE mission have shown that satellite lidars can also make significant contributions in this area.

  15. Distributional preferences and competitive behavior.

    PubMed

    Balafoutas, Loukas; Kerschbamer, Rudolf; Sutter, Matthias

    2012-06-01

    We study experimentally the relationship between distributional preferences and competitive behavior. We find that spiteful subjects react strongest to competitive pressure and win in a tournament significantly more often than efficiency-minded and inequality averse subjects. However, when given the choice between a tournament and a piece rate scheme, efficiency-minded subjects choose the tournament most often, while spiteful and inequality averse subjects avoid it. When controlling for distributional preferences, risk attitudes and past performance, the gender gap in the willingness to compete is no longer significant, indicating that gender-related variables explain why twice as many men as women self-select into competition.

  16. Distribution of Galactic Dark Matter

    NASA Astrophysics Data System (ADS)

    Langton, Jonathan; Foss, Asa

    2001-04-01

    In this paper we examine the rotational curves of two dwarf spiral galaxies, NGC 2403 and NGC 3198. The observed rotation cannot be accounted for by luminous matter alone; there must therefore be a substantial dark component. We found the dark matter in both galaxies to be distributed according to the equation ρ(r) = b·r/(r² + x²). Combining this with a distribution of luminous matter ρ(r) = ρ₀·e^(−a·r), we produced a rotation curve that matched the observed orbital velocities to within 4%.
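
    A quick numerical check of this kind of decomposition integrates an assumed density profile to get the enclosed mass and then the circular velocity v(r) = sqrt(G·M(<r)/r); the parameter values below are placeholders, not the values fitted to NGC 2403 or NGC 3198.

        # Rotation curve from assumed spherically averaged density profiles (illustrative values).
        import numpy as np

        G = 4.30091e-6  # gravitational constant in kpc (km/s)^2 / M_sun

        def enclosed_mass(rho, r_grid):
            """Cumulative mass M(<r) for a spherically symmetric density rho(r)."""
            integrand = 4 * np.pi * r_grid**2 * rho(r_grid)
            steps = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r_grid)
            return np.concatenate(([0.0], np.cumsum(steps)))

        r = np.linspace(0.01, 30.0, 600)                    # radius in kpc
        rho_lum = lambda rr: 5e7 * np.exp(-0.5 * rr)        # luminous: rho_0 * exp(-a r)
        rho_dm = lambda rr: 2e6 * rr / (rr**2 + 4.0**2)     # dark: b r / (r^2 + x^2)

        m_lum, m_dm = enclosed_mass(rho_lum, r), enclosed_mass(rho_dm, r)
        v_lum = np.sqrt(G * m_lum[1:] / r[1:])
        v_tot = np.sqrt(G * (m_lum + m_dm)[1:] / r[1:])
        print(f"at ~20 kpc: v_lum ~ {v_lum[400]:.0f} km/s, v_tot ~ {v_tot[400]:.0f} km/s")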

  17. Modeling Nucleon Generalized Parton Distributions

    SciTech Connect

    Radyushkin, Anatoly V.

    2013-05-01

    We discuss building models for nucleon generalized parton distributions (GPDs) H and E that are based on the formalism of double distributions (DDs). We found that the usual "DD+D-term" construction should be amended by an extra term, ξE^1_+(x,ξ), built from the α/β moment of the DD e(β,α) that generates the GPD E(x,ξ). Unlike the D-term, this function has support in the whole −1 < x < 1 region, and in general does not vanish at the border points |x| = ξ.

  18. Medium Effects in Parton Distributions

    SciTech Connect

    William Detmold, Huey-Wen Lin

    2011-12-01

    A defining experiment of high-energy physics in the 1980s was that of the EMC collaboration, where it was first observed that parton distributions in nuclei are non-trivially related to those in the proton. This result implies that the presence of the nuclear medium plays an important role, and an understanding of this from QCD has been an important goal ever since. Here we investigate analogous, but technically simpler, effects in QCD and examine how the lowest moment of the pion parton distribution is modified by the presence of a Bose-condensed gas of pions or kaons.

  19. Current density distribution in PEFC

    NASA Astrophysics Data System (ADS)

    Liu, Zhixiang; Mao, Zongqiang; Wu, Bing; Wang, Lisheng; Schmidt, Volkmar M.

    The determination of the current distribution in a polymer electrolyte fuel cell (PEFC) is of great practical importance to optimize process parameters such as the flow field design, the humidification of the reaction gases and the utilization of the fuel gas. In this paper, a subcell approach is used to measure the current density distribution in a PEFC with an active electrode area of 30 cm². Fuel cell performances determined under different operating conditions clearly indicate that the water balance influences the cell performance most significantly. Furthermore, it is interesting to note that under certain conditions both membrane drying and electrode flooding occur simultaneously, inducing performance decay.

  20. Shared versus distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The question of whether multiprocessors should have shared or distributed memory has attracted a great deal of attention. Some researchers argue strongly for building distributed memory machines, while others argue just as strongly for programming shared memory multiprocessors. A great deal of research is underway on both types of parallel systems. Special emphasis is placed on systems with a very large number of processors for computation-intensive tasks, and research and implementation trends are considered. It appears that the two types of systems will likely converge to a common form for large scale multiprocessors.

  1. Saddlepoint distribution function approximations in biostatistical inference.

    PubMed

    Kolassa, J E

    2003-01-01

    Applications of saddlepoint approximations to distribution functions are reviewed. Calculations are provided for marginal distributions and conditional distributions. These approximations are applied to problems of testing and generating confidence intervals, particularly in canonical exponential families.
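
    For context, the most widely used saddlepoint approximation to a distribution function is the Lugannani-Rice formula (quoted here in its standard textbook form, not necessarily the exact variants reviewed): for a cumulant generating function K,

        \hat F(x) = \Phi(\hat w) + \phi(\hat w)\left(\frac{1}{\hat w} - \frac{1}{\hat u}\right),
        \qquad \hat w = \mathrm{sgn}(\hat s)\sqrt{2\,[\hat s x - K(\hat s)]},
        \qquad \hat u = \hat s\sqrt{K''(\hat s)},

    where the saddlepoint \hat s solves K'(\hat s) = x, and \Phi and \phi are the standard normal distribution and density functions.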

  2. Standard Distributions: One Graph Fits All

    ERIC Educational Resources Information Center

    Wagner, Clifford H.

    2007-01-01

    Standard distributions are ubiquitous but not unique. With suitable scaling, the graph of a standard distribution serves as the graph for every distribution in the family. The standard exponential can easily be taught in elementary statistics courses.

  3. Avoiding Distribution System Upgrade Costs Using Distributed Generation

    SciTech Connect

    Schienbein, Lawrence A.; Balducci, Patrick J.; Nguyen, Tony B.; Brown, Daryl R.; DeSteese, John G.; Speer, Gregory A.

    2004-01-20

    PNNL, in cooperation with three utilities, developed a database and methodology to analyze and characterize the avoided costs of Distributed Generation (DG) deployment as an alternative to traditional distribution system investment. After applying a number of screening criteria to the initial set of 307 cases, eighteen were selected for detailed analysis. Alternative DG investment scenarios were developed for these cases to permit capital, operation, maintenance, and fuel costs to be identified and incorporated into the analysis. The “customer-owned” backup power generator option was also investigated. The results of the analysis of the 18 cases show that none yielded cost savings under the alternative DG scenarios. However, the DG alternative systems were configured using very restrictive assumptions concerning reliability, peak rating, engine types and acceptable fuel. In particular it was assumed that the DG alternative in each case must meet the reliability required of conventional distribution systems (99.91% reliability). The analysis was further constrained by a requirement that each substation meet the demands placed upon it by a one-in-three-year weather occurrence. To determine if, by relaxing these requirements, the DG alternative might be more viable, one project was re-examined. The 99.91% reliability factor was still assumed for normal operating conditions but redundancy required to maintain reliability was relaxed for the relatively few hours every three years where extreme weather caused load to exceed present substation capacity. This resulted in the deferment of capital investment until later years and reduced the number of engines required for the project. The cost of both the conventional and DG alternative also dropped because the centralized power generation, variable O&M, and DG fuels costs were calculated based on present load requirements in combination with long-term forecasts of load growth, as opposed to load requirements plus a buffer

  4. Size Distribution of Bacterial Cells

    PubMed Central

    Stull, V. R.

    1972-01-01

    By using differential light-scattering measurements of single cells suspended in a laser beam, an effective cell radius has been determined for 141 individual bacteria from suspensions of Staphylococcus epidermidis. The accumulation of these measurements has provided the size distribution for the sampling. PMID:4551753

  5. Distributed Learning and Institutional Restructuring.

    ERIC Educational Resources Information Center

    Hawkins, Brian L.

    1999-01-01

    Discusses the following challenges institutions must consider as they enter the new marketplace of distributed learning: library access, faculty workload, faculty incentives, faculty-support structures, intellectual property, articulation agreements, financial aid, pricing, cross-subsidization of programs, institutional loyalty and philanthropy,…

  6. Sex Differences and Distributive Fairness.

    ERIC Educational Resources Information Center

    Russ, Terry Lee; Alexander, Sheldon

    In research on equity and justice some investigators have reported that men and women use different allocation norms in distributing rewards; men using an equity rule and women an equality rule, while others conclude that such sex differences in reward allocation appear primarily when the allocator is also a co-recipient of the reward. The present…

  7. Distributed user services for supercomputers

    NASA Technical Reports Server (NTRS)

    Sowizral, Henry A.

    1989-01-01

    User-service operations at supercomputer facilities are examined. The question is whether a single, possibly distributed, user-services organization could be shared by NASA's supercomputer sites in support of a diverse, geographically dispersed, user community. A possible structure for such an organization is identified as well as some of the technologies needed in operating such an organization.

  8. Age distribution of anginose mononucleosis.

    PubMed Central

    Spirer, Z; Holtzman, M; Melamed, I; Shalit, I

    1987-01-01

    The age distribution of anginose infectious mononucleosis in children was analysed retrospectively for the years 1966-85. During that period the disease became significantly more common in children of a young age and less common in older children. This shift could not be attributed either to socioeconomic conditions or to the diagnostic methods used. PMID:3619480

  9. Parallel, Distributed Scripting with Python

    SciTech Connect

    Miller, P J

    2002-05-24

    Parallel computers used to be, for the most part, one-of-a-kind systems which were extremely difficult to program portably. With SMP architectures, the advent of the POSIX thread API and OpenMP gave developers ways to portably exploit on-the-box shared memory parallelism. Since these architectures didn't scale cost-effectively, distributed memory clusters were developed. The associated MPI message passing libraries gave these systems a portable paradigm too. Having programmers effectively use this paradigm is a somewhat different question. Distributed data has to be explicitly transported via the messaging system in order for it to be useful. In high level languages, the MPI library gives access to data distribution routines in C, C++, and FORTRAN. But we need more than that. Many reasonable and common tasks are best done in (or as extensions to) scripting languages. Consider sysadmin tools such as password crackers, file purgers, etc. These are simple to write in a scripting language such as Python (an open source, portable, and freely available interpreter). But these tasks beg to be done in parallel. Consider a password checker that checks an encrypted password against a 25,000-word dictionary. This can take around 10 seconds in Python (6 seconds in C). It is trivial to parallelize if you can distribute the information and coordinate the work.
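
    A minimal version of that dictionary-check example, written with the standard multiprocessing module, might look as follows; a SHA-256 digest stands in for whatever password encryption the original tool used, and the word list and target are made up for illustration.

        # Parallel dictionary check with multiprocessing; SHA-256 is an illustrative
        # stand-in for the actual password encryption scheme.
        import hashlib
        from multiprocessing import Pool

        def matches(args):
            word, target_digest = args
            return word if hashlib.sha256(word.encode()).hexdigest() == target_digest else None

        def crack(dictionary, target_digest, workers=4):
            with Pool(workers) as pool:
                tasks = ((w, target_digest) for w in dictionary)
                for hit in pool.imap_unordered(matches, tasks, chunksize=256):
                    if hit is not None:
                        return hit
            return None

        if __name__ == "__main__":
            words = ["alpha", "bravo", "charlie", "hunter2", "delta"]
            target = hashlib.sha256(b"hunter2").hexdigest()
            print(crack(words, target))  # prints "hunter2"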

  10. Career Information: Marketing and Distribution.

    ERIC Educational Resources Information Center

    American Vocational Association, Inc., Washington, DC.

    The publication is a bibliography prepared in an attempt to assist guidance and distributive education personnel in their task of securing relevant published career information. Depending on overall adequacy, three categories of the National Vocational Guidance Association (NVGA)--highly recommended, recommended, and useful--were used in rating…

  11. Modeling Natural Variation through Distribution

    ERIC Educational Resources Information Center

    Lehrer, Richard; Schauble, Leona

    2004-01-01

    This design study tracks the development of student thinking about natural variation as late elementary grade students learned about distribution in the context of modeling plant growth at the population level. The data-modeling approach assisted children in coordinating their understanding of particular cases with an evolving notion of data as an…

  12. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry; Riedesel, Joel; Myers, Chris; Miller, William; Jones, Ellen F.; Freeman, Kenneth; Walsh, Richard; Walls, Bryan K.; Weeks, David J.; Bechtel, Robert T.

    1992-01-01

    Autonomous power-distribution system includes power-control equipment and automation equipment. System automatically schedules connection of power to loads and reconfigures itself when it detects fault. Potential terrestrial applications include optimization of consumption of power in homes, power supplies for autonomous land vehicles and vessels, and power supplies for automated industrial processes.

  13. Requiring Collaboration or Distributing Leadership?

    ERIC Educational Resources Information Center

    Kennedy, Anne; Deuel, Angie; Nelson, Tamara Holmlund; Slavit, David

    2011-01-01

    Through the process of initiating, implementing, and sustaining a schoolwide professional learning community (PLC), teachers and administrators at the pseudonymous Silver Valley Middle School provide a powerful example of distributed leadership in action. New leadership roles, coordination, and interdependency among staff have led to an increased…

  14. A distributed telerobotics construction set

    NASA Technical Reports Server (NTRS)

    Wise, James D.

    1994-01-01

    During the course of our research on distributed telerobotic systems, we have assembled a collection of generic, reusable software modules and an infrastructure for connecting them to form a variety of telerobotic configurations. This paper describes the structure of this 'Telerobotics Construction Set' and lists some of the components which comprise it.

  15. Distributed Leadership in Educational Institutions

    ERIC Educational Resources Information Center

    Göksoy, Süleyman

    2015-01-01

    In recent years, many studies have been conducted on the shared leadership process. The distributed leadership (DL) approach addresses leadership along with teams, groups and organizational characteristics. In practice, this approach objects to the supposition that an individual should take the lead in order to ensure change. Proponents of this idea claim that…

  16. Prior Distributions on Symmetric Groups

    ERIC Educational Resources Information Center

    Gupta, Jayanti; Damien, Paul

    2005-01-01

    Fully and partially ranked data arise in a variety of contexts. From a Bayesian perspective, attention has focused on distance-based models; in particular, the Mallows model and extensions thereof. In this paper, a class of prior distributions, the "Binary Tree," is developed on the symmetric group. The attractive features of the class are: it…

  17. Educational Micropolitics and Distributed Leadership

    ERIC Educational Resources Information Center

    Flessa, Joseph

    2009-01-01

    This article critically reviews two bodies of literature that potentially share common concerns, yet rarely overlap: distributed leadership and educational micropolitics. Alternative explanations for the split between these two analytical approaches to school organization are explored in sections on problem framing, methodology, and the…

  18. Tools for distributed application management

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Wood, Mark; Cooper, Robert; Birman, Kenneth P.

    1990-01-01

    Distributed application management consists of monitoring and controlling an application as it executes in a distributed environment. It encompasses such activities as configuration, initialization, performance monitoring, resource scheduling, and failure response. The Meta system is described: a collection of tools for constructing distributed application management software. Meta provides the mechanism, while the programmer specifies the policy for application management. The policy is manifested as a control program which is a soft real time reactive program. The underlying application is instrumented with a variety of built-in and user defined sensors and actuators. These define the interface between the control program and the application. The control program also has access to a database describing the structure of the application and the characteristics of its environment. Some of the more difficult problems for application management occur when pre-existing, nondistributed programs are integrated into a distributed application for which they may not have been intended. Meta allows management functions to be retrofitted to such programs with a minimum of effort.

  19. Tools for distributed application management

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Cooper, Robert; Wood, Mark; Birman, Kenneth P.

    1990-01-01

    Distributed application management consists of monitoring and controlling an application as it executes in a distributed environment. It encompasses such activities as configuration, initialization, performance monitoring, resource scheduling, and failure response. The Meta system (a collection of tools for constructing distributed application management software) is described. Meta provides the mechanism, while the programmer specifies the policy for application management. The policy is manifested as a control program which is a soft real-time reactive program. The underlying application is instrumented with a variety of built-in and user-defined sensors and actuators. These define the interface between the control program and the application. The control program also has access to a database describing the structure of the application and the characteristics of its environment. Some of the more difficult problems for application management occur when preexisting, nondistributed programs are integrated into a distributed application for which they may not have been intended. Meta allows management functions to be retrofitted to such programs with a minimum of effort.

  20. Cooperative distributed architecture for mashups

    NASA Astrophysics Data System (ADS)

    Al-Haj Hassan, Osama Mohammad; Ramaswamy, Lakshmish; Hamad, Fadi; Abu Taleb, Anas

    2014-05-01

    Since the advent of Web 2.0, personalised applications such as mashups have become widely popular. Mashups enable end-users to fetch data from distributed data sources, and refine it based on their personal needs. This high degree of personalisation that mashups offer comes at the expense of performance and scalability. These scalability challenges are exacerbated by the centralised architectures of current mashup platforms. In this paper, we address the performance and scalability issues by designing CoMaP - a distributed mashup platform. CoMaP's architecture comprises several cooperative mashup processing nodes distributed over the Internet upon which mashups can, fully or partially, be executed. CoMaP incorporates a dynamic and efficient scheme for deploying mashups on the processing nodes. Our scheme considers a number of parameters such as variations in link delays and bandwidths, and loads on mashup processing nodes. CoMaP includes effective and low-cost mechanisms for balancing loads on the processing nodes as well as for handling node failures. Furthermore, we propose novel techniques that leverage keyword synonyms, ontologies and caching to enhance end-user experience. This paper reports several experiments to comprehensively study CoMaP's performance. The results demonstrate CoMaP's benefits as a scalable distributed mashup platform.

  1. Entropy content of multiplicity distributions

    NASA Astrophysics Data System (ADS)

    Lam, C. S.

    1989-02-01

    We argue that the entropy S is an important variable to consider for multiparticle productions. A prediction of the width parameter 1/k of multiplicity distributions can be made at superhigh energies by extrapolating the entropy variable S, considered as a function of the average multiplicity N̄. This is done explicitly for the negative binomial distributions and the Furry-Yule distributions, though the method is applicable to other distributions. We also argue that direct extrapolation in the variable 1/k is not advisable. Further evidence for SSZ scaling is given, and a power law for the average multiplicity N̄ as a function of the collision energy √s is derived. It is a pleasure to thank Rudy Hwa and David Kiang for discussions and help. I am also grateful to Dr. V. Šimák and Dr. M. Šumbera for a correspondence pointing out a numerical error in the earlier version of this work. This work is supported in part by the Natural Sciences and Engineering Research Council of Canada and the Québec Department of Education.

  2. Distributive Education on the Upswing

    ERIC Educational Resources Information Center

    Brown, T. Carl

    1976-01-01

    This fragment of vocational education history reviews the early struggles of distributive education (DE) to win recognition from Congress, the public, and the retail industry, and goes on to describe the breakthrough that came with the formation of DE's National Management Advisory Council. (Editor/HD)

  3. Decoy State Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Lo, Hoi-Kwong

    2005-10-01

    Quantum key distribution (QKD) allows two parties to communicate in absolute security based on the fundamental laws of physics. Up till now, it is widely believed that unconditionally secure QKD based on the standard Bennett-Brassard (BB84) protocol is limited in both key generation rate and distance because of imperfect devices. Here, we solve these two problems directly by presenting new protocols that are feasible with only current technology. Surprisingly, our new protocols can make fiber-based QKD unconditionally secure at distances over 100 km (for some experiments, such as GYS) and increase the key generation rate from O(η²) in prior art to O(η), where η is the overall transmittance. Our method is to develop the decoy state idea (first proposed by W.-Y. Hwang in "Quantum Key Distribution with High Loss: Toward Global Secure Communication", Phys. Rev. Lett. 91, 057901 (2003)) and consider simple extensions of the BB84 protocol. This part of the work is published in "Decoy State Quantum Key Distribution". We present a general theory of the decoy state protocol and propose a decoy method based on only one signal state and two decoy states. We perform optimization on the choice of intensities of the signal state and the two decoy states. Our result shows that a decoy state protocol with only two types of decoy states--a vacuum and a weak decoy state--asymptotically approaches the theoretical limit of the most general type of decoy state protocols (with an infinite number of decoy states). We also present a one-decoy-state protocol as a special case of the Vacuum+Weak decoy method. Moreover, we provide estimations on the effects of statistical fluctuations and suggest that, even for long distance (larger than 100 km) QKD, our two-decoy-state protocol can be implemented with only a few hours of experimental data. In conclusion, decoy state quantum key distribution is highly practical. This part of work is
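
    For reference, decoy-state analyses of BB84 usually bound the key generation rate with a GLLP-style expression of the form (quoted from the general literature rather than from this specific abstract)

        R \ge q\left\{\,Q_1\bigl[1 - H_2(e_1)\bigr] - Q_\mu\, f(E_\mu)\, H_2(E_\mu)\right\},

    where Q_\mu and E_\mu are the gain and error rate of the signal state, Q_1 and e_1 are the gain and error rate of the single-photon contributions (bounded using the decoy states), f is the error-correction inefficiency, H_2 the binary entropy function, and q a protocol-dependent sifting factor. The improvement from O(η²) to O(η) comes from the tight decoy-state bounds on Q_1 and e_1.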

  4. The neutron star mass distribution

    SciTech Connect

    Kiziltan, Bülent; Kottas, Athanasios; De Yoreo, Maria; Thorsett, Stephen E.

    2013-11-20

    In recent years, the number of pulsars with secure mass measurements has increased to a level that allows us to probe the underlying neutron star (NS) mass distribution in detail. We critically review the radio pulsar mass measurements. For the first time, we are able to analyze a sizable population of NSs with a flexible modeling approach that can effectively accommodate a skewed underlying distribution and asymmetric measurement errors. We find that NSs that have evolved through different evolutionary paths reflect distinctive signatures through dissimilar distribution peak and mass cutoff values. NSs in double NS and NS-white dwarf (WD) systems show consistent respective peaks at 1.33 M_⊙ and 1.55 M_⊙, suggesting significant mass accretion (Δm ≈ 0.22 M_⊙) has occurred during the spin-up phase. The width of the mass distribution implied by double NS systems is indicative of a tight initial mass function while the inferred mass range is significantly wider for NSs that have gone through recycling. We find a mass cutoff at ∼2.1 M_⊙ for NSs with WD companions, which establishes a firm lower bound for the maximum NS mass. This rules out the majority of strange quark and soft equation of state models as viable configurations for NS matter. The lack of truncation close to the maximum mass cutoff along with the skewed nature of the inferred mass distribution both enforce the suggestion that the 2.1 M_⊙ limit is set by evolutionary constraints rather than nuclear physics or general relativity, and the existence of rare supermassive NSs is possible.

  5. Gaussian Velocity Distributions in Avalanches

    NASA Astrophysics Data System (ADS)

    Shattuck, Mark

    2004-03-01

    Imagine a world where gravity is so strong that if an ice cube is tilted the shear forces melt the surface and water avalanches down. Further imagine that the ambient temperature is so low that the water re-freezes almost immediately. This is the world of granular flows. As a granular solid is tilted the surface undergoes a sublimation phase transition and a granular gas avalanches down the surface, but the inelastic collisions rapidly remove energy from the flow lowering the granular temperature (kinetic energy per particle) until the gas solidifies again. It is under these extreme conditions that we attempt to uncover continuum granular flow properties. Typical continuum theories like Navier-Stokes equation for fluids follow the space-time evolution of the first few moments of the velocity distribution. We study continuously avalanching flow in a rotating two-dimensional granular drum using high-speed video imaging and extract the position and velocities of the particles. We find a universal near Gaussian velocity distribution throughout the flowing regions, which are characterized by a liquid-like radial distribution function. In the remaining regions, in which the radial distribution function develops sharp crystalline peaks, the velocity distribution has a Gaussian peak but is much broader in the tails. In a companion experiment on a vibrated two-dimensional granular fluid under constant pressure, we find a clear gas-solid phase transition in which both the temperature and density change discontinuously. This suggests that a low temperature crystal and a high temperature gas can coexist in steady state. This coexistence could result in a narrower, cooler, Gaussian peak and a broader, warmer, Gaussian tail like the non-Gaussian behavior seen in the crystalline portions of the rotating drum.

  6. Distributed intelligence in an astronomical Distributed Sensor Network

    NASA Astrophysics Data System (ADS)

    White, R. R.; Davis, H.; Vestrand, W. T.; Wozniak, P. R.

    2008-03-01

    The Telescope Alert Operations Network System (TALONS) was designed and developed in the year 2000, around the architectural principles of a distributed sensor network. This network supported the original Rapid Telescopes for Optical Response (RAPTOR) project goals; however, only with further development could TALONS meet the goals of the larger Thinking Telescope Project. The complex objectives of the Thinking Telescope project required a paradigm shift in the software architecture - the centralised intelligence merged into the TALONS network operations could no longer meet all of the new requirements. The intelligence needed to be divorced from the network operations and developed as a series of peripheral intelligent agents, distributing the decision making and analytical processes based on the temporal volatility of the data. This paper is presented as only one part of the poster from the workshop and in it we will explore the details of this architecture and how that merges with the current Thinking Telescope system to meet our project goals.

  7. Mixture of Skewed α-Stable Distributions

    NASA Astrophysics Data System (ADS)

    Shojaei, S. R. Hosseini; Nassiri, V.; Mohammadian, Gh. R.; Mohammadpour, A.

    2011-03-01

    The expectation maximization (EM) algorithm and Bayesian techniques are two approaches for statistical inference of mixture models [3, 4]. By noting the advantages of the Bayesian methods, practitioners prefer them. However, implementing Markov chain Monte Carlo algorithms can be very complicated for stable distributions, due to the non-analytic density or distribution function formulas. In this paper, we introduce a new class of mixture of heavy-tailed distributions, called the mixture of skewed stable distributions. Skewed stable distributions belong to the exponential family and have an analytic density representation. It is shown that skewed stable distributions dominate skew stable distribution functions and can be used to model heavy-tailed data. The class of skewed stable distributions has an analytic representation for its density function and Bayesian inference can be done similarly to the exponential family of distributions. Finally, mixtures of skewed stable distributions are compared to mixtures of stable distributions through a simulation study.

  8. 26 CFR 1.643(a)-0 - Distributable net income; deduction for distributions; in general.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... § 1.643(a)-0 Distributable net income; deduction for distributions; in general. The term distributable... character of distributions to the beneficiaries. Distributable net income means for any taxable year, the... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Distributable net income; deduction...

  9. Distribution Integrity Management Plan (DIMP)

    SciTech Connect

    Gonzales, Jerome F.

    2012-05-07

    This document is the distribution integrity management plan (Plan) for the Los Alamos National Laboratory (LANL) Natural Gas Distribution System. This Plan meets the requirements of 49 CFR Part 192, Subpart P Distribution Integrity Management Programs (DIMP) for the LANL Natural Gas Distribution System. This Plan was developed by reviewing records and interviewing LANL personnel. The records consist of the design, construction, operation and maintenance for the LANL Natural Gas Distribution System. The records system for the LANL Natural Gas Distribution System is limited, so the majority of information is based on the judgment of LANL employees; the maintenance crew, the Corrosion Specialist and the Utilities and Infrastructure (UI) Civil Team Leader. The records used in this report are: Pipeline and Hazardous Materials Safety Administration (PHMSA) 7100.1-1, Report of Main and Service Line Inspection, Natural Gas Leak Survey, Gas Leak Response Report, Gas Leak and Repair Report, and Pipe-to-Soil Recordings. The specific elements of knowledge of the infrastructure used to evaluate each threat and prioritize risks are listed in Sections 6 and 7, Threat Evaluation and Risk Prioritization respectively. This Plan addresses additional information needed and a method for gaining that data over time through normal activities. The processes used for the initial assessment of Threat Evaluation and Risk Prioritization are the methods found in the Simple, Handy Risk-based Integrity Management Plan (SHRIMP™) software package developed by the American Public Gas Association (APGA) Security and Integrity Foundation (SIF). SHRIMP™ uses an index model developed by the consultants and advisors of the SIF. Threat assessment is performed using questions developed by the Gas Piping Technology Committee (GPTC) as modified and added to by the SHRIMP™ advisors. This Plan is required to be reviewed every 5 years to be continually refined and improved. Records

  10. Enhanced distributed energy resource system

    DOEpatents

    Atcitty, Stanley; Clark, Nancy H.; Boyes, John D.; Ranade, Satishkumar J.

    2007-07-03

    A power transmission system including a direct current power source electrically connected to a conversion device for converting direct current into alternating current, a conversion device connected to a power distribution system through a junction, an energy storage device capable of producing direct current connected to a converter, where the converter, such as an insulated gate bipolar transistor, converts direct current from an energy storage device into alternating current and supplies the current to the junction and subsequently to the power distribution system. A microprocessor controller, connected to a sampling and feedback module and the converter, determines when the current load is higher than a set threshold value, requiring triggering of the converter to supply supplemental current to the power transmission system.

  11. Hazards Data Distribution System (HDDS)

    USGS Publications Warehouse

    Jones, Brenda; Lamb, Rynn M.

    2015-07-09

    When emergencies occur, first responders and disaster response teams often need rapid access to aerial photography and satellite imagery that is acquired before and after the event. The U.S. Geological Survey (USGS) Hazards Data Distribution System (HDDS) provides quick and easy access to pre- and post-event imagery and geospatial datasets that support emergency response and recovery operations. The HDDS provides a single, consolidated point-of-entry and distribution system for USGS-hosted remotely sensed imagery and other geospatial datasets related to an event response. The data delivery services are provided through an interactive map-based interface that allows emergency response personnel to rapidly select and download pre-event ("baseline") and post-event emergency response imagery.

  12. National Radiobiology Archives Distributed Access

    SciTech Connect

    Smith, S. K.; Prather, J. C.; Ligotke, E. K.; Watson, C. R.

    1992-06-01

    NRADA1.1 is a supplement to NRADA1.0. This version eliminates several bugs, and includes a few new features. The diskettes consist of a distributed subset of information representative of the extensive NRA databases and database access software maintained at the Pacific Northwest Laboratory which provide an introduction to the scope and style of the NRA Information Systems. Information in the NRA Summary, Inventory, and Bibliographic database is available upon request. Printed reports have been provided in the past. The completion of the NRADA1.1 is the realization of a long standing goal of the staff and advisory committee. Information may be easily distributed to the user in an electronic form which preserves the relationships between the various databases.

  13. Probability distributions for multimeric systems.

    PubMed

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising the sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  14. Estimators for the Cauchy distribution

    SciTech Connect

    Hanson, K.M.; Wolf, D.R.

    1993-12-31

    We discuss the properties of various estimators of the central position of the Cauchy distribution. The performance of these estimators is evaluated for a set of simulated experiments. Estimators based on the maximum and mean of the posterior probability density function are empirically found to be well behaved when more than two measurements are available. On the contrary, because of the infinite variance of the Cauchy distribution, the average of the measured positions is an extremely poor estimator of the location of the source. However, the median of the measured positions is well behaved. The rms errors for the various estimators are compared to the Fisher-Cramer-Rao lower bound. We find that the square root of the variance of the posterior density function is predictive of the rms error in the mean posterior estimator.
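
    A small simulation in the spirit of this comparison (with an arbitrary true location, known scale, and a flat prior evaluated on a grid; not the authors' exact setup) makes the point concrete: the sample mean has an enormous rms error, while the median and the posterior mean remain well behaved.

        # Compare location estimators for Cauchy data: sample mean, sample median,
        # and the posterior mean under a flat prior. All settings are illustrative.
        import numpy as np

        rng = np.random.default_rng(42)
        true_loc, scale, n_meas, n_trials = 0.0, 1.0, 5, 2000
        grid = np.linspace(-20, 20, 4001)

        errs = {"mean": [], "median": [], "posterior mean": []}
        for _ in range(n_trials):
            x = true_loc + scale * rng.standard_cauchy(n_meas)
            errs["mean"].append(np.mean(x) - true_loc)
            errs["median"].append(np.median(x) - true_loc)
            # Posterior over the location on a grid, flat prior, known scale.
            log_like = -np.log1p(((x[:, None] - grid[None, :]) / scale) ** 2).sum(axis=0)
            post = np.exp(log_like - log_like.max())
            post /= post.sum()
            errs["posterior mean"].append(np.sum(grid * post) - true_loc)

        for name, e in errs.items():
            print(f"{name:15s} rms error = {np.sqrt(np.mean(np.square(e))):.2f}")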

  15. Antenna structure with distributed strip

    DOEpatents

    Rodenbeck, Christopher T.

    2008-10-21

    An antenna comprises electrical conductors arranged to form a radiating element including a folded line configuration and a distributed strip configuration, where the radiating element is in proximity to a ground conductor. The folded line and the distributed strip can be electrically interconnected and substantially coplanar. The ground conductor can be spaced from, and coplanar to, the radiating element, or can alternatively lie in a plane set at an angle to the radiating element. Embodiments of the antenna include conductor patterns formed on a printed wiring board, having a ground plane, spacedly adjacent to and coplanar with the radiating element. Other embodiments of the antenna comprise a ground plane and radiating element on opposed sides of a printed wiring board. Other embodiments of the antenna comprise conductors that can be arranged as free standing "foils". Other embodiments include antennas that are encapsulated into a package containing the antenna.

  16. Antenna structure with distributed strip

    DOEpatents

    Rodenbeck, Christopher T.

    2008-03-18

    An antenna comprises electrical conductors arranged to form a radiating element including a folded line configuration and a distributed strip configuration, where the radiating element is in proximity to a ground conductor. The folded line and the distributed strip can be electrically interconnected and substantially coplanar. The ground conductor can be spaced from, and coplanar to, the radiating element, or can alternatively lie in a plane set at an angle to the radiating element. Embodiments of the antenna include conductor patterns formed on a printed wiring board, having a ground plane, spacedly adjacent to and coplanar with the radiating element. Other embodiments of the antenna comprise a ground plane and radiating element on opposed sides of a printed wiring board. Other embodiments of the antenna comprise conductors that can be arranged as free standing "foils". Other embodiments include antennas that are encapsulated into a package containing the antenna.

  17. National Radiobiology Archives Distributed Access

    1992-06-01

    NRADA1.1 is a supplement to NRADA1.0. This version eliminates several bugs, and includes a few new features. The diskettes consist of a distributed subset of information representative of the extensive NRA databases and database access software maintained at the Pacific Northwest Laboratory which provide an introduction to the scope and style of the NRA Information Systems. Information in the NRA Summary, Inventory, and Bibliographic database is available upon request. Printed reports have been provided in the past. The completion of the NRADA1.1 is the realization of a long standing goal of the staff and advisory committee. Information may be easily distributed to the user in an electronic form which preserves the relationships between the various databases.

  18. Digitally controlled distributed phase shifter

    SciTech Connect

    Hietala, V.M.; Kravitz, S.H.; Vawter, G.A.

    1992-12-31

    A digitally controlled distributed phase shifter comprises N phase shifters. Digital control is achieved by using N binary length-weighted electrodes located on the top surface of a waveguide. A control terminal is attached to each electrode, thereby allowing the application of a control signal. The control signal is either one of two discrete bias voltages. The application of a discrete bias voltage changes the modal index of the portion of the waveguide that corresponds to the length of the electrode to which the bias voltage is applied, thereby causing the phase to change through the underlying portion of the waveguide. The digitally controlled distributed phase shift network has a total phase shift equal to the sum of the phase shifts contributed by the individual phase shifters.
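
    The binary length-weighting amounts to simple arithmetic: with N electrode sections whose lengths scale as 2^k, an N-bit control word selects a total phase in steps of the least-significant increment. The sketch below uses an assumed per-bit phase increment purely for illustration.

        # Binary length-weighted phase shifter: total phase from an N-bit control word.
        # The least-significant-bit phase increment is an assumed illustrative value.
        def total_phase_deg(control_word, n_bits=4, lsb_phase_deg=22.5):
            """Sum the contributions of the biased (bit = 1) electrode sections."""
            phase = 0.0
            for k in range(n_bits):
                if (control_word >> k) & 1:
                    phase += lsb_phase_deg * (2 ** k)  # section k is 2^k unit lengths long
            return phase

        for word in range(16):
            print(f"{word:04b} -> {total_phase_deg(word):6.1f} deg")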

  19. Distributed antenna system and method

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Dobbins, Justin A. (Inventor)

    2004-01-01

    System and methods are disclosed for employing one or more radiators having non-unique phase centers mounted to a body with respect to a plurality of transmitters to determine location characteristics of the body such as the position and/or attitude of the body. The one or more radiators may consist of a single, continuous element or of two or more discrete radiation elements whose received signals are combined. In a preferred embodiment, the location characteristics are determined using carrier phase measurements whereby phase center information may be determined or estimated. A distributed antenna having a wide angle view may be mounted to a moveable body in accord with the present invention. The distributed antenna may be utilized for maintaining signal contact with multiple spaced apart transmitters, such as a GPS constellation, as the body rotates without the need for RF switches to thereby provide continuous attitude and position determination of the body.

  20. Chemical elements distribution in cells

    NASA Astrophysics Data System (ADS)

    Ortega, R.

    2005-04-01

    Analysing, imaging and understanding the cellular chemistry, from macromolecules to monoatomic elements, is probably a major challenge for the scientific community after the conclusion of the genome project. In order to probe the distribution of elements in cells, especially the so-called inorganic elements, it is necessary to apply microanalytical techniques with sub-micrometer resolution and high chemical sensitivity. This paper presents the current status of chemical element imaging inside cells, and a comparison of the different analytical techniques available: nuclear microprobe, electron microprobe and electron energy loss spectroscopy, synchrotron radiation microprobe, secondary ion mass spectrometry and fluorescence microscopy methods. Examples of intracellular chemical elements distributions relevant to cancer pharmacology, medical imaging, metal carcinogenesis and neuropathology studies obtained by nuclear microprobe and other microanalytical techniques are presented.

  1. Secure key storage and distribution

    DOEpatents

    Agrawal, Punit

    2015-06-02

    This disclosure describes a distributed, fault-tolerant security system that enables the secure storage and distribution of private keys. In one implementation, the security system includes a plurality of computing resources that independently store private keys provided by publishers and encrypted using a single security system public key. To protect against malicious activity, the security system private key necessary to decrypt the publication private keys is not stored at any of the computing resources. Rather, portions, or shares, of the security system private key are stored at each of the computing resources within the security system, and multiple security systems must communicate and share partial decryptions in order to decrypt the stored private key.
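
    The share-and-reconstruct idea can be illustrated with a toy Shamir threshold scheme over a prime field; this is a generic sketch of k-of-n secret splitting, not the disclosed system's actual construction, and the prime, threshold, and share count are arbitrary.

        # Toy Shamir (k-of-n) secret sharing over a prime field; illustrative only.
        import random

        P = 2**127 - 1  # a Mersenne prime used as the toy field modulus

        def split(secret, n=5, k=3, rng=random.Random(7)):
            """Create n shares such that any k of them reconstruct the secret."""
            coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
            poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
            return [(x, poly(x)) for x in range(1, n + 1)]

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 over the prime field."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = (num * -xj) % P
                        den = (den * (xi - xj)) % P
                secret = (secret + yi * num * pow(den, -1, P)) % P
            return secret

        shares = split(123456789)
        print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 123456789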

  2. Parameterizing the Raindrop Size Distribution

    NASA Technical Reports Server (NTRS)

    Haddad, Ziad S.; Durden, Stephen L.; Im, Eastwood

    1996-01-01

    This paper addresses the problem of finding a parametric form for the raindrop size distribution (DSD) that (1) is an appropriate model for tropical rainfall, and (2) involves statistically independent parameters. Such a parameterization is derived in this paper. One of the resulting three "canonical" parameters turns out to vary relatively little, thus making the parameterization particularly useful for remote sensing applications. In fact, a new set of drop-size-distribution-based Z-R and k-R relations is obtained. Only slightly more complex than power laws, they are very good approximations to the exact radar relations one would obtain using Mie scattering. The coefficients of the new relations are directly related to the shape parameters of the particular DSD that one starts with. Perhaps most important, since the coefficients are independent of the rain rate itself, the relations are ideally suited for rain retrieval algorithms.
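
    For orientation, drop size distributions are conventionally written in a gamma form such as

        N(D) = N_0\, D^{\mu}\, e^{-\Lambda D},

    with intercept N_0, shape \mu and slope \Lambda; the reparameterization described above amounts to replacing (N_0, \mu, \Lambda) with three "canonical" combinations chosen to be statistically independent. The gamma form quoted here is the standard convention and is given only as context, not necessarily the exact starting point of the paper.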

  3. Distribution-free discriminant analysis

    SciTech Connect

    Burr, T.; Doak, J.

    1997-05-01

    This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.
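
    The report does not spell out the classifier here; a k-nearest-neighbour rule is one standard distribution-free discriminant, sketched below with synthetic data (all parameters illustrative).

        import numpy as np

        def knn_classify(train_x, train_y, query, k=5):
            """Non-parametric classification: majority vote among the k nearest training points."""
            distances = np.linalg.norm(train_x - query, axis=1)
            nearest = train_y[np.argsort(distances)[:k]]
            labels, counts = np.unique(nearest, return_counts=True)
            return labels[np.argmax(counts)]

        rng = np.random.default_rng(0)
        x = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(3.0, 1.0, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)
        print(knn_classify(x, y, np.array([2.5, 2.5])))   # expected to fall in class 1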

  4. SAMICS marketing and distribution model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    SAMICS (Solar Array Manufacturing Industry Costing Standards) was formulated as a computer simulation model. Given a proper description of the manufacturing technology as input, this model computes the manufacturing price of solar arrays for a broad range of production levels. This report presents a model for computing the marketing and distribution costs, the end point of the model being the loading dock of the final manufacturer.

  5. Thermophoretic depletion follows Boltzmann distribution.

    PubMed

    Duhr, Stefan; Braun, Dieter

    2006-04-28

    Thermophoresis, also termed thermal diffusion or the Soret effect, moves particles along temperature gradients. For particles in liquids, the effect lacks a theoretical explanation. We present experimental results at moderate thermal gradients: (i) Thermophoretic depletion of 200 nm polystyrene spheres in water follows an exponential distribution over 2 orders of magnitude in concentration; (ii) Soret coefficients scale linearly with the sphere's surface area. Based on the experiments, it is argued that local thermodynamic equilibrium is a good starting point to describe thermophoresis.
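
    The exponential depletion reported here is equivalent to a Boltzmann-like law c/c0 = exp(-S_T * dT); a short numerical sketch, with a purely illustrative Soret coefficient, is:

        import numpy as np

        def depletion_ratio(soret_coeff, delta_t):
            """Steady-state concentration ratio c/c0 = exp(-S_T * dT) for Soret coefficient S_T (1/K)."""
            return np.exp(-soret_coeff * np.asarray(delta_t))

        # With S_T of a few 1/K, a few kelvin of temperature difference depletes
        # the spheres by orders of magnitude, as in the experiments.
        print(depletion_ratio(soret_coeff=2.0, delta_t=[0.0, 1.0, 2.0, 3.0]))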

  6. The data distribution satellite system

    NASA Technical Reports Server (NTRS)

    Bruno, Ronald C.; Weinberg, Aaron

    1991-01-01

    The Data Distribution Satellite (DDS) will be capable of providing the space research community with inexpensive and easy access to space payloads and space data. Furthermore, the DDS is shown to be a natural outgrowth of advances and evolution in both NASA's Space Network and commercial satellite communications. The roadmap and timescale for this evolution are described along with key demonstrations, proof-of-concept models, and required technology development that will support the projected system evolution toward the DDS.

  7. 30 CFR 57.12006 - Distribution boxes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Distribution boxes. 57.12006 Section 57.12006... and Underground § 57.12006 Distribution boxes. Distribution boxes shall be provided with a... deenergized, and the distribution box shall be labeled to show which circuit each device controls....

  8. 30 CFR 57.12006 - Distribution boxes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Distribution boxes. 57.12006 Section 57.12006... and Underground § 57.12006 Distribution boxes. Distribution boxes shall be provided with a... deenergized, and the distribution box shall be labeled to show which circuit each device controls....

  9. 30 CFR 57.12006 - Distribution boxes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Distribution boxes. 57.12006 Section 57.12006... and Underground § 57.12006 Distribution boxes. Distribution boxes shall be provided with a... deenergized, and the distribution box shall be labeled to show which circuit each device controls....

  10. 14 CFR 25.1355 - Distribution system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Equipment Electrical Systems and Equipment § 25.1355 Distribution system. (a) The distribution system includes the distribution busses, their associated feeders... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Distribution system. 25.1355 Section...

  11. 14 CFR 29.1355 - Distribution system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Equipment Electrical Systems and Equipment § 29.1355 Distribution system. (a) The distribution system includes the distribution busses, their associated feeders... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Distribution system. 29.1355 Section...

  12. 14 CFR 29.1355 - Distribution system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Equipment Electrical Systems and Equipment § 29.1355 Distribution system. (a) The distribution system includes the distribution busses, their associated feeders... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Distribution system. 29.1355 Section...

  13. 14 CFR 25.1355 - Distribution system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Equipment Electrical Systems and Equipment § 25.1355 Distribution system. (a) The distribution system includes the distribution busses, their associated feeders... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Distribution system. 25.1355 Section...

  14. World-wide distribution automation systems

    SciTech Connect

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems.

  15. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    A number of topics related to building a generalized distributed system model are discussed. The effects of distributed database modeling on evaluation of transaction rollbacks, the measurement of effects of distributed database models on transaction availability measures, and a performance analysis of static locking in replicated distributed database systems are covered.

  16. Statistical Physics for Adaptive Distributed Control

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A viewgraph presentation on statistical physics for distributed adaptive control is shown. The topics include: 1) The Golden Rule; 2) Advantages; 3) Roadmap; 4) What is Distributed Control? 5) Review of Information Theory; 6) Iterative Distributed Control; 7) Minimizing L(q) Via Gradient Descent; and 8) Adaptive Distributed Control.

  17. Redshifts distribution in A262

    NASA Astrophysics Data System (ADS)

    Hassan, M. S. R.; Abidin, Z. Z.; Ibrahim, U. F. S. U.; Hashim, N.; Lee, D. A. A.

    2016-05-01

    Galaxy clusters are the largest virialized systems in the Universe, containing a collection of galaxies at different redshifts. The redshift distribution of galaxies in a cluster is concentrated within a certain redshift range, which indicates that only galaxies within a certain radial range belong to the cluster; this leads to an estimate of the cluster boundary. Background and foreground systems appear in the redshift histogram, which determines whether individual galaxies are too distant, or too high in redshift, to be counted as members of the cluster. With the recent advances in multifibre spectroscopy, it has become possible to perform detailed analyses of the redshift distribution of several galaxy clusters in the Abell Catalogue. This has given rise to significantly improved estimates of cluster membership, extent and dynamical history. Here we present a spectroscopic analysis of the galaxy cluster A262. We find that 55 galaxies fall between z = 0.0143 and 0.0183, with a velocity range of 4450-5300 km s-1, and are therefore members of the cluster. We derive a new mean redshift of z = 0.016 173 ± 0.000 074 (4852 ± 22 km s-1) for the system, which we compare with our neutral hydrogen (H I) detection, which peaks at 4970 ± 0.5 km s-1. We find that the H I distribution tends to be located at the edge of the cluster, since most of the spiral-rich galaxies lie away from the cluster centre.

  18. Distributed Relaxation for Conservative Discretizations

    NASA Technical Reports Server (NTRS)

    Diskin, Boris; Thomas, James L.

    2001-01-01

    A multigrid method is defined as having textbook multigrid efficiency (TME) if the solutions to the governing system of equations are attained in a computational work that is a small (less than 10) multiple of the operation count in one target-grid residual evaluation. The way to achieve this efficiency is the distributed relaxation approach. TME solvers employing distributed relaxation have already been demonstrated for nonconservative formulations of high-Reynolds-number viscous incompressible and subsonic compressible flow regimes. The purpose of this paper is to provide foundations for applications of distributed relaxation to conservative discretizations. A direct correspondence between the primitive variable interpolations for calculating fluxes in conservative finite-volume discretizations and stencils of the discretized derivatives in the nonconservative formulation has been established. Based on this correspondence, one can arrive at a conservative discretization which is very efficiently solved with a nonconservative relaxation scheme and this is demonstrated for conservative discretization of the quasi one-dimensional Euler equations. Formulations for both staggered and collocated grid arrangements are considered and extensions of the general procedure to multiple dimensions are discussed.

  19. Infrastructure for distributed enterprise simulation

    SciTech Connect

    Johnson, M.M.; Yoshimura, A.S.; Goldsby, M.E.

    1998-01-01

    Traditional discrete-event simulations employ an inherently sequential algorithm and are run on a single computer. However, the demands of many real-world problems exceed the capabilities of sequential simulation systems. Often the capacity of a computer's primary memory limits the size of the models that can be handled, and in some cases parallel execution on multiple processors could significantly reduce the simulation time. This paper describes the development of an Infrastructure for Distributed Enterprise Simulation (IDES) - a large-scale portable parallel simulation framework developed to support Sandia National Laboratories' mission in stockpile stewardship. IDES is based on the Breathing-Time-Buckets synchronization protocol, and maps a message-based model of distributed computing onto an object-oriented programming model. IDES is portable across heterogeneous computing architectures, including single-processor systems, networks of workstations and multi-processor computers with shared or distributed memory. The system provides a simple and sufficient application programming interface that can be used by scientists to quickly model large-scale, complex enterprise systems. In the background and without involving the user, IDES is capable of making dynamic use of idle processing power available throughout the enterprise network. 16 refs., 14 figs.

  20. Distributed Virtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1993-01-01

    As outlined in the continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC 2-539, the investigators are developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; developing communications routines that support the abstractions implemented; continuing the development of file and information systems based on the Virtual System Model; and incorporating appropriate security measures to allow the mechanisms developed to be used on an open network. The goal throughout the work is to provide a uniform model that can be applied to both parallel and distributed systems. The authors believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. The work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  1. Distributed Virtual System (DIVIRS) Project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1993-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on contract NCC 2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to program parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the virtual system model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  2. DIstributed VIRtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, Clifford B.

    1995-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the Virtual System Model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  3. DIstributed VIRtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1994-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the Virtual System Model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  4. Energy conservation in electric distribution

    SciTech Connect

    Lee, Chong-Jin

    1994-12-31

    This paper discusses the potential for energy and power savings that exist in electric power delivery systems. These savings translate into significant financial and environmental benefits for electricity producers and consumers as well as for society in general. AlliedSignal's knowledge and perspectives on this topic are the result of discussions with hundreds of utility executives, government officials and other industry experts over the past decade in conjunction with marketing our Amorphous Metal technology for electric distribution transformers. Amorphous metal is a technology developed by AlliedSignal that significantly reduces the energy lost in electric distribution transformers at an incremental cost of just a few cents per kilowatt-hour. The purpose of this paper is to discuss: Amorphous Metal Alloy Technology; Energy Savings Opportunity; The Industrial Barriers and Remedies; Worldwide Demand; and A Low Risk Strategy. I hope this presentation will help KEPCO achieve its stated aims of ensuring sound development of the national economy and enhancement of public life through the economic and stable supply of electric power. AlliedSignal Korea Ltd., in conjunction with AlliedSignal Amorphous Metals in the U.S., is here to work with KEPCO, transformer manufacturers, industry, and government agencies to achieve greater efficiency in power distribution.

  5. Preliminary Iron Distribution on Vesta

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, David W.

    2013-01-01

    The distribution of iron on the surface of the asteroid Vesta was investigated using Dawn's Gamma Ray and Neutron Detector (GRaND) [1,2]. Iron varies predictably with rock type for the howardite, eucrite, and diogenite (HED) meteorites, thought to be representative of Vesta. The abundance of Fe in howardites ranges from about 12 to 15 wt.%. Basaltic eucrites have the highest abundance, whereas, lower crustal and upper mantle materials (cumulate eucrites and diogenites) have the lowest, and howardites are intermediate [3]. We have completed a mapping study of 7.6 MeV gamma rays produced by neutron capture by Fe as measured by the bismuth germanate (BGO) detector of GRaND [1]. The procedures to determine Fe counting rates are presented in detail here, along with a preliminary distribution map, constituting the necessary initial step to quantification of Fe abundances. We find that the global distribution of Fe counting rates is generally consistent with independent mineralogical and compositional inferences obtained by other instruments on Dawn such as measurements of pyroxene absorption bands by the Visual and Infrared Spectrometer (VIR) [4] and Framing Camera (FC) [5] and neutron absorption measurements by GRaND [6].

  6. Jefferson Lab's Distributed Data Acquisition

    SciTech Connect

    Trent Allison; Thomas Powers

    2006-05-01

    Jefferson Lab's Continuous Electron Beam Accelerator Facility (CEBAF) occasionally experiences fast intermittent beam instabilities that are difficult to isolate and result in downtime. The Distributed Data Acquisition (Dist DAQ) system is being developed to detect and quickly locate such instabilities. It will consist of multiple Ethernet-based data acquisition chassis distributed throughout the seven-eighths-of-a-mile CEBAF site. Each chassis will monitor various control system signals that are only available locally and/or monitored by systems with small bandwidths that cannot identify fast transients. The chassis will collect data at rates up to 40 Msps in circular buffers that can be frozen and unrolled after an event trigger. These triggers will be derived from signals such as periodic timers or accelerator faults and be distributed via a custom fiber optic event trigger network. This triggering scheme will allow all the data acquisition chassis to be triggered simultaneously and provide a snapshot of relevant CEBAF control signals. The data will then be automatically analyzed for frequency content and transients to determine if and where instabilities exist.
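
    A toy model of the freeze-and-unroll behaviour of one channel's circular buffer, with illustrative sizes and no claim to match the actual Dist DAQ implementation:

        from collections import deque

        class CircularBuffer:
            """Samples stream into a fixed-size ring that is frozen and unrolled on a trigger."""
            def __init__(self, size):
                self.ring = deque(maxlen=size)
                self.frozen = None

            def acquire(self, sample):
                if self.frozen is None:      # new samples are ignored once the buffer is frozen
                    self.ring.append(sample)

            def trigger(self):
                """Freeze the ring and return its contents oldest-first for readout."""
                self.frozen = list(self.ring)
                return self.frozen

        buf = CircularBuffer(size=8)
        for i in range(20):
            buf.acquire(float(i))
        print(buf.trigger())   # the last 8 samples acquired before the trigger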

  7. Problem solving in a distributed environment

    NASA Technical Reports Server (NTRS)

    Rashid, R. F.

    1980-01-01

    Distributed problem solving is analyzed as a blend of two disciplines: (1) problem solving and AI; and (2) distributed systems (monitoring). It may be necessary to distribute because the application itself is one of managing distributed resources (e.g., a distributed sensor net) and communication delays preclude centralized processing, or it may be desirable to distribute because a single computational engine may not satisfy the needs of a given task. In addition, considerations of reliability may dictate distribution. Examples of multi-process language environments are given.

  8. Invariant distributions on compact homogeneous spaces

    SciTech Connect

    Gorbatsevich, V V

    2013-12-31

    In this paper, we study distributions on compact homogeneous spaces, including invariant distributions and also distributions admitting a sub-Riemannian structure. We first consider distributions of dimension 1 and 2 on compact homogeneous spaces. After this, we study the cases of compact homogeneous spaces of dimension 2, 3, and 4 in detail. Invariant distributions on simply connected compact homogeneous spaces are also treated. Bibliography: 18 titles.

  9. A distributed clients/distributed servers model for STARCAT

    NASA Technical Reports Server (NTRS)

    Pirenne, B.; Albrecht, M. A.; Durand, D.; Gaudet, S.

    1992-01-01

    STARCAT, the Space Telescope ARchive and CATalogue user interface, has been around for a number of years. During this time it has been enhanced and augmented in a number of different fields. Here we focus on a new capability allowing geographically distributed user interfaces to connect to geographically distributed data servers. This new concept permits users anywhere on the Internet running STARCAT on their local hardware to access, e.g., whichever of the three existing HST archive sites is available, to get information on the CFHT archive through a transparent connection to the CADC in BC, or to get the La Silla weather by connecting to the ESO database in Munich, all during the same session. Similarly, PreView (or quick-look) images and spectra will also flow directly to the user from wherever they are available. Moving towards an 'X'-based STARCAT is another goal being pursued: a graphic/image server and a help/doc server are currently being added to it. They should further enhance user independence and access transparency.

  10. Triple junction distributions in polycrystals

    SciTech Connect

    King, W.E.; Kumar, M.; Schwartz, A.J.

    1999-07-01

    Recently, it has been demonstrated that some material properties can be enhanced by grain boundary engineering, i.e., the systematic modifications in the topology of the microstructure through thermomechanical processing. Experimental observations have shown that the microstructural parameter likely responsible for improved properties is the grain boundary character distribution (GBCD). It has been suggested that improvements in the fractions of special boundaries as defined by the coincident site lattice model (1) are necessary, but not fully sufficient to cause property improvements. For example, it has been observed that cracks propagating along interconnected networks of random grain boundaries can be arrested (2) when intersecting a triple junction where the remaining two pathways are special boundaries. Therefore, it is of interest to characterize microstructures in terms of the distributions of triple junction types. A simple method to describe a triple junction is by the types of grain boundaries intersecting at that junction [special vs. random, as described by the coincident site lattice (CSL) model]. The distribution of 0-CSL, 1-CSL, 2-CSL and 3-CSL triple junctions in the microstructure can then be plotted as a function of the fraction of special boundaries. Such data has been collected using orientation-imaging microscopy (OIM) (3-5) for oxygen-free-electronic (ofe)-Cu and Inconel 600 over a range of special fraction of grain boundaries. These results have been compared with theoretical models considering isolated triple junctions and invoking the Σ-product rule (6) where Σ is the inverse density of coincident lattice sites (7).

  11. Adaptive reconfigurable distributed sensor architecture

    NASA Astrophysics Data System (ADS)

    Akey, Mark L.

    1997-07-01

    The infancy of unattended ground-based sensors is quickly coming to an end with the arrival of on-board GPS, networking, and multiple sensing capabilities. Unfortunately, their use is only first-order at best: GPS assists with sensor report registration; networks push sensor reports back to the warfighter and forward control information to the sensors; multispectral sensing is a preset, pre-deployment consideration; and the scalability of large sensor networks is questionable. Current architectures provide little synergy among or within the sensors either before or after deployment, and do not map well to the tactical user's organizational structures and constraints. A new distributed sensor architecture is defined which moves well beyond single sensor, single task architectures. Advantages include: (1) automatic mapping of tactical direction to multiple sensors' tasks; (2) decentralized, distributed management of sensor resources and tasks; (3) software reconfiguration of deployed sensors; (4) network scalability and flexibility to meet the constraints of tactical deployments, and traditional combat organizations and hierarchies; and (5) adaptability to new battlefield communication paradigms such as BADD (Battlefield Analysis and Data Dissemination). The architecture is supported in two areas: a recursive, structural definition of resource configuration and management via loose associations; and a hybridization of intelligent software agents with tele-programming capabilities. The distributed sensor architecture is examined within the context of air-deployed ground sensors with acoustic, communication direction finding, and infra-red capabilities. Advantages and disadvantages of the architecture are examined. Consideration is given to extended sensor life (up to 6 months), post-deployment sensor reconfiguration, limited on-board sensor resources (processor and memory), and bandwidth. It is shown that technical tasking of the sensor suite can be automatically

  12. Environmental distribution of prokaryotic taxa

    PubMed Central

    2010-01-01

    Background: The increasing availability of gene sequences of prokaryotic species in samples extracted from all kinds of locations allows addressing the study of the influence of environmental patterns on prokaryotic biodiversity. We present a comprehensive study to address the potential existence of environmental preferences of prokaryotic taxa and the commonness of the specialist and generalist strategies. We also assessed the most significant environmental factors shaping the environmental distribution of taxa. Results: We used 16S rDNA sequences from 3,502 sampling experiments in natural and artificial sources. These sequences were taxonomically assigned, and the corresponding samples were also classified into a hierarchical classification of environments. We used several statistical methods to analyze the environmental distribution of taxa. Our results indicate that environmental specificity is not very common at the higher taxonomic levels (phylum to family), but emerges at lower taxonomic levels (genus and species). The most selective environmental characteristics are those of animal tissues and thermal locations. Salinity is another very important factor for constraining prokaryotic diversity. On the other hand, soil and freshwater habitats are the least restrictive environments, harboring the largest number of prokaryotic taxa. All information on taxa, samples and environments is provided at the envDB online database, http://metagenomics.uv.es/envDB. Conclusions: This is, as far as we know, the most comprehensive assessment of the distribution and diversity of prokaryotic taxa and their associations with different environments. Our data indicate that we are still far from characterizing prokaryotic diversity in any environment, except, perhaps, for human tissues such as the oral cavity and the vagina. PMID:20307274

  13. The pulsar spectral index distribution

    NASA Astrophysics Data System (ADS)

    Bates, S. D.; Lorimer, D. R.; Verbiest, J. P. W.

    2013-05-01

    The flux-density spectra of radio pulsars are known to be steep and, to first order, described by a power-law relationship of the form Sν ∝ ν^α, where Sν is the flux density at some frequency ν and α is the spectral index. Although measurements of α have been made over the years for several hundred pulsars, a study of the intrinsic distribution of pulsar spectra has not been carried out. From the results of pulsar surveys carried out at three different radio frequencies, we use population synthesis techniques and a likelihood analysis to deduce what underlying spectral index distribution is required to replicate the results of these surveys. We find that in general the results of the surveys can be modelled by a Gaussian distribution of spectral indices with a mean of -1.4 and unit standard deviation. We also consider the impact of the so-called gigahertz-peaked spectrum pulsars proposed by Kijak et al. The fraction of peaked-spectrum sources in the population with any significant turnover at low frequencies appears to be at most 10 per cent. We demonstrate that high-frequency (>2 GHz) surveys preferentially select flatter spectrum pulsars and the converse is true for lower frequency (<1 GHz) surveys. This implies that any correlations between α and other pulsar parameters (for example age or magnetic field) need to carefully account for selection biases in pulsar surveys. We also expect that many known pulsars which have been detected at high frequencies will have shallow, or positive, spectral indices. The majority of pulsars do not have recorded flux density measurements over a wide frequency range, making it impossible to constrain their spectral shapes. We also suggest that such measurements would allow an improved description of any populations of pulsars with 'non-standard' spectra. Further refinements to this picture will soon be possible from the results of surveys with the Green Bank Telescope and LOFAR.
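
    To illustrate the selection effect described above, the sketch below draws spectral indices from the quoted Gaussian (mean -1.4, unit standard deviation) and scales a power-law spectrum Sν ∝ ν^α between two survey frequencies; the reference flux and survey frequencies are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)
        alpha = rng.normal(loc=-1.4, scale=1.0, size=100_000)   # spectral-index distribution from the paper

        def scale_flux(s_ref, nu_ref, nu, alpha):
            """Power-law spectrum: S_nu = S_ref * (nu / nu_ref)**alpha."""
            return s_ref * (nu / nu_ref) ** alpha

        # Fraction of simulated pulsars that appear brighter at 3 GHz than at 400 MHz,
        # i.e. the flat- or positive-spectrum sources favoured by high-frequency surveys.
        s_400 = 1.0   # arbitrary reference flux density at 400 MHz
        fraction = np.mean(scale_flux(s_400, 0.4, 3.0, alpha) > s_400)
        print(f"{fraction:.3f}")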

  14. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing volume of datasets from different domains, such as social media, earth observation and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU is a compelling alternative with outstanding parallel processing capability, cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphics rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build a virtual supercomputer that supports CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as graphics-targeted devices, are used to greatly improve rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. Reference: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics.

  15. Video distribution system cost model

    NASA Technical Reports Server (NTRS)

    Gershkoff, I.; Haspert, J. K.; Morgenstern, B.

    1980-01-01

    A cost model that can be used to systematically identify the costs of procuring and operating satellite-linked communications systems is described. The user defines a network configuration by specifying the location of each participating site, the interconnection requirements, and the transmission paths available for the uplink (studio to satellite), downlink (satellite to audience), and voice talkback (between audience and studio) segments of the network. The model uses this information to calculate the least expensive signal distribution path for each participating site. Cost estimates are broken down by capital, installation, lease, operations and maintenance. The design of the model permits flexibility in specifying network and cost structure.
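
    The per-site decision at the heart of the model can be pictured as choosing the cheapest available path for each participating site; the option names and dollar figures below are illustrative, not taken from the report.

        OPTIONS = {
            "site_A": {"direct_downlink": 12_000, "terrestrial_relay": 9_500},
            "site_B": {"direct_downlink": 8_000, "terrestrial_relay": 11_200},
        }

        def cheapest_path(options):
            """Pick the least expensive signal distribution option for every site."""
            return {site: min(costs, key=costs.get) for site, costs in options.items()}

        print(cheapest_path(OPTIONS))   # {'site_A': 'terrestrial_relay', 'site_B': 'direct_downlink'}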

  16. Distributed optimization system and method

    DOEpatents

    Hurtado, John E.; Dohrmann, Clark R.; Robinett, III, Rush D.

    2003-06-10

    A search system and method for controlling multiple agents to optimize an objective using distributed sensing and cooperative control. The search agent can be one or more physical agents, such as a robot, and can be software agents for searching cyberspace. The objective can be: chemical sources, temperature sources, radiation sources, light sources, evaders, trespassers, explosive sources, time dependent sources, time independent sources, function surfaces, maximization points, minimization points, and optimal control of a system such as a communication system, an economy, a crane, and a multi-processor computer.

  17. Ubiquitous distributed objects with CORBA.

    PubMed

    Achard, F; Barillot, E

    1997-01-01

    Database interoperation is becoming a bottleneck for the research community in biology. In this paper, we first discuss the question of interoperability and give a brief overview of CORBA. Then, an example is explained in some detail: a simple but realistic data bank of STSs is implemented. The Object Request Broker is the media for communication between an object server (the data bank) and a client (possibly a genome center). Since CORBA enables easy development of networked applications, we meant this paper to provide an incentive for the bioinformatics community to develop distributed objects.

  18. Synchronous Sampling for Distributed Experiments

    NASA Astrophysics Data System (ADS)

    Wittkamp, M.; Ettl, J.

    2015-09-01

    Sounding rocket payloads, especially for atmospheric research, often consist of several independent sensors or experiments with different objectives. The data from these sensors can be combined in post-processing to improve the scientific results of the flight. One major requirement for this data correlation is a common timeline for the measurements of the distributed experiments. In this paper we present two ways to achieve absolute timing for asynchronously working experiments. The synchronization process uses the Global Positioning System (GPS) and a standard serial communication protocol for the transport of timestamps and flight states.
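
    One simple way to map a free-running local clock onto GPS time, sketched here under the assumption that timestamp pairs arrive over the serial link and that a linear clock model suffices (all numbers illustrative):

        import numpy as np

        local_ticks = np.array([1_000.0, 2_000.0, 3_000.0, 4_000.0])   # local counter at sync events
        gps_seconds = np.array([10.0, 10.998, 12.001, 13.0])           # GPS time at the same events

        rate, offset = np.polyfit(local_ticks, gps_seconds, 1)         # linear local-to-GPS clock model

        def to_gps_time(tick):
            """Re-timestamp a sample recorded at a given local counter value."""
            return rate * tick + offset

        print(to_gps_time(2_500.0))   # roughly 11.5 s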

  19. The CJ12 parton distributions

    SciTech Connect

    Accardi, Alberto; Owens, Jeff F.

    2013-07-01

    Three new sets of next-to-leading order parton distribution functions (PDFs) are presented, determined by global fits to a wide variety of data for hard scattering processes. The analysis includes target mass and higher twist corrections needed for the description of deep-inelastic scattering data at large x and low Q^2, and nuclear corrections for deuterium targets. The PDF sets correspond to three different models for the nuclear effects, and provide a more realistic uncertainty range for the d quark PDF compared with previous fits. Applications to weak boson production at colliders are also discussed.

  20. Distributions of nonsupersymmetric flux vacua

    NASA Astrophysics Data System (ADS)

    Denef, Frederik; Douglas, Michael R.

    2005-03-01

    We continue the study of the distribution of nonsupersymmetric flux vacua in IIb string theory compactified on Calabi-Yau manifolds, as in hep-th/0404116. We show that the basic structure of this problem is that of finding eigenvectors of the matrix of second derivatives of the superpotential, and that many features of the results are determined by features of the generic ensemble of such matrices, the CI ensemble of Altland and Zirnbauer originating in mesoscopic physics. We study some simple examples in detail, exhibiting various factors which can favor low or high scale supersymmetry breaking.

  1. Heat distribution by natural convection

    SciTech Connect

    Balcomb, J.D.

    1985-01-01

    Natural convection can provide adequate heat distribution in many situations that arise in buildings. This is appropriate, for example, in passive solar buildings where some rooms tend to be more strongly solar heated than others, or where it can reduce the number of heating units required in a building. Natural airflow and heat transport through doorways and other internal building apertures is predictable and can be accounted for in the design. The nature of natural convection is described, and a design chart is presented appropriate to a simple, single-doorway situation. Natural convective loops that can occur in buildings are described and a few design guidelines are presented.

  2. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  3. Workshop on momentum distributions: Summary

    SciTech Connect

    Simmons, R.O.

    1988-01-01

    This has been an extraordinary Workshop touching many branches of physics. The Workshop has treated momentum distributions in fluid and solid condensed matter, in nuclei, and in electronic systems. Both theoretical and experimental concepts and methods have been considered in all these branches. A variety of specific illustrations and applications in physical systems have been presented. One finds that some common unifying themes emerge. One finds, also, that some examples are available to illustrate where one branch is more mature than others and to contrast where expectations for future progress may be most encouraged. 6 refs., 2 figs.

  4. Distributed Control with Collective Intelligence

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Wheeler, Kevin R.; Tumer, Kagan

    1998-01-01

    We consider systems of interacting reinforcement learning (RL) algorithms that do not work at cross purposes, in that their collective behavior maximizes a global utility function. We call such systems COllective INtelligences (COINs). We present the theory of designing COINs. Then we present experiments validating that theory in the context of two distributed control problems: We show that COINs perform near-optimally in a difficult variant of Arthur's bar problem [Arthur] (and in particular avoid the tragedy of the commons for that problem), and we also illustrate optimal performance in the master-slave problem.

  5. Distributed wireless quantum communication networks

    NASA Astrophysics Data System (ADS)

    Yu, Xu-Tao; Xu, Jin; Zhang, Zai-Chen

    2013-09-01

    The distributed wireless quantum communication network (DWQCN) has a distributed network topology and transmits information by quantum states. In this paper, we present the concept of the DWQCN and propose a system scheme to transfer quantum states in the DWQCN. The system scheme for transmitting information between any two nodes in the DWQCN includes a routing protocol and a scheme for transferring quantum states. The routing protocol is on-demand and the routing metric is selected based on the number of entangled particle pairs. After setting up a route, quantum teleportation and entanglement swapping are used for transferring quantum states. Entanglement swapping is achieved along with the process of routing set up and the acknowledgment packet transmission. The measurement results of each entanglement swapping are piggybacked with route reply packets or acknowledgment packets. After entanglement swapping, a direct quantum link between source and destination is set up and quantum states are transferred by quantum teleportation. Adopting this scheme, the measurement results of entanglement swapping do not need to be transmitted specially, which decreases the wireless transmission cost and transmission delay.
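
    The routing metric is stated to be based on the number of entangled particle pairs; one plausible reading, sketched below with hypothetical node names, is a bottleneck criterion that prefers the route whose scarcest link still has the most pairs.

        routes = {
            ("S", "A", "D"): [4, 2],   # entangled pairs available on links S-A and A-D
            ("S", "B", "D"): [3, 3],
        }

        def best_route(candidates):
            """Choose the route maximizing the minimum entangled-pair count along its links."""
            return max(candidates, key=lambda route: min(candidates[route]))

        print(best_route(routes))   # ('S', 'B', 'D')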

  6. Distributed nestmate recognition in ants

    PubMed Central

    Esponda, Fernando; Gordon, Deborah M.

    2015-01-01

    We propose a distributed model of nestmate recognition, analogous to the one used by the vertebrate immune system, in which colony response results from the diverse reactions of many ants. The model describes how individual behaviour produces colony response to non-nestmates. No single ant knows the odour identity of the colony. Instead, colony identity is defined collectively by all the ants in the colony. Each ant responds to the odour of other ants by reference to its own unique decision boundary, which is a result of its experience of encounters with other ants. Each ant thus recognizes a particular set of chemical profiles as being those of non-nestmates. This model predicts, as experimental results have shown, that the outcome of behavioural assays is likely to be variable, that it depends on the number of ants tested, that response to non-nestmates changes over time and that it changes in response to the experience of individual ants. A distributed system allows a colony to identify non-nestmates without requiring that all individuals have the same complete information and helps to facilitate the tracking of changes in cuticular hydrocarbon profiles, because only a subset of ants must respond to provide an adequate response. PMID:25833853
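
    A toy numerical version of this distributed-recognition idea, assuming a one-dimensional "odour distance" and a personal threshold per ant (all parameters invented for illustration), shows how the colony response varies with the number of ants tested:

        import numpy as np

        rng = np.random.default_rng(2)
        ant_thresholds = rng.normal(loc=1.0, scale=0.3, size=200)   # one decision boundary per ant

        def colony_response(odour_distance, thresholds, n_tested):
            """Fraction of a random sample of ants that classify the profile as non-nestmate."""
            tested = rng.choice(thresholds, size=n_tested, replace=False)
            return float(np.mean(odour_distance > tested))

        for n in (5, 20, 100):
            print(n, colony_response(1.1, ant_thresholds, n))   # response varies with sample size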

  7. Agent Communications using Distributed Metaobjects

    SciTech Connect

    Goldsmith, Steven Y.; Spires, Shannon V.

    1999-06-10

    There are currently two proposed standards for agent communication languages, namely, KQML (Finin, Lobrou, and Mayfield 1994) and the FIPA ACL. Neither standard has yet achieved primacy, and neither has been evaluated extensively in an open environment such as the Internet. It seems prudent therefore to design a general-purpose agent communications facility for new agent architectures that is flexible yet provides an architecture that accepts many different specializations. In this paper we exhibit the salient features of an agent communications architecture based on distributed metaobjects. This architecture captures design commitments at a metaobject level, leaving the base-level design and implementation up to the agent developer. The scope of the metamodel is broad enough to accommodate many different communication protocols, interaction protocols, and knowledge sharing regimes through extensions to the metaobject framework. We conclude that with a powerful distributed object substrate that supports metaobject communications, a general framework can be developed that will effectively enable different approaches to agent communications in the same agent system. We have implemented a KQML-based communications protocol and have several special-purpose interaction protocols under development.

  8. Plankton distribution and ocean dispersal.

    PubMed

    McManus, Margaret Anne; Woodson, C Brock

    2012-03-15

    Plankton are small organisms that dwell in oceans, seas and bodies of fresh water. In this review, we discuss life in the plankton, which involves a balance between the behavioral capabilities of the organism and the characteristics and movement of the water that surrounds it. In order to consider this balance, we discuss how plankton interact with their environment across a range of scales - from the smallest viruses and bacteria to larger phytoplankton and zooplankton. We find that the larger scale distributions of plankton, observed in coastal waters, along continental shelves and in ocean basins, are highly dependent upon the smaller scale interactions between the individual organism and its environment. Further, we discuss how larger scale organism distributions may affect the transport and/or retention of plankton in the ocean environment. The research reviewed here provides a mechanistic understanding of how organism behavior in response to the physical environment produces planktonic aggregations, which has a direct impact on the way marine ecosystems function. PMID:22357594

  9. Analyzing ion distributions around DNA.

    PubMed

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882

  10. Analyzing ion distributions around DNA

    PubMed Central

    Lavery, Richard; Maddocks, John H.; Pasi, Marco; Zakrzewska, Krystyna

    2014-01-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882

  11. Overlapping clusters for distributed computation.

    SciTech Connect

    Mirrokni, Vahab; Andersen, Reid; Gleich, David F.

    2010-11-01

    Scalable, distributed algorithms must address communication problems. We investigate overlapping clusters, or vertex partitions that intersect, for graph computations. This setup stores more of the graph than required but then affords the ease of implementation of vertex-partitioned algorithms. Our hope is that this technique allows us to reduce communication in a computation on a distributed graph. The motivation above draws on recent work in communication-avoiding algorithms. Mohiyuddin et al. (SC09) design a matrix-powers kernel that gives rise to an overlapping partition. Fritzsche et al. (CSC2009) develop an overlapping clustering for a Schwarz method. Both techniques extend an initial partitioning with overlap. Our procedure generates overlap directly. Indeed, Schwarz methods are commonly used to capitalize on overlap. Elsewhere, overlapping communities (Ahn et al., Nature 2009; Mishra et al., WAW2007) are now a popular model of structure in social networks. These have long been studied in statistics (Cole and Wishart, CompJ 1970). We present two types of results: (i) an estimated swapping probability ρ∞; and (ii) the communication volume of a parallel PageRank solution (link-following α = 0.85) using an additive Schwarz method. The volume ratio is the amount of extra storage for the overlap (2 means we store the graph twice). Below, as the ratio increases, the swapping probability and PageRank communication volume decrease.
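
    The volume ratio mentioned above can be computed directly from an overlapping partition; a minimal sketch with an invented three-cluster overlap:

        def volume_ratio(clusters):
            """Total stored vertices across clusters divided by distinct vertices (2 = graph stored twice)."""
            stored = sum(len(c) for c in clusters)
            distinct = len(set().union(*clusters))
            return stored / distinct

        clusters = [{0, 1, 2, 3}, {2, 3, 4, 5}, {4, 5, 6, 7}]   # illustrative overlapping clusters
        print(volume_ratio(clusters))   # 12 / 8 = 1.5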

  12. Distributed computing at the SSCL

    SciTech Connect

    Cormell, L.; White, R.

    1993-05-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desktop is well known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory. In addition, a brief review of the future directions of commercial products for distributed computing and management will be given.

  13. Distributed nestmate recognition in ants.

    PubMed

    Esponda, Fernando; Gordon, Deborah M

    2015-05-01

    We propose a distributed model of nestmate recognition, analogous to the one used by the vertebrate immune system, in which colony response results from the diverse reactions of many ants. The model describes how individual behaviour produces colony response to non-nestmates. No single ant knows the odour identity of the colony. Instead, colony identity is defined collectively by all the ants in the colony. Each ant responds to the odour of other ants by reference to its own unique decision boundary, which is a result of its experience of encounters with other ants. Each ant thus recognizes a particular set of chemical profiles as being those of non-nestmates. This model predicts, as experimental results have shown, that the outcome of behavioural assays is likely to be variable, that it depends on the number of ants tested, that response to non-nestmates changes over time and that it changes in response to the experience of individual ants. A distributed system allows a colony to identify non-nestmates without requiring that all individuals have the same complete information and helps to facilitate the tracking of changes in cuticular hydrocarbon profiles, because only a subset of ants must respond to provide an adequate response.

  14. Countercurrent distribution of biological cells

    NASA Technical Reports Server (NTRS)

    1982-01-01

    It is known that the addition of phosphate buffer to two-polymer aqueous phase systems has a strong effect on the partition behavior of cells and other particles in such mixtures. The addition of sodium phosphate to aqueous poly(ethylene glycol) dextran phase systems causes a concentration-dependent shift in the binodal on the phase diagram, progressively lowering the critical conditions for phase separation as the phosphate concentration is increased. Sodium chloride produces no significant shift in the critical point relative to the salt-free case. Accurate determinations of the phase diagram require measurements of the density of the phases; data are presented which allow this parameter to be calculated from polarimetric measurements of the dextran concentrations of both phases. Increasing polymer concentrations in the phase systems produce an increasing preference of the phosphate for the dextran-rich bottom phase. Equilibrium dialysis experiments showed that poly(ethylene glycol) effectively rejected phosphate, and to a lesser extent chloride, but that dextran had little effect on the distribution of either salt. Increasing ionic strength via addition of 0.15 M NaCl to phase systems containing 0.01 M phosphate produces an increased concentration of phosphate ions in the bottom dextran-rich phase, the expected effect in this type of Donnan distribution.

  15. The genetics of fat distribution.

    PubMed

    Schleinitz, Dorit; Böttcher, Yvonne; Blüher, Matthias; Kovacs, Peter

    2014-07-01

    Fat stored in visceral depots makes obese individuals more prone to complications than subcutaneous fat. There is good evidence that body fat distribution (FD) is controlled by genetic factors. WHR, a surrogate measure of FD, shows significant heritability of up to ∼60%, even after adjusting for BMI. Genetic variants have been linked to various forms of altered FD such as lipodystrophies; however, the polygenic background of visceral obesity has only been sparsely investigated in the past. Recent genome-wide association studies (GWAS) for measures of FD revealed numerous loci harbouring genes potentially regulating FD. In addition, genes with fat depot-specific expression patterns (in particular subcutaneous vs visceral adipose tissue) provide plausible candidate genes involved in the regulation of FD. Many of these genes are differentially expressed in various fat compartments and correlate with obesity-related traits, thus further supporting their role as potential mediators of metabolic alterations associated with a distinct FD. Finally, developmental genes may at a very early stage determine specific FD in later life. Indeed, genes such as TBX15 not only manifest differential expression in various fat depots, but also correlate with obesity and related traits. Moreover, recent GWAS identified several polymorphisms in developmental genes (including TBX15, HOXC13, RSPO3 and CPEB4) strongly associated with FD. More accurate methods, including cardiometabolic imaging, for assessment of FD are needed to promote our understanding in this field, where the main focus is now to unravel the yet unknown biological function of these novel 'fat distribution genes'.

  16. Detector decoy quantum key distribution

    NASA Astrophysics Data System (ADS)

    Moroder, Tobias; Curty, Marcos; Lütkenhaus, Norbert

    2009-04-01

    Photon number resolving detectors can enhance the performance of many practical quantum cryptographic setups. In this paper, we employ a simple method to estimate the statistics provided by such a photon number resolving detector using only a threshold detector together with a variable attenuator. This idea is similar in spirit to that of the decoy state technique, and is especially suited to those scenarios where only a few parameters of the photon number statistics of the incoming signals have to be estimated. As an illustration of the potential applicability of the method in quantum communication protocols, we use it to prove security of an entanglement-based quantum key distribution scheme with an untrusted source, without the need for a squash model and relying solely on this extra idea. In this sense, this detector decoy method can be seen as a different conceptual approach to adapt a single-photon security proof to its physical, full optical implementation. We show that in this scenario, the legitimate users can now even discard the double-click events from the raw key data without compromising the security of the scheme, and we present simulations on the performance of the BB84 and the 6-state quantum key distribution protocols.
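
    A minimal numerical sketch of the underlying idea (the attenuator settings, the truncation of the photon-number distribution at n = 2, and the ideal, dark-count-free detector are assumptions made here, not parameters from the paper): a threshold detector behind an attenuator of transmittance η fails to click with probability Σ_n p_n (1 − η)^n, so no-click rates recorded at a few attenuator settings give a linear system that can be solved for the low-order photon-number probabilities.

```python
import numpy as np

# Idealised detector-decoy sketch: no dark counts, photon numbers truncated at n = 2,
# attenuator settings chosen for illustration only.
true_p = np.array([0.30, 0.50, 0.20])   # unknown statistics for n = 0, 1, 2 photons
etas = np.array([1.0, 0.6, 0.3])        # variable attenuator transmittances (assumed)

# "Observed" no-click probabilities: P(no click | eta) = sum_n p_n * (1 - eta)**n
no_click = np.array([sum(p * (1 - eta) ** n for n, p in enumerate(true_p)) for eta in etas])

# Rows of A are [(1-eta)^0, (1-eta)^1, (1-eta)^2]; solving A @ p = no_click recovers p.
A = np.vander(1 - etas, N=3, increasing=True)
estimated_p = np.linalg.solve(A, no_click)

print("estimated photon-number statistics:", np.round(estimated_p, 3))
```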

  17. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement from LCC's perspective was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which itself is a port of ISODE to the Cray. In addition, we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray, communicating with each other and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray, and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their potential feedback on the use of ELROS in implementing ISO protocols--whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system, once we had completed our port of ELROS to the Cray.

  19. Distance distribution in configuration-model networks

    NASA Astrophysics Data System (ADS)

    Nitzan, Mor; Katzav, Eytan; Kühn, Reimer; Biham, Ofer

    2016-06-01

    We present analytical results for the distribution of shortest path lengths between random pairs of nodes in configuration model networks. The results, which are based on recursion equations, are shown to be in good agreement with numerical simulations for networks with degenerate, binomial, and power-law degree distributions. The mean, mode, and variance of the distribution of shortest path lengths are also evaluated. These results provide expressions for central measures and dispersion measures of the distribution of shortest path lengths in terms of moments of the degree distribution, illuminating the connection between the two distributions.
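
    For readers who want a numerical point of comparison, the following sketch estimates the same distribution by simulation (the network size, mean degree, and sample size are arbitrary choices, not values from the paper): it builds a configuration-model network from a given degree sequence, samples random node pairs, and reports the mean, mode, and variance of the shortest path lengths.

```python
import collections
import random
import networkx as nx
import numpy as np

N, mean_degree, n_pairs = 2000, 4, 5000          # arbitrary illustration parameters
degrees = np.random.poisson(mean_degree, N)
if degrees.sum() % 2:                            # the configuration model needs an even degree sum
    degrees[0] += 1

G = nx.configuration_model(degrees.tolist(), seed=1)
G = nx.Graph(G)                                  # collapse parallel edges
G.remove_edges_from(list(nx.selfloop_edges(G)))  # drop self-loops

nodes = list(G.nodes())
lengths = []
for _ in range(n_pairs):
    u, v = random.sample(nodes, 2)
    try:
        lengths.append(nx.shortest_path_length(G, u, v))
    except nx.NetworkXNoPath:                    # pair lies in different components
        pass

counts = collections.Counter(lengths)
print("mean:", np.mean(lengths), " mode:", counts.most_common(1)[0][0], " variance:", np.var(lengths))
```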

  20. Why the distribution of medical errors matters.

    PubMed

    McLean, Thomas R

    2015-07-01

    During the last decade, interventions to reduce the number of medical errors have been largely ineffective. Although it is widely assumed that medical errors follow a Gaussian distribution, they may actually follow a Power Rule distribution. This article presents the evidence in favor of a Power Rule distribution for medical errors and then examines the consequences of such a distribution for medical errors. As the distribution of medical errors has real-world implications, further research is needed to determine whether medical errors follow a Gaussian or Power Rule distribution.
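
    A toy illustration of why the shape matters (all numbers below are invented and are not data from the article): if error counts per source are roughly Gaussian, the worst 10% of sources contribute only modestly more than their share, whereas under a heavy-tailed "power rule" distribution a small fraction of sources accounts for most errors, which points toward very different interventions.

```python
import numpy as np

# Invented numbers, purely to contrast the two shapes.
rng = np.random.default_rng(0)
n = 10_000

gaussian = np.clip(rng.normal(5, 2, n), 0, None)   # roughly bell-shaped errors per source
power = rng.pareto(1.5, n) + 1                     # heavy-tailed "power rule" alternative

def share_from_top_decile(errors):
    """Fraction of all errors caused by the worst 10% of sources."""
    top = np.sort(errors)[-len(errors) // 10:]
    return top.sum() / errors.sum()

print("Gaussian : top 10%% of sources cause %.0f%% of errors" % (100 * share_from_top_decile(gaussian)))
print("Power law: top 10%% of sources cause %.0f%% of errors" % (100 * share_from_top_decile(power)))
```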

  1. Distance distribution in configuration-model networks.

    PubMed

    Nitzan, Mor; Katzav, Eytan; Kühn, Reimer; Biham, Ofer

    2016-06-01

    We present analytical results for the distribution of shortest path lengths between random pairs of nodes in configuration model networks. The results, which are based on recursion equations, are shown to be in good agreement with numerical simulations for networks with degenerate, binomial, and power-law degree distributions. The mean, mode, and variance of the distribution of shortest path lengths are also evaluated. These results provide expressions for central measures and dispersion measures of the distribution of shortest path lengths in terms of moments of the degree distribution, illuminating the connection between the two distributions. PMID:27415282

  2. A distributed programming environment for Ada

    NASA Technical Reports Server (NTRS)

    Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.

    1986-01-01

    Despite considerable commercial exploitation of fault tolerance systems, significant and difficult research problems remain in such areas as fault detection and correction. A research project is described which constructs a distributed computing test bed for loosely coupled computers. The project is constructing a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, distributed debugger, test harnesses, and environment monitors. The Ada compiler is being written in Ada and will implement distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.

  3. Water Distribution and Removal Model

    SciTech Connect

    Y. Deng; N. Chipman; E.L. Hardin

    2005-08-26

    The design of the Yucca Mountain high level radioactive waste repository depends on the performance of the engineered barrier system (EBS). To support the total system performance assessment (TSPA), the Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is developed to describe the thermal, mechanical, chemical, hydrological, biological, and radionuclide transport processes within the emplacement drifts, which includes the following major analysis/model reports (AMRs): (1) EBS Water Distribution and Removal (WD&R) Model; (2) EBS Physical and Chemical Environment (P&CE) Model; (3) EBS Radionuclide Transport (EBS RNT) Model; and (4) EBS Multiscale Thermohydrologic (TH) Model. Technical information, including data, analyses, models, software, and supporting documents, will be provided to defend the applicability of these models for their intended purpose of evaluating the postclosure performance of the Yucca Mountain repository system. The WD&R model AMR is important to the site recommendation. Water distribution and removal represents one component of the overall EBS. Under some conditions, liquid water will seep into emplacement drifts through fractures in the host rock and move generally downward, potentially contacting waste packages. After waste packages are breached by corrosion, some of this seepage water will contact the waste, dissolve or suspend radionuclides, and ultimately carry radionuclides through the EBS to the near-field host rock. Lateral diversion of liquid water within the drift will occur at the inner drift surface, and more significantly from the operation of engineered structures such as drip shields and the outer surface of waste packages. If most of the seepage flux can be diverted laterally and removed from the drifts before contacting the wastes, the release of radionuclides from the EBS can be controlled, resulting in a proportional reduction in dose release at the accessible environment. The purposes

  4. Duplex Direct Data Distribution System

    NASA Technical Reports Server (NTRS)

    Greenfield, Israel (Technical Monitor)

    2001-01-01

    The NASA Glenn Research Center (GRC) is developing and demonstrating communications and network technologies that are helping to enable the near-Earth space Internet. GRC envisions several service categories. The first of these categories is direct data distribution or D3 (pronounced "D-cubed"). Commercially provided D3 will make it possible to download a data set from a spacecraft, like the International Space Station, as easily as one can extract a file from a remote server today, using a file transfer protocol. In a second category, NASA spacecraft will make use of commercial satellite communication (SATCOM) systems. Some of those services will come from purchasing time on unused transponders that cover landmasses. While it is likely there will be gaps in service coverage, Internet services should be available using these systems. This report addresses alternative methods of implementing a full duplex enhancement of the GRC developed experimental Ka-Band Direct Data Distribution (D3) space-to-ground communication link. The resulting duplex version is called the Duplex Direct Data Distribution (D4) system. The D4 system is intended to provide high-data-rate commercial direct or internet-based communications service between the NASA spacecraft in low earth orbit (LEO) and the respective principal investigators associated with these spacecraft. Candidate commercial services were assessed regarding their near-term potential to meet NASA requirements. Candidates included Ka-band and V-band geostationary orbit and non-geostationary orbit satellite relay services and direct downlink ("LEO teleport") services. End-to-end systems concepts were examined and characterized in terms of alternative link layer architectures. Alternatives included a Direct Link, a Relay Link, a Hybrid Link, and a Dual Mode Link. The direct link assessment examined sample ground terminal placements and antenna angle issues. The SATCOM-based alternatives examined existing or proposed commercial

  5. Density Distributions of Cyclotrimethylenetrinitramines (RDX)

    SciTech Connect

    Hoffman, D M

    2002-03-19

    As part of the US Army Foreign Comparative Testing (FCT) program the density distributions of six samples of class 1 RDX were measured using the density gradient technique. This technique was used in an attempt to distinguish RDX crystallized by a French manufacturer (designated insensitive or IRDX) from RDX manufactured at Holston Army Ammunition Plant (HAAP), the current source of RDX for the Department of Defense (DoD). Two samples from different lots of French IRDX had an average density of 1.7958 ± 0.0008 g/cc. The theoretical density of a perfect RDX crystal is 1.806 g/cc. This yields 99.43% of the theoretical maximum density (TMD). For two HAAP RDX lots the average density was 1.786 ± 0.002 g/cc, only 98.89% TMD. Several other techniques were used for preliminary characterization of one lot of French IRDX and two lots of HAAP RDX. Light scattering, SEM and polarized optical microscopy (POM) showed that SNPE and Holston RDX had the appropriate particle size distribution for Class 1 RDX. High performance liquid chromatography showed quantities of HMX in HAAP RDX. French IRDX also showed a 1.1 °C higher melting point compared to HAAP RDX in the differential scanning calorimetry (DSC), consistent with no melting point depression due to the HMX contaminant. A second part of the program involved characterization of Holston RDX recrystallized using the French process. After reprocessing, the average density of the Holston RDX was increased to 1.7907 g/cc. Apparently HMX in RDX can act as a nucleating agent in the French RDX recrystallization process. The French IRDX contained no HMX, which is assumed to account for its higher density and narrower density distribution. Reprocessing of RDX from Holston improved the average density compared to the original Holston RDX, but the resulting HIRDX was not as dense as the original French IRDX. Recrystallized Holston IRDX crystals were much larger (3-500 µm or more) than either the original class 1 HAAP RDX or

  6. Distributed Aviation Concepts and Technologies

    NASA Technical Reports Server (NTRS)

    Moore, Mark D.

    2008-01-01

    Aviation has experienced one hundred years of evolution, resulting in the current air transportation system dominated by commercial airliners in a hub and spoke infrastructure. While the first fifty years involved disruptive technologies that required frequent vehicle adaptation, the second fifty years produced a stable evolutionary optimization of decreasing costs with increasing safety. This optimization has resulted in traits favoring a centralized service model with high vehicle productivity and cost efficiency. However, it may also have resulted in a system that is not sufficiently robust to withstand significant system disturbances. Aviation is currently facing rapid change from issues such as environmental damage, terrorism threat, congestion and capacity limitations, and cost of energy. Currently, these issues are leading to a loss of service for weaker spoke markets. These catalysts and a lack of robustness could result in a loss of service for much larger portions of the aviation market. The impact of other competing transportation services may be equally important as causal factors of change. Highway system forecasts indicate a dramatic slowdown as congestion reaches a point of non-linearly increasing delay. In the next twenty-five years, there is the potential for aviation to transform itself into a more robust, scalable, adaptive, secure, safe, affordable, convenient, efficient and environmentally friendly system. To achieve these characteristics, the new system will likely be based on a distributed model that enables more direct services. Short range travel is already demonstrating itself to be inefficient with a centralized model, providing opportunities for emergent distributed services through air-taxi models. Technologies from the on-demand revolution in computers and communications are now available as major drivers for aviation on-demand adaptation. Other technologies such as electric propulsion are currently transforming the automobile

  7. Can Data Recognize Its Parent Distribution?

    SciTech Connect

    A.W.Marshall; J.C.Meza; and I. Olkin

    1999-05-01

    This study is concerned with model selection of lifetime and survival distributions arising in engineering reliability or in the medical sciences. We compare various distributions, including the gamma, Weibull and lognormal, with a new distribution called geometric extreme exponential. Except for the lognormal distribution, the other three distributions all have the exponential distribution as a special case. A Monte Carlo simulation was performed to determine sample sizes for which survival distributions can distinguish data generated by their own families. Two selection methods are considered: maximum likelihood and Kolmogorov distance. Neither method is uniformly best. The probability of correct selection with more than one alternative shows some surprising results when the choices are close to the exponential distribution.
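
    A sketch of the maximum-likelihood selection step is given below (the Weibull shape parameter, sample size, and number of Monte Carlo replicates are arbitrary, and the geometric extreme exponential family from the study is not implemented here): each candidate family is fitted by maximum likelihood, the family with the highest log-likelihood is selected, and repeating over simulated samples estimates the probability of correct selection.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
families = {"weibull": stats.weibull_min, "gamma": stats.gamma, "lognormal": stats.lognorm}

def best_family(sample):
    """Fit each candidate family by MLE and return the one with the highest log-likelihood."""
    scores = {}
    for name, dist in families.items():
        params = dist.fit(sample, floc=0)            # lifetimes: fix the location at zero
        scores[name] = np.sum(dist.logpdf(sample, *params))
    return max(scores, key=scores.get)

n_reps, n_obs, correct = 200, 50, 0
for _ in range(n_reps):
    sample = stats.weibull_min.rvs(c=1.5, size=n_obs, random_state=rng)
    correct += best_family(sample) == "weibull"

print("estimated probability of correctly selecting the Weibull family:", correct / n_reps)
```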

  8. A guided tour of new tempered distributions

    NASA Astrophysics Data System (ADS)

    Schmeelk, John

    1990-10-01

    Laurent Schwartz, the principal architect of distribution theory, presented the impossibility of extending a form of multiplication to distribution theory. There have been many varieties of partial solutions to this problem. Some of the solutions contain heuristic computations done by physicists in quantum field theory. A recent strategy developed by J. Colombeau culminates with multiplication and integration theory for distributions. This paper develops this theory in the spirit of a sequence approach, much like fundamental sequences are to distributions. However, in the new tempered distribution theory the sequences can be noncountable. T. Todorov developed these techniques for new distributions. However, since so many applications require Fourier analysis, the new tempered distributions provide a natural setting for physics and signal analysis. The paper illustrates the product of two Dirac delta functionals, δ(x) δ(x). Other nonregular distributional products can also be computed in the same manner. The paper culminates with a new application of annihilation and creation operators in quantum field theory.

  9. Array distribution in data-parallel programs

    NASA Technical Reports Server (NTRS)

    Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert; Sheffler, Thomas J.

    1994-01-01

    We consider distribution at compile time of the array data in a distributed-memory implementation of a data-parallel program written in a language like Fortran 90. We allow dynamic redistribution of data and define a heuristic algorithmic framework that chooses distribution parameters to minimize an estimate of program completion time. We represent the program as an alignment-distribution graph. We propose a divide-and-conquer algorithm for distribution that initially assigns a common distribution to each node of the graph and successively refines this assignment, taking computation, realignment, and redistribution costs into account. We explain how to estimate the effect of distribution on computation cost and how to choose a candidate set of distributions. We present the results of an implementation of our algorithms on several test problems.
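
    The following toy sketch conveys the cost-minimization idea on a two-phase program (the candidate distributions, cost figures, and exhaustive search are invented for illustration; this is not the authors' alignment-distribution graph algorithm): each phase is assigned a distribution, redistribution between phases is charged on the connecting edge, and the assignment with the lowest estimated completion time is chosen.

```python
import itertools

candidates = ["block", "cyclic", "block-cyclic"]

# Estimated computation cost (seconds) of each program phase under each distribution.
compute_cost = {
    "phase1": {"block": 1.0, "cyclic": 1.6, "block-cyclic": 1.2},
    "phase2": {"block": 2.0, "cyclic": 1.1, "block-cyclic": 1.3},
}

def redistribution_cost(d_from, d_to):
    """Cost charged on the edge between phases when the distributions differ."""
    return 0.0 if d_from == d_to else 0.5

best = None
for d1, d2 in itertools.product(candidates, repeat=2):
    total = compute_cost["phase1"][d1] + redistribution_cost(d1, d2) + compute_cost["phase2"][d2]
    if best is None or total < best[0]:
        best = (total, d1, d2)

print("estimated completion time %.1f s with %s -> %s" % best)
```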

  10. Electricity distribution networks: Changing regulatory approaches

    NASA Astrophysics Data System (ADS)

    Cambini, Carlo

    2016-09-01

    Increasing the penetration of distributed generation and smart grid technologies requires substantial investments. A study proposes an innovative approach that combines four regulatory tools to provide economic incentives for distribution system operators to facilitate these innovative practices.

  11. Distributed PV Adoption in Maine Through 2021

    SciTech Connect

    Gagnon, Pieter; Sigrin, Ben

    2015-11-06

    NREL has used its dSolar (distributed solar) model to generate low-medium-high estimates of distributed PV adoption in Maine through 2021. This presentation gives a high-level overview of the model and modeling results.

  12. Patterns of Drug Distribution: Implications and Issues

    PubMed Central

    Johnson, Bruce D.

    2007-01-01

    This article delineates various patterns of illicit sales of drugs, especially at the retail (and near-retail) level, addressing a variety of central issues about drug sales and distribution documented during the past 30 years, including: a) the links between drug consumption and drug distribution activities; b) the various distribution roles; c) various levels of the distribution hierarchy; d) types of retail and wholesale markets; e) the association of drug distribution with nondrug associated criminality and violence. The article also will address the implications of drug distribution: whether various public policies such as supply reduction and source interdiction affect illicit drug markets, and how policing strategies and various law enforcement strategies can influence the involvement of individual participation in drug distribution activities. The overlooked contribution of treatment for “drug abuse” to reducing drug sales and distribution activities also will be considered as will other critical unresolved issues. PMID:14582578

  13. The Molecular Weight Distribution of Polymer Samples

    ERIC Educational Resources Information Center

    Horta, Arturo; Pastoriza, M. Alejandra

    2007-01-01

    Various methods for the determination of the molecular weight distribution (MWD) of different polymer samples are presented. The study shows that the molecular weight averages and distribution of a polymerization completely depend on the characteristics of the reaction itself.
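
    The two standard averages behind any MWD can be computed directly from chain counts; the sketch below uses a small hypothetical sample (not data from the article) to evaluate the number-average and weight-average molecular weights and their ratio, the dispersity.

```python
# Number-average (Mn) and weight-average (Mw) molecular weights from chain counts:
#   Mn = sum(N_i * M_i) / sum(N_i)
#   Mw = sum(N_i * M_i**2) / sum(N_i * M_i)
#   PDI = Mw / Mn   (dispersity, the breadth of the MWD)

sample = {10_000: 150, 50_000: 300, 100_000: 100}   # hypothetical M_i (g/mol) -> N_i chains

Mn = sum(N * M for M, N in sample.items()) / sum(sample.values())
Mw = sum(N * M ** 2 for M, N in sample.items()) / sum(N * M for M, N in sample.items())

print(f"Mn = {Mn:,.0f} g/mol, Mw = {Mw:,.0f} g/mol, PDI = {Mw / Mn:.2f}")
```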

  14. Distributive Justice and the Moral Development Curriculum.

    ERIC Educational Resources Information Center

    Krogh, Suzanne Lowell; Lamme, Linda Leonard

    1985-01-01

    Teaching strategies to help elementary social studies teachers teach distributive justice--i.e., fair sharing of available resources--are provided. Also described are the approximate age levels and the different levels of reasoning associated with distributive justice. (RM)

  15. Radon Transform and Light-Cone Distributions

    NASA Astrophysics Data System (ADS)

    Teryaev, O. V.

    2016-08-01

    The relevance of the Radon transform for generalized and transverse momentum dependent parton distributions is discussed. A new application to conditional (fracture) parton distributions and dihadron fragmentation functions is suggested.

  16. Telemedicine and distributed medical intelligence.

    PubMed

    Warner, D; Tichenor, J M; Balch, D C

    1996-01-01

    Recent trends in health care informatics and telemedicine indicate that systems are being developed with a primary focus on technology and business, not on the process of medicine itself. The authors present a new model of health care information, distributed medical intelligence, which promotes the development of an integrative medical communication system addressing the process of providing expert medical knowledge to the point of need. The model incorporates audio, video, high-resolution still images, and virtual reality applications into an integrated medical communications network. Three components of the model (care portals, Docking Station, and the bridge) are described. The implementation of this model at the East Carolina University School of Medicine is also outlined. PMID:10165366

  17. Distributed Data Flow Signal Processors

    NASA Astrophysics Data System (ADS)

    Eggert, Jay A.

    1982-12-01

    Near term advances in technology such as VHSIC promise revolutionary progress in programmable signal processor capabilities. However, meeting projected signal processing requirements for radar, sonar and other high throughput systems requires effective multi-processor networks. This paper describes a distributed signal processor architecture currently in development at Texas Instruments that is designed to meet these high-throughput, multi-mode system requirements. The approach supports multiple, functionally specialized, autonomous nodes (processors) interconnected via a flexible, high speed communication network. A common task scheduling mechanism based upon "data flow" concepts provides an efficient high level programming and simulation mechanism. The Ada syntax compatible task level programming and simulation software support tools are also described.
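
    A minimal sketch of the data-flow firing rule that such a scheduler is built on (the node names and graph are invented; this is the generic concept, not Texas Instruments' scheduler): a node fires as soon as a token has arrived on every one of its inputs, so execution order follows data availability rather than a fixed program sequence.

```python
from collections import defaultdict, deque

# Invented four-node graph: two sensors feed a filter, which feeds a detector.
graph = {
    "sensor_a": ["filter"],
    "sensor_b": ["filter"],
    "filter": ["detect"],
    "detect": [],
}

n_inputs = defaultdict(int)                 # how many upstream tokens each node needs
for src, dsts in graph.items():
    for dst in dsts:
        n_inputs[dst] += 1

tokens = defaultdict(int)
ready = deque(node for node in graph if n_inputs[node] == 0)   # source nodes fire first
while ready:
    node = ready.popleft()
    print("firing", node)
    for dst in graph[node]:
        tokens[dst] += 1
        if tokens[dst] == n_inputs[dst]:    # all inputs have arrived: the node may fire
            ready.append(dst)
```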

  18. Concepts for Distributed Engine Control

    NASA Technical Reports Server (NTRS)

    Culley, Dennis E.; Thomas, Randy; Saus, Joseph

    2007-01-01

    Gas turbine engines for aero-propulsion systems are found to be highly optimized machines after over 70 years of development. Still, additional performance improvements are sought while reduction in the overall cost is increasingly a driving factor. Control systems play a vitally important part in these metrics but are severely constrained by the operating environment and the consequences of system failure. The considerable challenges facing future engine control system design have been investigated. A preliminary analysis has been conducted of the potential benefits of distributed control architecture when applied to aero-engines. In particular, reductions in size, weight, and cost of the control system are possible. NASA is conducting research to further explore these benefits, with emphasis on the particular benefits enabled by high temperature electronics and an open-systems approach to standardized communications interfaces.

  19. Embodied and Distributed Parallel DJing.

    PubMed

    Cappelen, Birgitta; Andersson, Anders-Petter

    2016-01-01

    Everyone has a right to take part in cultural events and activities, such as music performances and music making. Enforcing that right, within Universal Design, is often limited to a focus on physical access to public areas, hearing aids etc., or groups of persons with special needs performing in traditional ways. The latter might be people with disabilities performing as musicians on traditional instruments or as actors in theatre. In this paper we focus on the innovative potential of including people with special needs, when creating new cultural activities. In our project RHYME our goal was to create health-promoting activities for children with severe disabilities, by developing new musical and multimedia technologies. Because of the users' extreme demands and rich contributions, we ended up creating both a new genre of musical instruments and a new art form. We call this new art form Embodied and Distributed Parallel DJing, and the new genre of instruments for Empowering Multi-Sensorial Things.

  20. Cooperative Fault Tolerant Distributed Computing

    SciTech Connect

    Fagg, Graham E.

    2006-03-15

    HARNESS was proposed as a system that combined the best of emerging technologies found in current distributed computing research and commercial products into a very flexible, dynamically adaptable framework that could be used by applications to allow them to evolve and better handle their execution environment. The HARNESS system was designed using the considerable experience from previous projects such as PVM, MPI, IceT and Cumulvs. As such, the system was designed to avoid the common problems found in these existing systems: it has no single point of failure and can survive machine, node, and software failures. Additional features included improved inter-component connectivity, with full support for dynamic downloading of additional components at run time, thus reducing the burden on application developers to build in all the libraries they need in advance.