Science.gov

Sample records for non-randomly distributed locations-exemplified

  1. Regulatory Considerations Of Waste Emplacement Within The WIPP Repository: Random Versus Non-Random Distribution

    SciTech Connect

    Casey, S. C.; Patterson, R. L.; Gross, M.; Lickliter, K.; Stein, J. S.

    2003-02-25

    The U.S. Department of Energy (DOE) is responsible for disposing of transuranic waste in the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. As part of that responsibility, DOE must comply with the U.S. Environmental Protection Agency's (EPA) radiation protection standards in Title 40 Code of Federal Regulations (CFR), Parts 191 and 194. This paper addresses compliance with the criteria of 40 CFR Section 194.24(d) and 194.24(f) that require DOE to either provide a waste loading scheme for the WIPP repository or to assume random emplacement in the mandated performance and compliance assessments. The DOE established a position on waste loading schemes during the process of obtaining the EPA's initial Certification in 1998. The justification for utilizing a random waste emplacement distribution within the WIPP repository was provided to the EPA. During the EPA rulemaking process for the initial certification, the EPA questioned DOE on whether waste would be loaded randomly as modeled in long-term performance assessment (PA) and the impact, if any, of nonrandom loading. In response, DOE conducted an impact assessment for non-random waste loading. The results of this assessment supported the contention that it does not matter whether random or non-random waste loading is assumed for the PA. The EPA determined that a waste loading plan was unnecessary because DOE had assumed random waste loading and evaluated the potential consequences of non-random loading for a very high activity waste stream. In other words, the EPA determined that DOE was not required to provide a waste loading scheme because compliance is not affected by the actual distribution of waste containers in the WIPP.

  2. Non-random distribution of DNA double-strand breaks induced by particle irradiation

    NASA Technical Reports Server (NTRS)

    Lobrich, M.; Cooper, P. K.; Rydberg, B.; Chatterjee, A. (Principal Investigator)

    1996-01-01

    Induction of DNA double-strand breaks (dsbs) in mammalian cells is dependent on the spatial distribution of energy deposition from the ionizing radiation. For high LET particle radiations the primary ionization sites occur in a correlated manner along the track of the particles, while for X-rays these sites are much more randomly distributed throughout the volume of the cell. It can therefore be expected that the distribution of dsbs linearly along the DNA molecule also varies with the type of radiation and the ionization density. Using pulsed-field gel and conventional gel techniques, we measured the size distribution of DNA molecules from irradiated human fibroblasts in the total range of 0.1 kbp-10 Mbp for X-rays and high LET particles (N ions, 97 keV/μm, and Fe ions, 150 keV/μm). On a megabase-pair scale we applied conventional pulsed-field gel electrophoresis techniques such as measurement of the fraction of DNA released from the well (FAR) and measurement of breakage within a specific NotI restriction fragment (hybridization assay). The induction rate for widely spaced breaks was found to decrease with LET. However, when the entire distribution of radiation-induced fragments was analysed, we detected an excess of fragments with sizes below about 200 kbp for the particles compared with X-irradiation. X-rays are thus more effective than high LET radiations in producing large DNA fragments but less effective in the production of smaller fragments. We determined the total induction rate of dsbs for the three radiations based on a quantitative analysis of all the measured radiation-induced fragments and found that the high LET particles were more efficient than X-rays at inducing dsbs, indicating an increasing total efficiency with LET. Conventional assays that are based only on the measurement of large fragments are therefore misleading when determining total dsb induction rates of high LET particles. The possible biological significance of this non-randomness
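
    The contrast the abstract draws — correlated ionizations along a particle track yielding an excess of short DNA fragments relative to randomly placed breaks — can be illustrated with a toy simulation. The sketch below (Python, with invented genome length, break numbers and cluster sizes; not the authors' analysis) places the same number of double-strand breaks either uniformly at random or in tight clusters along a model chromosome and compares the resulting fragment-size distributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    GENOME = 100_000_000   # model chromosome length in bp (illustrative)
    N_BREAKS = 100         # total double-strand breaks in both scenarios

    def fragment_sizes(breaks, genome=GENOME):
        """Fragment lengths produced by a set of break positions."""
        cuts = np.concatenate(([0], np.sort(breaks), [genome]))
        return np.diff(cuts)

    # Scenario 1: breaks placed uniformly at random (X-ray-like deposition).
    random_breaks = rng.uniform(0, GENOME, N_BREAKS)

    # Scenario 2: the same number of breaks grouped into clusters
    # (high-LET-like): 20 track "cores", 5 breaks within ~50 kbp of each.
    centres = rng.uniform(0, GENOME, 20)
    clustered_breaks = np.clip(
        np.repeat(centres, 5) + rng.normal(0, 50_000, N_BREAKS), 0, GENOME)

    for label, breaks in [("random", random_breaks),
                          ("clustered", clustered_breaks)]:
        sizes = fragment_sizes(breaks)
        short = np.mean(sizes < 200_000)   # fraction of fragments < 200 kbp
        print(f"{label:9s}  mean fragment {sizes.mean()/1e6:6.2f} Mbp,"
              f"  fraction < 200 kbp: {short:.2f}")
    ```

    The clustered scenario produces many more sub-200 kbp fragments for the same total number of breaks, which is why assays sensitive only to large fragments underestimate break induction by high-LET particles.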

  3. Discovery of Non-random Spatial Distribution of Impacts in the Stardust Cometary Collector

    NASA Technical Reports Server (NTRS)

    Horz, Friedrich; Westphal, Andrew J.; Gainsforth, Zack; Borg, Janet; Djouadi, Zahia; Bridges, John; Franchi, Ian; Brownlee, Donald E.; Cheng, Andrew F.; Clark, Benton C.; Floss, Christine

    2007-01-01

    We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector.

  4. Discovery of Non-random Spatial Distribution of Impacts in the Stardust Cometary Collector

    SciTech Connect

    Westphal, A J; Bastien, R K; Borg, J; Bridges, J; Brownlee, D E; Burchell, M J; Cheng, A F; Clark, B C; Djouadi, Z; Floss, C; Franchi, I; Gainsforth, Z; Graham, G; Green, S F; Heck, P R; Horanyi, M; Hoppe, P; Horz, F P; Huth, J; Kearsley, A; Leroux, H; Marhas, K; Nakamura-Messenger, K; Sandford, S A; See, T H; Stadermann, F J; Teslich, N E; Tsitrin, S; Warren, J L; Wozniakiewicz, P J; Zolensky, M E

    2007-04-06

    We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than ~10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector.

  5. Non-Random Spatial Distribution of Impacts in the Stardust Cometary Collector

    NASA Technical Reports Server (NTRS)

    Westphal, Andrew J.; Bastien, Ronald K.; Borg, Janet; Bridges, John; Brownlee, Donald E.; Burchell, Mark J.; Cheng, Andrew F.; Clark, Benton C.; Djouadi, Zahia; Floss, Christine

    2007-01-01

    In January 2004, the Stardust spacecraft flew through the coma of comet 81P/Wild 2 at a relative speed of 6.1 km/s. Cometary dust was collected in a 0.1 sq m collector consisting of aerogel tiles and aluminum foils. Two years later, the samples were successfully returned to Earth and recovered. We report the discovery that impacts in the Stardust cometary collector are not distributed randomly in the collecting media, but appear to be clustered on scales smaller than approximately 10 cm. We also report the discovery of at least two populations of oblique tracks. We evaluated several hypotheses that could explain the observations. No hypothesis was consistent with all the observations, but the preponderance of evidence points toward at least one impact on the central Whipple shield of the spacecraft as the origin of both clustering and low-angle oblique tracks. High-angle oblique tracks unambiguously originate from a non-cometary impact on the spacecraft bus just forward of the collector. Here we summarize the observations, and review the evidence for and against three scenarios that we have considered for explaining the impact clustering found on the Stardust aerogel and foil collectors.
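
    The clustering claim is at heart a statement about spatial statistics: under complete spatial randomness, impact counts in equal-area cells of the collector should be approximately Poisson, so their variance-to-mean ratio should be near one. The sketch below is only a hedged illustration of such a quadrat-style check, with invented impact coordinates and grid size; it is not the authors' analysis.

    ```python
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(1)

    def dispersion_test(xy, side=0.32, ncells=8):
        """Index-of-dispersion (quadrat) test for complete spatial randomness.

        xy     : (n, 2) impact coordinates on a square collector of width `side` (m)
        ncells : grid is ncells x ncells equal quadrats
        """
        counts, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                                      bins=ncells, range=[[0, side], [0, side]])
        counts = counts.ravel()
        disp = counts.var(ddof=1) / counts.mean()      # ~1 under randomness
        stat = disp * (counts.size - 1)                # ~ chi^2 with k-1 dof
        return disp, chi2.sf(stat, counts.size - 1)

    # Illustrative data only: 60 uniformly random "impacts" versus 60 clustered ones.
    uniform = rng.uniform(0, 0.32, (60, 2))
    clustered = np.clip(rng.uniform(0, 0.32, (6, 2)).repeat(10, axis=0)
                        + rng.normal(0, 0.02, (60, 2)), 0, 0.32)

    print("uniform  :", dispersion_test(uniform))
    print("clustered:", dispersion_test(clustered))
    ```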

  6. Non-random cation distribution in hexagonal Al 0.5Ga 0.5PO 4

    NASA Astrophysics Data System (ADS)

    Kulshreshtha, S. K.; Jayakumar, O. D.; Sudarsan, V.

    2010-05-01

    Based on powder X-ray diffraction and 31P Magic Angle Spinning Nuclear Magnetic Resonance (MAS NMR) investigations of the mixed phosphate Al0.5Ga0.5PO4, prepared by a co-precipitation method followed by annealing at 900 °C for 24 h, it is shown that the Al0.5Ga0.5PO4 phase crystallizes in hexagonal form with lattice parameters a=0.491(2) nm and c=1.106(4) nm. This hexagonal phase of Al0.5Ga0.5PO4 is similar to that of pure GaPO4. The 31P MAS NMR spectrum of the mixed phosphate sample consists of five peaks with systematically varying chemical shift values, arising from the existence of P structural units having varying numbers of Al3+/Ga3+ cations as next-nearest neighbors in the solid solution. Based on the intensity analysis of the component NMR spectra of Al0.5Ga0.5PO4, it is inferred that the distribution of Al3+ and Ga3+ cations is non-random in the hexagonal Al0.5Ga0.5PO4 sample, although the XRD patterns showed well-defined solid-solution formation.
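
    The five-peak 31P pattern reflects P sites with 0–4 Al next-nearest cation neighbours; for a fully random Al/Ga distribution at x = 0.5 the five peak intensities would follow a binomial law, and deviations from it are what the intensity analysis detects. A minimal sketch of that comparison is given below; the "observed" intensities are hypothetical, not the published values, and the binomial null is the standard assumption rather than a detail given in the abstract.

    ```python
    from math import comb

    x = 0.5            # Al fraction in Al0.5Ga0.5PO4
    n_neighbours = 4   # next-nearest cation neighbours of each P site

    # Expected peak intensities for a random cation distribution: P(k Al neighbours)
    expected = [comb(n_neighbours, k) * x**k * (1 - x)**(n_neighbours - k)
                for k in range(n_neighbours + 1)]

    # Hypothetical deconvoluted NMR intensities (NOT the paper's data),
    # skewed toward like-neighbour environments to mimic non-random mixing.
    observed = [0.12, 0.18, 0.30, 0.22, 0.18]

    for k, (e, o) in enumerate(zip(expected, observed)):
        print(f"{k} Al neighbours: expected {e:.3f}, observed {o:.2f}")
    # Systematic deviation of the observed intensities from the binomial values
    # is the kind of evidence used to infer a non-random Al/Ga distribution.
    ```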

  7. Links between fear of humans, stress and survival support a non-random distribution of birds among urban and rural habitats

    PubMed Central

    Rebolo-Ifrán, Natalia; Carrete, Martina; Sanz-Aguilar, Ana; Rodríguez-Martínez, Sol; Cabezas, Sonia; Marchant, Tracy A.; Bortolotti, Gary R.; Tella, José L.

    2015-01-01

    Urban endocrine ecology aims to understand how organisms cope with new sources of stress and maintain allostatic load to thrive in an increasingly urbanized world. Recent research efforts have yielded controversial results based on short-term measures of stress, without exploring its fitness effects. We measured feather corticosterone (CORTf, reflecting the duration and amplitude of glucocorticoid secretion over several weeks) and subsequent annual survival in urban and rural burrowing owls. This species shows high individual consistency in fear of humans (i.e., flight initiation distance, FID), allowing us to hypothesize that individuals distribute themselves among habitats according to their tolerance of human disturbance. FIDs were shorter in urban than in rural birds, but CORTf levels did not differ, nor were they correlated with FIDs. Survival was twice as high in urban as in rural birds, and links with CORTf varied between habitats: while a quadratic relationship supports stabilizing selection in urban birds, high predation rates may have masked the CORTf-survival relationship in rural ones. These results indicate that urban life does not constitute an additional source of stress for urban individuals, as shown by their near-identical CORTf values compared with rural conspecifics, supporting the non-random distribution of individuals among habitats according to their behavioural phenotypes. PMID:26348294

  8. Links between fear of humans, stress and survival support a non-random distribution of birds among urban and rural habitats.

    PubMed

    Rebolo-Ifrán, Natalia; Carrete, Martina; Sanz-Aguilar, Ana; Rodríguez-Martínez, Sol; Cabezas, Sonia; Marchant, Tracy A; Bortolotti, Gary R; Tella, José L

    2015-01-01

    Urban endocrine ecology aims to understand how organisms cope with new sources of stress and maintain allostatic load to thrive in an increasingly urbanized world. Recent research efforts have yielded controversial results based on short-term measures of stress, without exploring its fitness effects. We measured feather corticosterone (CORTf, reflecting the duration and amplitude of glucocorticoid secretion over several weeks) and subsequent annual survival in urban and rural burrowing owls. This species shows high individual consistency in fear of humans (i.e., flight initiation distance, FID), allowing us to hypothesize that individuals distribute themselves among habitats according to their tolerance of human disturbance. FIDs were shorter in urban than in rural birds, but CORTf levels did not differ, nor were they correlated with FIDs. Survival was twice as high in urban as in rural birds, and links with CORTf varied between habitats: while a quadratic relationship supports stabilizing selection in urban birds, high predation rates may have masked the CORTf-survival relationship in rural ones. These results indicate that urban life does not constitute an additional source of stress for urban individuals, as shown by their near-identical CORTf values compared with rural conspecifics, supporting the non-random distribution of individuals among habitats according to their behavioural phenotypes. PMID:26348294

  9. Impact of the β-Lactam Resistance Modifier (−)-Epicatechin Gallate on the Non-Random Distribution of Phospholipids across the Cytoplasmic Membrane of Staphylococcus aureus

    PubMed Central

    Rosado, Helena; Turner, Robert D.; Foster, Simon J.; Taylor, Peter W.

    2015-01-01

    The polyphenol (−)-epicatechin gallate (ECg) inserts into the cytoplasmic membrane (CM) of methicillin-resistant Staphylococcus aureus (MRSA) and reversibly abrogates resistance to β-lactam antibiotics. ECg elicits an increase in MRSA cell size and induces thickened cell walls. As ECg partially delocalizes penicillin-binding protein PBP2 from the septal division site, reduces PBP2 and PBP2a complexation and induces CM remodelling, we examined the impact of ECg membrane intercalation on phospholipid distribution across the CM and determined whether ECg affects the equatorial, orthogonal mode of division. The major phospholipids of the staphylococcal CM, lysylphosphatidylglycerol (LPG), phosphatidylglycerol (PG), and cardiolipin (CL), were distributed in a highly asymmetric fashion; 95%–97% of LPG was associated with the inner leaflet whereas PG (~90%) and CL (~80%) were found predominantly in the outer leaflet. ECg elicited small but significant changes in LPG distribution. Atomic force microscopy established that ECg-exposed cells divided in a similar fashion to control bacteria, with a thickened band of encircling peptidoglycan representing the most recent plane of cell division, less distinct ribs indicative of previous sites of orthogonal division, and concentric rings and "knobbles" representing stages of peptidoglycan remodelling during the cell cycle. Preservation of staphylococcal membrane lipid asymmetry and of the mode of division in sequential orthogonal planes appear to be key features of ECg-induced stress. PMID:26213914

  10. Non-random patterns in viral diversity

    PubMed Central

    Anthony, Simon J.; Islam, Ariful; Johnson, Christine; Navarrete-Macias, Isamara; Liang, Eliza; Jain, Komal; Hitchens, Peta L.; Che, Xiaoyu; Soloyvov, Alexander; Hicks, Allison L.; Ojeda-Flores, Rafael; Zambrana-Torrelio, Carlos; Ulrich, Werner; Rostal, Melinda K.; Petrosov, Alexandra; Garcia, Joel; Haider, Najmul; Wolfe, Nathan; Goldstein, Tracey; Morse, Stephen S.; Rahman, Mahmudur; Epstein, Jonathan H.; Mazet, Jonna K.; Daszak, Peter; Lipkin, W. Ian

    2015-01-01

    It is currently unclear whether changes in viral communities will ever be predictable. Here we investigate whether viral communities in wildlife are inherently structured (implying predictability) by looking at whether communities are assembled through deterministic (often predictable) or stochastic (not predictable) processes. We sample macaque faeces across nine sites in Bangladesh and use consensus PCR and sequencing to discover 184 viruses from 14 viral families. We then use network modelling and statistical null-hypothesis testing to show the presence of non-random deterministic patterns at different scales, between sites and within individuals. We show that the effects of determinism are not absolute, however, as stochastic patterns are also observed. In showing that determinism is an important process in viral community assembly, we conclude that it should be possible to forecast changes to some portion of a viral community; however, there will always be some portion for which prediction will be unlikely. PMID:26391192
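
    The "statistical null-hypothesis testing" step is the crux of the deterministic-versus-stochastic argument: an observed community statistic is compared with its distribution under randomized communities. As a hedged illustration (not the authors' pipeline), the sketch below compares a checkerboard-style co-occurrence score of a toy site-by-virus matrix against shuffles that preserve each virus's overall prevalence.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def c_score(m):
        """Mean checkerboard score over virus pairs (higher = more segregation)."""
        scores = []
        for i in range(m.shape[0]):
            for j in range(i + 1, m.shape[0]):
                shared = np.sum(m[i] & m[j])
                ri, rj = m[i].sum(), m[j].sum()
                scores.append((ri - shared) * (rj - shared))
        return np.mean(scores)

    # Toy presence/absence matrix: 6 "viruses" (rows) x 9 "sites" (columns).
    obs = rng.integers(0, 2, size=(6, 9))
    observed = c_score(obs)

    # Null model: shuffle each virus across sites, preserving its prevalence.
    null = [c_score(np.array([rng.permutation(row) for row in obs]))
            for _ in range(999)]
    p = (np.sum(np.array(null) >= observed) + 1) / (len(null) + 1)
    print(f"observed C-score {observed:.2f}, permutation p = {p:.3f}")
    ```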

  11. Interval process model and non-random vibration analysis

    NASA Astrophysics Data System (ADS)

    Jiang, C.; Ni, B. Y.; Liu, N. Y.; Han, X.; Liu, J.

    2016-07-01

    This paper develops an interval process model for time-varying or dynamic uncertainty analysis when information on the uncertain parameter is inadequate. By using the interval process model to describe a time-varying uncertain parameter, only its upper and lower bounds are required at each time point rather than its precise probability distribution, which is quite different from the traditional stochastic process model. A correlation function is defined for quantification of correlation between the uncertain-but-bounded variables at different times, and a matrix-decomposition-based method is presented to transform the original dependent interval process into an independent one for convenience of subsequent uncertainty analysis. More importantly, based on the interval process model, a non-random vibration analysis method is proposed for response computation of structures subjected to time-varying uncertain external excitations or loads. The structural dynamic responses thus can be derived in the form of upper and lower bounds, providing important guidance for practical safety analysis and reliability design of structures. Finally, two numerical examples and one engineering application are investigated to demonstrate the feasibility of the interval process model and corresponding non-random vibration analysis method.
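
    The key idea is that the excitation is known only through lower and upper bound functions of time, and the analysis seeks bounds on the response. The paper derives these bounds analytically via the correlation function and matrix decomposition; the sketch below is only a brute-force illustration of the same idea (all parameter values invented), sampling excitations inside the bounds and enveloping the responses of a single-degree-of-freedom oscillator.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Interval process: lower/upper bounds of an uncertain load at each time step.
    dt, T = 0.01, 5.0
    t = np.arange(0, T, dt)
    lower = 0.8 * np.sin(2 * np.pi * t) - 0.2
    upper = 1.2 * np.sin(2 * np.pi * t) + 0.2

    def sdof_response(force, m=1.0, c=0.4, k=40.0):
        """Explicit time stepping of m*x'' + c*x' + k*x = F(t) (illustrative only)."""
        x = v = 0.0
        out = np.empty_like(force)
        for i, f in enumerate(force):
            a = (f - c * v - k * x) / m
            v += a * dt
            x += v * dt
            out[i] = x
        return out

    # Brute-force inner sampling of the interval process (mixtures of the two
    # bounds); the analytical response bounds would enclose this envelope.
    responses = []
    for _ in range(200):
        w = np.clip(rng.normal(0.5, 0.3), 0, 1)        # constant mixing weight
        responses.append(sdof_response(w * lower + (1 - w) * upper))
    responses = np.array(responses)

    i = int(2.5 / dt)
    print("empirical response bounds at t = 2.5 s:",
          responses[:, i].min(), responses[:, i].max())
    ```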

  12. Plasticity-Driven Self-Organization under Topological Constraints Accounts for Non-random Features of Cortical Synaptic Wiring

    PubMed Central

    Miner, Daniel; Triesch, Jochen

    2016-01-01

    Understanding the structure and dynamics of cortical connectivity is vital to understanding cortical function. Experimental data strongly suggest that local recurrent connectivity in the cortex is significantly non-random, exhibiting, for example, above-chance bidirectionality and an overrepresentation of certain triangular motifs. Additional evidence suggests a significant distance dependency to connectivity over a local scale of a few hundred microns, and particular patterns of synaptic turnover dynamics, including a heavy-tailed distribution of synaptic efficacies, a power law distribution of synaptic lifetimes, and a tendency for stronger synapses to be more stable over time. Understanding how many of these non-random features simultaneously arise would provide valuable insights into the development and function of the cortex. While previous work has modeled some of the individual features of local cortical wiring, there is no model that begins to comprehensively account for all of them. We present a spiking network model of a rodent Layer 5 cortical slice which, via the interactions of a few simple biologically motivated intrinsic, synaptic, and structural plasticity mechanisms, qualitatively reproduces these non-random effects when combined with simple topological constraints. Our model suggests that mechanisms of self-organization arising from a small number of plasticity rules provide a parsimonious explanation for numerous experimentally observed non-random features of recurrent cortical wiring. Interestingly, similar mechanisms have been shown to endow recurrent networks with powerful learning abilities, suggesting that these mechanisms are central to understanding both structure and function of cortical synaptic wiring. PMID:26866369

  13. Plasticity-Driven Self-Organization under Topological Constraints Accounts for Non-random Features of Cortical Synaptic Wiring.

    PubMed

    Miner, Daniel; Triesch, Jochen

    2016-02-01

    Understanding the structure and dynamics of cortical connectivity is vital to understanding cortical function. Experimental data strongly suggest that local recurrent connectivity in the cortex is significantly non-random, exhibiting, for example, above-chance bidirectionality and an overrepresentation of certain triangular motifs. Additional evidence suggests a significant distance dependency to connectivity over a local scale of a few hundred microns, and particular patterns of synaptic turnover dynamics, including a heavy-tailed distribution of synaptic efficacies, a power law distribution of synaptic lifetimes, and a tendency for stronger synapses to be more stable over time. Understanding how many of these non-random features simultaneously arise would provide valuable insights into the development and function of the cortex. While previous work has modeled some of the individual features of local cortical wiring, there is no model that begins to comprehensively account for all of them. We present a spiking network model of a rodent Layer 5 cortical slice which, via the interactions of a few simple biologically motivated intrinsic, synaptic, and structural plasticity mechanisms, qualitatively reproduces these non-random effects when combined with simple topological constraints. Our model suggests that mechanisms of self-organization arising from a small number of plasticity rules provide a parsimonious explanation for numerous experimentally observed non-random features of recurrent cortical wiring. Interestingly, similar mechanisms have been shown to endow recurrent networks with powerful learning abilities, suggesting that these mechanisms are central to understanding both structure and function of cortical synaptic wiring. PMID:26866369

  14. Non-random DNA fragmentation in next-generation sequencing

    PubMed Central

    Poptsova, Maria S.; Il'icheva, Irina A.; Nechipurenko, Dmitry Yu.; Panchenko, Larisa A.; Khodikov, Mingian V.; Oparina, Nina Y.; Polozov, Robert V.; Nechipurenko, Yury D.; Grokhovsky, Sergei L.

    2014-01-01

    Next Generation Sequencing (NGS) technology is based on cutting DNA into small fragments and sequencing them in a massively parallel fashion. The multiple overlapping segments termed "reads" are assembled into a contiguous sequence. To reduce sequencing errors, every genome region should be sequenced several dozen times. This sequencing approach is based on the assumption that genomic DNA breaks are random and sequence-independent. However, we previously showed that for sonicated restriction DNA fragments the rates of double-stranded breaks depend on the nucleotide sequence. In this work we analyzed genomic reads from NGS data and discovered that fragmentation methods based on the action of hydrodynamic forces on DNA produce a similar bias. Consideration of this non-random DNA fragmentation may help unravel which factors, and to what extent, influence the non-uniform coverage of various genomic regions. PMID:24681819
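
    Sequence-dependent fragmentation shows up as non-uniform base composition around read starts: under truly random breakage, every position near the fragmentation point should match the genome-wide composition. The sketch below is a hedged illustration of that kind of tally on a toy reference with artificially biased break points; it is not the authors' data or pipeline, where read coordinates would come from mapped BAM files.

    ```python
    from collections import Counter
    import random

    random.seed(4)

    # Toy reference and toy read-start positions; half of the starts are forced
    # onto 'G' to mimic a sequence-dependent fragmentation bias.
    ref = "".join(random.choice("ACGT") for _ in range(10_000))
    starts = [i for i in range(5, len(ref) - 5) if ref[i] == "G"][:500] \
             + random.sample(range(5, len(ref) - 5), 500)

    # Tally base composition at each position in a +/-3 bp window around the
    # fragmentation point; random breakage would reproduce the reference
    # composition in every column.
    window = range(-3, 4)
    counts = {w: Counter() for w in window}
    for s in starts:
        for w in window:
            counts[w][ref[s + w]] += 1

    background = Counter(ref)
    total = sum(background.values())
    for w in window:
        n = sum(counts[w].values())
        line = " ".join(f"{b}:{counts[w][b]/n:.2f}" for b in "ACGT")
        print(f"pos {w:+d}  {line}")
    print("reference " + " ".join(f"{b}:{background[b]/total:.2f}" for b in "ACGT"))
    ```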

  15. Non-random DNA fragmentation in next-generation sequencing

    NASA Astrophysics Data System (ADS)

    Poptsova, Maria S.; Il'Icheva, Irina A.; Nechipurenko, Dmitry Yu.; Panchenko, Larisa A.; Khodikov, Mingian V.; Oparina, Nina Y.; Polozov, Robert V.; Nechipurenko, Yury D.; Grokhovsky, Sergei L.

    2014-03-01

    Next Generation Sequencing (NGS) technology is based on cutting DNA into small fragments and sequencing them in a massively parallel fashion. The multiple overlapping segments termed "reads" are assembled into a contiguous sequence. To reduce sequencing errors, every genome region should be sequenced several dozen times. This sequencing approach is based on the assumption that genomic DNA breaks are random and sequence-independent. However, we previously showed that for sonicated restriction DNA fragments the rates of double-stranded breaks depend on the nucleotide sequence. In this work we analyzed genomic reads from NGS data and discovered that fragmentation methods based on the action of hydrodynamic forces on DNA produce a similar bias. Consideration of this non-random DNA fragmentation may help unravel which factors, and to what extent, influence the non-uniform coverage of various genomic regions.

  16. Reducing bias in survival under non-random temporary emigration

    USGS Publications Warehouse

    Peñaloza, Claudia L.; Kendall, William L.; Langtimm, Catherine Ann

    2014-01-01

    Despite intensive monitoring, temporary emigration from the sampling area can induce bias severe enough for managers to discard life-history parameter estimates toward the terminus of the time series (terminal bias). Under random temporary emigration, unbiased parameters can be estimated with CJS models. However, unmodeled Markovian temporary emigration causes bias in parameter estimates, and an unobservable state is required to model this type of emigration. The robust design is most flexible when modeling temporary emigration, and partial solutions to mitigate bias have been identified; nonetheless, there are conditions where terminal bias prevails. Long-lived species with high adult survival and highly variable non-random temporary emigration present terminal bias in survival estimates, despite being modeled with the robust design and suggested constraints. Because this bias is due to uncertainty about the fate of individuals that are undetected toward the end of the time series, solutions should involve using additional information on the survival status or location of these individuals at that time. Using simulation, we evaluated the performance of models that jointly analyze robust design data and an additional source of ancillary data (predictive covariate on temporary emigration, telemetry, dead recovery, or auxiliary resightings) in reducing terminal bias in survival estimates. The auxiliary resighting and predictive covariate models reduced terminal bias the most. Additional telemetry data were effective at reducing terminal bias only when individuals were tracked for a minimum of two years. High adult survival of long-lived species made the joint model with recovery data ineffective at reducing terminal bias because of small-sample bias. The naïve constraint model (last and penultimate temporary emigration parameters made equal) was the least efficient, though still able to reduce terminal bias when compared to an unconstrained model. Joint analysis of several

  17. Non random usage of T cell receptor alpha gene expression in atopy using anchored PCR.

    PubMed

    Mansur, A H; Gelder, C M; Holland, D; Campell, D A; Griffin, A; Cunliffe, W; Markham, A F; Morrison, J F

    1996-01-01

    The T cell receptor (TCR) alpha beta heterodimer recognises antigenic peptide fragments presented by Class II MHC. This interaction initiates T cell activation and cytokine release with subsequent recruitment of inflammatory cells. Previous work from our group suggests a qualitative difference in variable alpha gene expression in atopy as compared to non-atopic controls. In this study we examine the TCR alpha repertoire using anchored PCR to provide a quantitative assessment of V alpha and J alpha usage. One atopic (DRB1*0701,DRB1*15: DRB4*0101, DRB5*01: DQB1* 0303, DQB1*601/2) and one non-atopic (DRB1*0701,DRB1*03011/2: DRB4*01, DRB3*0x: DQB1* 0303, DQB1*0201/2) control were studied. Variable gene usage was markedly limited in the atopic individual. V alpha 1, 3 and 8 accounted for 60%, and J alpha 12 and 31 for 30%, of the gene usage. There was evidence of preferential V alpha-J alpha gene pairing and clonal expansion. We conclude that there is a markedly non-random TCR alpha gene distribution in atopy, using both V alpha family and anchored PCR approaches. This may be due in part to antigen-driven clonal expansion. PMID:9095269

  18. Human promoter genomic composition demonstrates non-random groupings that reflect general cellular function

    PubMed Central

    McNutt, Markey C; Tongbai, Ron; Cui, Wenwu; Collins, Irene; Freebern, Wendy J; Montano, Idalia; Haggerty, Cynthia M; Chandramouli, GVR; Gardner, Kevin

    2005-01-01

    Background: The purpose of this study is to determine whether or not there exists non-random grouping of cis-regulatory elements within gene promoters that can be perceived independent of gene expression data, and whether or not there is any correlation between this grouping and the biological function of the gene. Results: Using ProSpector, a web-based promoter search and annotation tool, we have applied an unbiased approach to analyze the transcription factor binding site frequencies of 1400 base pair genomic segments positioned at 1200 base pairs upstream and 200 base pairs downstream of the transcriptional start site of 7298 commonly studied human genes. Partitional clustering of the transcription factor binding site composition within these promoter segments reveals a small number of gene groups that are selectively enriched for gene ontology terms consistent with distinct aspects of cellular function. Significance ranking of the class-determining transcription factor binding sites within these clusters shows substantial overlap between the gene ontology terms of the transcription factors associated with the binding sites and the gene ontology terms of the regulated genes within each group. Conclusion: Thus, gene sorting by promoter composition alone produces partitions in which the "regulated" and the "regulators" cosegregate into similar functional classes. These findings demonstrate that the transcription factor binding site composition is non-randomly distributed between gene promoters in a manner that reflects and partially defines general gene class function. PMID:16232321
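
    "Partitional clustering of the transcription factor binding site composition" amounts to clustering promoters by their binding-site frequency vectors. The minimal sketch below runs k-means on synthetic frequency vectors with scikit-learn; ProSpector itself, the real binding-site counts, and the gene ontology enrichment step are not reproduced here, and all data are invented.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(5)

    # Synthetic data: 300 "promoters" x 50 "binding-site" frequencies drawn from
    # three latent groups with different site-usage profiles (a stand-in for
    # real ProSpector counts).
    profiles = rng.gamma(2.0, 1.0, size=(3, 50))
    labels_true = rng.integers(0, 3, 300)
    X = rng.poisson(profiles[labels_true]).astype(float)
    X /= X.sum(axis=1, keepdims=True)          # per-promoter site composition

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

    # Sites that most strongly separate one cluster from the rest play the role
    # of the "class-determining" binding sites in the abstract.
    for c in range(3):
        centre = km.cluster_centers_[c]
        others = km.cluster_centers_[np.arange(3) != c].mean(axis=0)
        top = np.argsort(centre - others)[::-1][:5]
        print(f"cluster {c}: {np.sum(km.labels_ == c)} promoters, "
              f"top discriminating sites {top.tolist()}")
    ```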

  19. Functional Redundancy Patterns Reveal Non-Random Assembly Rules in a Species-Rich Marine Assemblage

    PubMed Central

    Guillemot, Nicolas; Kulbicki, Michel; Chabanet, Pascale; Vigliola, Laurent

    2011-01-01

    The relationship between species and the functional diversity of assemblages is fundamental in ecology because it contains key information on functional redundancy, and functionally redundant ecosystems are thought to be more resilient, resistant and stable. However, this relationship is poorly understood and undocumented for species-rich coastal marine ecosystems. Here, we used underwater visual censuses to examine the patterns of functional redundancy for one of the most diverse vertebrate assemblages, the coral reef fishes of New Caledonia, South Pacific. First, we found that the relationship between functional and species diversity displayed a non-asymptotic power-shaped curve, implying that rare functions and species mainly occur in highly diverse assemblages. Second, we showed that the distribution of species amongst possible functions was significantly different from a random distribution up to a threshold of ∼90 species/transect. Redundancy patterns for each function further revealed that some functions displayed fast rates of increase in redundancy at low species diversity, whereas others were only becoming redundant past a certain threshold. This suggested non-random assembly rules and the existence of some primordial functions that would need to be fulfilled as a priority for coral reef fish assemblages to gain a basic ecological structure. Last, we found little effect of habitat on the shape of the functional-species diversity relationship and on the redundancy of functions, although habitat is known to largely determine assemblage characteristics such as species composition, biomass, and abundance. Our study shows that low functional redundancy is characteristic of this highly diverse fish assemblage, and, therefore, that even species-rich ecosystems such as coral reefs may be vulnerable to the removal of a few keystone species. PMID:22039543

  20. Identification of non-random sequence properties in groups of signature peptides obtained in random sequence peptide microarray experiments.

    PubMed

    Kuznetsov, Igor B

    2016-05-01

    Immunosignaturing is an emerging experimental technique that uses random sequence peptide microarrays to detect antibodies produced by the immune system in response to a particular disease. Two important questions regarding immunosignaturing are "Do microarray peptides that exhibit a strong affinity to a given type of antibodies share common sequence properties?" and "If so, what are those properties?" In this work, three statistical tests designed to detect non-random patterns in the amino acid makeup of a group of microarray peptides are presented. One test detects patterns of significantly biased amino acid usage, whereas the other two detect patterns of significant bias in the biochemical properties. These tests do not require a large number of peptides per group. The tests were applied to analyze 19 groups of peptides identified in immunosignaturing experiments as being specific for antibodies produced in response to various types of cancer and other diseases. The positional distribution of the biochemical properties of the amino acids in these 19 peptide groups was also studied. Remarkably, despite the random nature of the sequence libraries used to design the microarrays, a unique group-specific non-random pattern was identified in the majority of the peptide groups studied. PMID:27037995
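
    The first kind of test — detecting significantly biased amino acid usage in a small group of peptides — can be sketched as a goodness-of-fit of the group's residue counts against the composition of the random library. This is only an illustration under assumed (uniform) library frequencies and invented peptides, not the paper's exact statistic, which is designed to behave well for small groups.

    ```python
    from collections import Counter
    from scipy.stats import chisquare

    AA = "ACDEFGHIKLMNPQRSTVWY"

    # Hypothetical signature peptides for one antibody group (invented sequences).
    peptides = ["GWWRSLKPH", "WRWGGHKLS", "KWWRPSHGG", "RWGWHKSLP"]

    # Assume the random library draws the 20 residues uniformly; a real analysis
    # would use the library's actual design frequencies.
    counts = Counter("".join(peptides))
    observed = [counts[a] for a in AA]
    n = sum(observed)
    expected = [n / len(AA)] * len(AA)

    # With so few peptides the chi-square is only a rough screen; permutation
    # against random library peptides would be the more careful alternative.
    stat, p = chisquare(observed, expected)
    print(f"chi-square = {stat:.1f}, p = {p:.3g}")
    print("most over-used residues:",
          sorted(AA, key=lambda a: counts[a], reverse=True)[:3])
    ```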

  1. Non-random structures in universal compression and the Fermi paradox

    NASA Astrophysics Data System (ADS)

    Gurzadyan, A. V.; Allahverdyan, A. E.

    2016-02-01

    We study the hypothesis of information panspermia, recently proposed among possible solutions of the Fermi paradox ("where are the aliens?"). It suggests that the expense of alien signaling can be significantly reduced if the messages contain compressed information. To this end we consider universal compression and decoding mechanisms (e.g., the Lempel-Ziv-Welch algorithm) that can reveal non-random structures in compressed bit strings. The efficiency of the Kolmogorov stochasticity parameter for detection of non-randomness is illustrated, along with Zipf's law. The universality of these methods, i.e. their independence from data details, can be of principal importance in searching for intelligent messages.
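
    The detection idea is that a universal compressor such as Lempel-Ziv-Welch shortens structured strings but cannot shorten (nearly) random ones. The sketch below implements a bare-bones LZW encoder and compares the code counts for a random bit string and a highly regular one; it illustrates the principle only, not the authors' Kolmogorov-parameter or Zipf analyses, and the example strings are invented.

    ```python
    import random

    def lzw_encode(s):
        """Return the list of LZW code words for string s (dictionary seeded with its symbols)."""
        dictionary = {ch: i for i, ch in enumerate(sorted(set(s)))}
        w, codes = "", []
        for ch in s:
            if w + ch in dictionary:
                w += ch
            else:
                codes.append(dictionary[w])
                dictionary[w + ch] = len(dictionary)
                w = ch
        if w:
            codes.append(dictionary[w])
        return codes

    random.seed(6)
    random_bits = "".join(random.choice("01") for _ in range(4000))
    structured_bits = "0110" * 1000                  # strongly patterned string

    for label, s in [("random", random_bits), ("structured", structured_bits)]:
        codes = lzw_encode(s)
        print(f"{label:10s}: {len(s)} symbols -> {len(codes)} LZW codes")
    # A message whose LZW output is much shorter than the input is a candidate
    # for carrying non-random (and hence potentially intelligent) structure.
    ```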

  2. The quality of control groups in non-randomized studies published in Journal of Hand Surgery

    PubMed Central

    Johnson, Shepard P.; Malay, Sunitha; Chung, Kevin C.

    2016-01-01

    Purpose: To evaluate control group selection in non-randomized studies published in the Journal of Hand Surgery (American volume; JHS). Methods: We reviewed all papers published in JHS in 2013 to identify studies that used non-randomized control groups. Data collected included the type of study design and control group characteristics. We then appraised studies to determine if authors discussed confounding and selection bias and how they controlled for confounding. Results: Thirty-seven non-randomized studies were published in JHS in 2013. The source of controls was either the same institution as the study group, a different institution, a database, or not provided in the manuscript. Twenty-nine (78%) studies statistically compared key characteristics between the control and study groups. Confounding was controlled with matching, exclusion criteria, or regression analysis. Twenty-two (59%) papers explicitly discussed the threat of confounding and 18 (49%) identified sources of selection bias. Conclusions: In our review of non-randomized studies published in JHS, papers had well-defined controls that were similar to the study group, allowing for reasonable comparisons. However, we identified substantial confounding and bias that were not addressed as explicit limitations, which might lead the reader to overestimate the scientific validity of the data. Clinical relevance: Incorporating a brief discussion of control group selection in scientific manuscripts should help readers interpret the study more appropriately. Authors, reviewers, and editors should strive to address this component of clinical importance. PMID:25447000

  3. Bayesian hierarchical modeling for a non-randomized, longitudinal fall prevention trial with spatially correlated observations

    PubMed Central

    Murphy, T. E.; Allore, H. G.; Leo-Summers, L.; Carlin, B. P.

    2012-01-01

    Because randomization of participants is often not feasible in community-based health interventions, non-randomized designs are commonly employed. Non-randomized designs may have experimental units that are spatial in nature, such as zip codes that are characterized by aggregate statistics from sources like the U.S. census and the Centers for Medicare and Medicaid Services. A perennial concern with non-randomized designs is that even after careful balancing of influential covariates, bias may arise from unmeasured factors. In addition to facilitating the analysis of interventional designs based on spatial units, Bayesian hierarchical modeling can quantify unmeasured variability with spatially correlated residual terms. Graphical analysis of these spatial residuals demonstrates whether variability from unmeasured covariates is likely to bias the estimates of interventional effect. The Connecticut Collaboration for Fall Prevention is the first large-scale longitudinal trial of a community-wide healthcare intervention designed to prevent injurious falls in older adults. Over a two-year evaluation phase, this trial demonstrated a rate of fall-related utilization at hospitals and emergency departments by persons 70 years and older in the intervention area that was 11 per cent less than that of the usual care area, and a 9 per cent lower rate of utilization from serious injuries. We describe the Bayesian hierarchical analysis of this non-randomized intervention with emphasis on its spatial and longitudinal characteristics. We also compare several models, using posterior predictive simulations and maps of spatial residuals. PMID:21294148

  4. 42 CFR 421.505 - Termination and extension of non-random prepayment complex medical review.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... complex medical review if a provider or supplier stops billing the code under review, shifts billing to... medical records, or engages in any other improper claims or billing-related activity to avoid non-random... complex medical review. 421.505 Section 421.505 Public Health CENTERS FOR MEDICARE & MEDICAID...

  5. 42 CFR 421.505 - Termination and extension of non-random prepayment complex medical review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... prepayment complex medical review if a provider or supplier stops billing the code under review, shifts... requests for medical records, or engages in any other improper claims or billing-related activity to avoid... billing error are no longer suspended for non-random prepayment complex medical review. (d) Periodic...

  6. 42 CFR 421.505 - Termination and extension of non-random prepayment complex medical review.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... complex medical review if a provider or supplier stops billing the code under review, shifts billing to... medical records, or engages in any other improper claims or billing-related activity to avoid non-random... complex medical review. 421.505 Section 421.505 Public Health CENTERS FOR MEDICARE & MEDICAID...

  7. Non-random distribution of homo-repeats: links with biological functions and human diseases.

    PubMed

    Lobanov, Michail Yu; Klus, Petr; Sokolovsky, Igor V; Tartaglia, Gian Gaetano; Galzitskaya, Oxana V

    2016-01-01

    The biological function of multiple repetitions of single amino acids, or homo-repeats, is largely unknown, but their occurrence in proteins has been associated with more than 20 hereditary diseases. Analysing 122 bacterial and eukaryotic genomes, we observed that the number of proteins containing homo-repeats is significantly larger than expected from theoretical estimates. Analysis of statistical significance indicates that the minimal size of homo-repeats varies with amino acid type and proteome. In an attempt to characterize proteins harbouring long homo-repeats, we found that those containing polar or small amino acids S, P, H, E, D, K, Q and N are enriched in structural disorder as well as protein- and RNA-interactions. We observed that E, S, Q, G, L, P, D, A and H homo-repeats are strongly linked with occurrence in human diseases. Moreover, S, E, P, A, Q, D and T homo-repeats are significantly enriched in neuronal proteins associated with autism and other disorders. We release a webserver for further exploration of homo-repeats occurrence in human pathology at http://bioinfo.protres.ru/hradis/. PMID:27256590
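
    The statement that homo-repeats occur more often than "expected from theoretical estimates" rests on a null model in which residues occur independently with the proteome's amino acid frequencies. A minimal sketch of that comparison on a toy proteome follows; the sequences, the poly-Q insertions and the shuffle-based null are illustrative assumptions, not the 122-genome analysis of the paper.

    ```python
    import random
    import re

    random.seed(7)

    AA = "ACDEFGHIKLMNPQRSTVWY"

    def count_homorepeats(seqs, aa, k):
        """Number of maximal runs of residue `aa` with length >= k."""
        pattern = re.compile(f"(?<!{aa}){aa}{{{k},}}(?!{aa})")
        return sum(len(pattern.findall(s)) for s in seqs)

    # Toy "proteome": 200 random sequences plus a handful with engineered poly-Q,
    # mimicking the biological excess the paper reports.
    proteome = ["".join(random.choice(AA) for _ in range(400)) for _ in range(200)]
    proteome += [s[:100] + "Q" * 8 + s[100:] for s in proteome[:10]]

    # Null expectation: shuffle every sequence (preserves composition, destroys
    # runs) and count again; repeat a few times for a rough mean.
    observed = count_homorepeats(proteome, "Q", 6)
    null_counts = []
    for _ in range(20):
        shuffled = ["".join(random.sample(s, len(s))) for s in proteome]
        null_counts.append(count_homorepeats(shuffled, "Q", 6))

    print(f"observed poly-Q (>=6) runs: {observed}, "
          f"shuffled mean: {sum(null_counts)/len(null_counts):.2f}")
    ```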

  8. Non-random subcellular distribution of variant EKLF in erythroid cells

    SciTech Connect

    Quadrini, Karen J.; Gruzglin, Eugenia; Bieker, James J.

    2008-04-15

    EKLF protein plays a prominent role during erythroid development as a nuclear transcription factor. Not surprisingly, exogenous EKLF quickly localizes to the nucleus. However, using two different assays we have unexpectedly found that a substantial proportion of endogenous EKLF resides in the cytoplasm at steady state in all erythroid cells examined. While EKLF localization does not appear to change during either erythroid development or terminal differentiation, we find that the protein displays subtle yet distinct biochemical and functional differences depending on which subcellular compartment it is isolated from, with PEST sequences possibly playing a role in these differences. Localization is unaffected by inhibition of CRM1 activity and the two populations are not differentiated by stability. Heterokaryon assays demonstrate that EKLF is able to shuttle out of the nucleus although its nuclear re-entry is rapid. These studies suggest there is an unexplored role for EKLF in the cytoplasm that is separate from its well-characterized nuclear function.

  9. Non-random subcellular distribution of variant EKLF in erythroid cells.

    PubMed

    Quadrini, Karen J; Gruzglin, Eugenia; Bieker, James J

    2008-04-15

    EKLF protein plays a prominent role during erythroid development as a nuclear transcription factor. Not surprisingly, exogenous EKLF quickly localizes to the nucleus. However, using two different assays we have unexpectedly found that a substantial proportion of endogenous EKLF resides in the cytoplasm at steady state in all erythroid cells examined. While EKLF localization does not appear to change during either erythroid development or terminal differentiation, we find that the protein displays subtle yet distinct biochemical and functional differences depending on which subcellular compartment it is isolated from, with PEST sequences possibly playing a role in these differences. Localization is unaffected by inhibition of CRM1 activity and the two populations are not differentiated by stability. Heterokaryon assays demonstrate that EKLF is able to shuttle out of the nucleus although its nuclear re-entry is rapid. These studies suggest there is an unexplored role for EKLF in the cytoplasm that is separate from its well-characterized nuclear function. PMID:18329016

  10. Non-random distribution of homo-repeats: links with biological functions and human diseases

    PubMed Central

    Lobanov, Michail Yu.; Klus, Petr; Sokolovsky, Igor V.; Tartaglia, Gian Gaetano; Galzitskaya, Oxana V.

    2016-01-01

    The biological function of multiple repetitions of single amino acids, or homo-repeats, is largely unknown, but their occurrence in proteins has been associated with more than 20 hereditary diseases. Analysing 122 bacterial and eukaryotic genomes, we observed that the number of proteins containing homo-repeats is significantly larger than expected from theoretical estimates. Analysis of statistical significance indicates that the minimal size of homo-repeats varies with amino acid type and proteome. In an attempt to characterize proteins harbouring long homo-repeats, we found that those containing polar or small amino acids S, P, H, E, D, K, Q and N are enriched in structural disorder as well as protein- and RNA-interactions. We observed that E, S, Q, G, L, P, D, A and H homo-repeats are strongly linked with occurrence in human diseases. Moreover, S, E, P, A, Q, D and T homo-repeats are significantly enriched in neuronal proteins associated with autism and other disorders. We release a webserver for further exploration of homo-repeats occurrence in human pathology at http://bioinfo.protres.ru/hradis/. PMID:27256590

  11. Non-Random Mating, Parent-of-Origin, and Maternal-Fetal Incompatibility Effects in Schizophrenia

    PubMed Central

    Kim, Yunjung; Ripke, Stephan; Kirov, George; Sklar, Pamela; Purcell, Shaun; Owen, Michael; O’Donovan, Michael C.; Sullivan, Patrick F.

    2014-01-01

    Although the association of common genetic variation in the extended MHC region with schizophrenia is the most significant yet discovered, the MHC region is one of the more complex regions of the human genome, with unusually high gene density and long-range linkage disequilibrium. The statistical test on which the MHC association is based is a relatively simple, additive model which uses logistic regression of SNP genotypes to predict case-control status. However, it is plausible that more complex models underlie this association. Using a well-characterized sample of trios, we evaluated more complex models by looking for evidence for: (a) non-random mating for HLA alleles, schizophrenia risk profiles, and ancestry; (b) parent-of-origin effects for HLA alleles; and (c) maternal-fetal genotype incompatibility in the HLA. We found no evidence for non-random mating in the parents of individuals with schizophrenia in terms of MHC genotypes or schizophrenia risk profile scores. However, there was evidence of non-random mating that appeared mostly to be driven by ancestry. We did not detect over-transmission of HLA alleles to affected offspring via the general TDT test (without regard to parent of origin) or preferential transmission via paternal or maternal inheritance. We evaluated the hypothesis that maternal-fetal HLA incompatibility may increase risk for schizophrenia using eight classical HLA loci. The most significant alleles were in HLA-B, HLA-C, HLA-DQB1, and HLA-DRB1 but none was significant after accounting for multiple comparisons. We did not find evidence to support more complex models of gene action, but statistical power may have been limiting. PMID:23177929
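
    For readers unfamiliar with the transmission disequilibrium test (TDT) mentioned here, its core statistic is a McNemar-type chi-square on the number of times heterozygous parents transmit versus do not transmit an allele to affected offspring. The sketch below computes it for invented counts; it is not the study's data and ignores the parent-of-origin refinements the authors also examined.

    ```python
    from scipy.stats import chi2

    def tdt(b, c):
        """TDT statistic for an allele transmitted b times and untransmitted c times
        by heterozygous parents; ~chi-square with 1 dof under no association."""
        stat = (b - c) ** 2 / (b + c)
        return stat, chi2.sf(stat, df=1)

    # Hypothetical transmission counts for one HLA allele (illustration only).
    stat, p = tdt(b=260, c=230)
    print(f"TDT chi-square = {stat:.2f}, p = {p:.3f}")
    ```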

  12. Non-random biodiversity loss underlies predictable increases in viral disease prevalence

    PubMed Central

    Lacroix, Christelle; Jolles, Anna; Seabloom, Eric W.; Power, Alison G.; Mitchell, Charles E.; Borer, Elizabeth T.

    2014-01-01

    Disease dilution (reduced disease prevalence with increasing biodiversity) has been described for many different pathogens. Although the mechanisms causing this phenomenon remain unclear, the disassembly of communities to predictable subsets of species, which can be caused by changing climate, land use or invasive species, underlies one important hypothesis. In this case, infection prevalence could reflect the competence of the remaining hosts. To test this hypothesis, we measured local host species abundance and prevalence of four generalist aphid-vectored pathogens (barley and cereal yellow dwarf viruses) in a ubiquitous annual grass host at 10 sites spanning 2000 km along the North American West Coast. In laboratory and field trials, we measured viral infection as well as aphid fecundity and feeding preference on several host species. Virus prevalence increased as local host richness declined. Community disassembly was non-random: ubiquitous hosts dominating species-poor assemblages were among the most competent for vector production and virus transmission. This suggests that non-random biodiversity loss led to increased virus prevalence. Because diversity loss is occurring globally in response to anthropogenic changes, such work can inform medical, agricultural and veterinary disease research by providing insights into the dynamics of pathogens nested within a complex web of environmental forces. PMID:24352672

  13. How does non-random spontaneous activity contribute to brain development?

    PubMed

    Thivierge, Jean-Philippe

    2009-09-01

    Highly non-random forms of spontaneous activity are proposed to play an instrumental role in the early development of the visual system. However, both the fundamental properties of spontaneous activity required to drive map formation and the exact role of this information remain largely unknown. Here, a realistic computational model of spontaneous retinal waves is employed to demonstrate that both the amplitude and frequency of waves may play determining roles in retinocollicular map formation. Furthermore, results obtained with different learning rules show that spike precision on the order of milliseconds may be instrumental to neural development: a rule based on precise spike interactions (spike-timing-dependent plasticity) reduced the density of aberrant projections to the SC to a markedly greater extent than a rule based on interactions at a much broader time-scale (correlation-based plasticity). Taken together, these results argue for an important role of spontaneous yet highly non-random activity, along with temporally precise learning rules, in the formation of neural circuits. PMID:19196491
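
    The contrast drawn here is between a learning rule sensitive to the sign and millisecond timing of spike pairs (STDP) and one that only counts coincidences over a broad window. A minimal sketch of the two weight-update windows is given below, using generic textbook forms with invented parameters; these are not the exact rules of the model in the paper.

    ```python
    import numpy as np

    def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
        """Weight change for a spike-time difference dt = t_post - t_pre (ms).
        Potentiation when the presynaptic spike precedes the postsynaptic one."""
        return np.where(dt > 0, a_plus * np.exp(-dt / tau),
                                -a_minus * np.exp(dt / tau))

    def correlation_dw(dt, a=0.01, window=100.0):
        """Timing-insensitive rule: any coincidence within a broad window potentiates."""
        return np.where(np.abs(dt) < window, a, 0.0)

    dts = np.array([-40.0, -5.0, 5.0, 40.0])   # ms
    print("dt (ms)        :", dts)
    print("STDP dw        :", np.round(stdp_dw(dts), 4))
    print("correlation dw :", np.round(correlation_dw(dts), 4))
    # Only the STDP rule distinguishes pre-before-post from post-before-pre pairs,
    # which is the millisecond-scale sensitivity the abstract argues matters.
    ```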

  14. Non-random mating in classical lekking grouse species: seasonal and diurnal trends

    NASA Astrophysics Data System (ADS)

    Tsuji, L. J. S.; DeIuliis, G.; Hansell, R. I. C.; Kozlovic, D. R.; Sokolowski, M. B.

    This paper is the first to integrate both field and theoretical approaches to demonstrate that fertility benefits can be a direct benefit to females mating on the classical lek. Field data collected for male sharp-tailed grouse (Tympanuchus phasianellus), a classical lekking species, revealed potential fertility benefits for selective females. Adult males and individuals occupying centrally located territories on the lek were found to have significantly larger testes than juveniles and peripheral individuals. Further, using empirical data from previously published studies of classical lekking grouse species, time-series analysis was employed to illustrate that female mating patterns, seasonal and daily, were non-random. We are the first to show that these patterns coincide with times when male fertility is at its peak.

  15. Non-random aneuploidy specifies subgroups of pilocytic astrocytoma and correlates with older age

    PubMed Central

    Khuong-Quang, Dong-Anh; Bechet, Denise; Gayden, Tenzin; Kool, Marcel; De Jay, Nicolas; Jacob, Karine; Gerges, Noha; Hutter, Barbara; Şeker-Cin, Huriye; Witt, Hendrik; Montpetit, Alexandre; Brunet, Sébastien; Lepage, Pierre; Bourret, Geneviève; Klekner, Almos; Bognár, László; Hauser, Peter; Garami, Miklós; Farmer, Jean-Pierre; Montes, Jose-Luis; Atkinson, Jeffrey; Lambert, Sally; Kwan, Tony; Korshunov, Andrey; Tabori, Uri; Collins, V. Peter; Albrecht, Steffen; Faury, Damien; Pfister, Stefan M.; Paulus, Werner; Hasselblatt, Martin; Jones, David T.W.; Jabado, Nada

    2015-01-01

    Pilocytic astrocytoma (PA) is the most common brain tumor in children but is rare in adults, and hence poorly studied in this age group. We investigated 222 PA and report increased aneuploidy in older patients. Aneuploid genomes were identified in 45% of adult compared with 17% of pediatric PA. Gains were non-random, favoring chromosomes 5, 7, 6 and 11 in order of frequency, and preferentially affecting non-cerebellar PA and tumors with BRAF V600E mutations and not with KIAA1549-BRAF fusions or FGFR1 mutations. Aneuploid PA differentially expressed genes involved in CNS development, the unfolded protein response, and regulators of genomic stability and the cell cycle (MDM2, PLK2), whose correlated programs were overexpressed specifically in aneuploid PA compared to other glial tumors. Thus, convergence of pathways affecting the cell cycle and genomic stability may favor aneuploidy in PA, possibly representing an additional molecular driver in older patients with this brain tumor. PMID:26378811

  16. Non-random aneuploidy specifies subgroups of pilocytic astrocytoma and correlates with older age.

    PubMed

    Fontebasso, Adam M; Shirinian, Margret; Khuong-Quang, Dong-Anh; Bechet, Denise; Gayden, Tenzin; Kool, Marcel; De Jay, Nicolas; Jacob, Karine; Gerges, Noha; Hutter, Barbara; Şeker-Cin, Huriye; Witt, Hendrik; Montpetit, Alexandre; Brunet, Sébastien; Lepage, Pierre; Bourret, Geneviève; Klekner, Almos; Bognár, László; Hauser, Peter; Garami, Miklós; Farmer, Jean-Pierre; Montes, Jose-Luis; Atkinson, Jeffrey; Lambert, Sally; Kwan, Tony; Korshunov, Andrey; Tabori, Uri; Collins, V Peter; Albrecht, Steffen; Faury, Damien; Pfister, Stefan M; Paulus, Werner; Hasselblatt, Martin; Jones, David T W; Jabado, Nada

    2015-10-13

    Pilocytic astrocytoma (PA) is the most common brain tumor in children but is rare in adults, and hence poorly studied in this age group. We investigated 222 PA and report increased aneuploidy in older patients. Aneuploid genomes were identified in 45% of adult compared with 17% of pediatric PA. Gains were non-random, favoring chromosomes 5, 7, 6 and 11 in order of frequency, and preferentially affecting non-cerebellar PA and tumors with BRAF V600E mutations and not with KIAA1549-BRAF fusions or FGFR1 mutations. Aneuploid PA differentially expressed genes involved in CNS development, the unfolded protein response, and regulators of genomic stability and the cell cycle (MDM2, PLK2), whose correlated programs were overexpressed specifically in aneuploid PA compared to other glial tumors. Thus, convergence of pathways affecting the cell cycle and genomic stability may favor aneuploidy in PA, possibly representing an additional molecular driver in older patients with this brain tumor. PMID:26378811

  17. Non-random nectar unloading interactions between foragers and their receivers in the honeybee hive

    NASA Astrophysics Data System (ADS)

    Goyret, Joaquín; Farina, Walter M.

    2005-09-01

    Nectar acquisition in the honeybee Apis mellifera is a partitioned task in which foragers gather nectar and bring it to the hive, where nest mates unload the collected food via trophallaxis (i.e., mouth-to-mouth transfer) for further storage. Because forager mates exploit different feeding places simultaneously, this study addresses the question of whether nectar unloading interactions between foragers and hive-bees are established randomly, as is commonly assumed. Two groups of foragers were each trained to exploit a differently scented food source for 5 days. We recorded their trophallaxes with hive-mates, marking the latter according to the forager group they were unloading. We found non-random probabilities for the occurrence of trophallaxes between experimental foragers and hive-bees; instead, trophallactic interactions were more likely to involve groups of individuals that had formerly interacted orally. We propose that olfactory cues present in the transferred nectar promoted the observed bias, and we discuss this bias in the context of the organization of nectar acquisition: a partitioned task carried out in a decentralized insect society.

  18. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    PubMed

    Sarkar, Bijan

    2016-01-01

    Game theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering the nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types in examining the stable nature of the polymorphic equilibrium. A bridge between population genetics and evolutionary game theory is built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some notable results: e.g., Hardy-Weinberg frequencies hold in replicator dynamics, evolution proceeds fastest at maximal variance in fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to follow not only a 1:1 sex ratio but also a 1:1 ratio of the different alleles at a particular gene locus. Through the construction of replicator dynamics in a group selection framework, our selection model introduces a redefined basis for game theory that incorporates non-random mating, where a mating parameter associated with population structure depends on the social structure. Also, the model exposes the fact that the number of polymorphic equilibria depends on the algebraic expression of the population structure. PMID:26524140
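
    The claim that "Hardy-Weinberg frequencies hold in replicator dynamics" can be made concrete with the standard one-locus, two-allele viability-selection recursion under random mating: genotype frequencies are the Hardy-Weinberg products of the current allele frequencies, and the allele frequency is updated by its marginal fitness relative to the mean. The sketch below uses invented viabilities and deliberately ignores the paper's meiotic-drive and non-random-mating extensions.

    ```python
    # One-locus, two-allele viability selection with random mating.
    w_AA, w_Aa, w_aa = 1.0, 1.05, 0.95   # invented genotype viabilities

    p = 0.2                               # initial frequency of allele A
    for gen in range(500):
        q = 1 - p
        # Hardy-Weinberg genotype frequencies before selection.
        f_AA, f_Aa, f_aa = p * p, 2 * p * q, q * q
        w_bar = f_AA * w_AA + f_Aa * w_Aa + f_aa * w_aa       # mean fitness
        # Replicator-style update: allele A's marginal fitness over mean fitness.
        p = (f_AA * w_AA + 0.5 * f_Aa * w_Aa) / w_bar

    print(f"allele frequency of A after 500 generations: {p:.3f}")
    # With heterozygote advantage (w_Aa largest) the recursion approaches the
    # polymorphic equilibrium p* = (w_Aa - w_aa) / (2*w_Aa - w_AA - w_aa) = 2/3.
    ```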

  19. Non-random fragmentation patterns in circulating cell-free DNA reflect epigenetic regulation

    PubMed Central

    2015-01-01

    Background The assessment of cell-free circulating DNA fragments, also known as a "liquid biopsy" of the patient's plasma, is an important source for the discovery and subsequent non-invasive monitoring of cancer and other pathological conditions. Although the nucleosome-guided fragmentation patterns of cell-free DNA (cfDNA) have not yet been studied in detail, non-random representation of cfDNA sequences may reflect chromatin features of the tissue of origin at the level of gene regulation. Results In this study, we investigated the association between the epigenetic landscape of human tissues and the fragmentation patterns of cfDNA in plasma by deep sequencing of human cfDNA samples. We demonstrated that baseline characteristics of the cfDNA fragmentation pattern are concordant with those derived from cell lines. To identify loci differentially represented among cfDNA fragments, we mapped transcription start sites within the sequenced cfDNA fragments and tested these genomic coordinates for association with the relative strength and pattern of gene expression. Preselected sets of housekeeping and tissue-specific genes were used as models for actively expressed and silenced genes, respectively. The resulting measure of gene regulation was able to differentiate these two sets based on sequencing coverage near the gene transcription start site. Conclusion The experimental outcomes suggest that cfDNA retains characteristics previously noted in genome-wide analyses of chromatin structure, in particular in MNase-seq assays. Analysis of the DNA fragmentation pattern may therefore aid further development of cfDNA-based biomarkers for a variety of human conditions. PMID:26693644
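
    One building block of such an analysis is the sequencing coverage of cfDNA fragments around transcription start sites (TSS). The sketch below, which is not the authors' pipeline, aggregates per-base coverage in a window around hypothetical TSS coordinates from an in-memory list of fragment intervals; a real analysis would read aligned fragments from BAM or BED files.

        import numpy as np

        def tss_coverage_profile(fragments, tss_list, window=500):
            # Mean per-base coverage in [-window, +window] around each TSS.
            profile = np.zeros(2 * window + 1)
            for tss in tss_list:
                cov = np.zeros(2 * window + 1)
                for start, end in fragments:       # half-open fragment intervals
                    lo = max(start, tss - window)
                    hi = min(end, tss + window + 1)
                    if lo < hi:                     # fragment overlaps the window
                        cov[lo - (tss - window):hi - (tss - window)] += 1
                profile += cov
            return profile / len(tss_list)

        fragments = [(990, 1160), (1050, 1220), (1400, 1570)]   # toy (start, end) pairs
        profile = tss_coverage_profile(fragments, tss_list=[1100])
        print("relative coverage at the TSS:", profile[500])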

  20. Intrauterine synechiae after myomectomy; laparotomy versus laparoscopy: Non-randomized interventional trial

    PubMed Central

    Asgari, Zahra; Hafizi, Leili; Hosseini, Rayhaneh; Javaheri, Atiyeh; Rastad, Hathis

    2015-01-01

    Background: Leiomyomata is the most frequent gynecological neoplasm. One of the major complications of myomectomy is intrauterine adhesion (synechiae). Objective: To evaluate and compare the rate and severity of synechiae formation after myomectomy by laparotomy and laparoscopy. Materials and Methods: In this non-randomized interventional trial, hysteroscopy was performed in all married fertile women who had undergone myomectomy (type 3-6 interamural and subserosal fibroids) via laparotomy and laparoscopy in Tehran’s Arash Hospital from 2010 to 2013. Three months after the operation, the occurrence rate and severity of intrauterine synechiae, and its relationship with type, number and location of myomas were investigated and compared in both groups. Results: Forty patients (19 laparoscopy and 21 laparotomy cases) were studied. Both groups were similar regarding the size, type (subserosal or intramural), number and location of myoma. The occurrence rate of synechiae in the laparoscopy and laparotomy group was 21% and 19%, respectively; showing no significant difference (p=0.99). Among all patients, no significant relationship was found between the endometrial opening (p=0.92), location (p=0.14) and type of myoma (p=0.08) with the occurrence rate of synechiae. However, a significant relationship was observed between myoma’s size (p=0.01) and the location of the largest myoma with the occurrence of synechiae (p=0.02). Conclusion: With favorable suturing methods, the outcome of intrauterine synechiae formation after myomectomy, either performed by laparotomy or laparoscopy, is similar. In all cases of myomectomy in reproductive-aged women, postoperative hysteroscopy is highly recommended to better screen intrauterine synechiae. PMID:26000007

  1. Non-random correlation structures and dimensionality reduction in multivariate climate data

    NASA Astrophysics Data System (ADS)

    Vejmelka, Martin; Pokorná, Lucie; Hlinka, Jaroslav; Hartman, David; Jajcay, Nikola; Paluš, Milan

    2015-05-01

    It is well established that the global climate is a complex phenomenon with dynamics driven by the interaction of a multitude of identifiable but intertwined subsystems. The identification, at some level, of these subsystems is an important step towards understanding climate dynamics. We present a method to determine the number of principal components representing non-random correlation structures in climate data, or components that cannot be generated by a surrogate model of independent stochastic processes replicating the auto-correlation structure of each time series. The purpose of the method is to automatically reduce the dimensionality of large climate datasets into spatially localised components suitable for further interpretation or, for example, for use as nodes in a complex network analysis of large-scale climate dynamics. We apply the method to two 2.5° resolution NCEP/NCAR reanalysis global datasets of monthly means: the sea level pressure (SLP) and the surface air temperature (SAT), and extract 60 components explaining 87% of the variance and 68 components explaining 72% of the variance, respectively. The obtained components are in agreement with previous results in that they recover many well-known climate modes previously identified using other approaches, including regionally constrained principal component analysis. Selected SLP components are discussed in more detail with respect to their correlation with important climate indices and their relationship to other SLP and SAT components. Finally, we consider a subset of the obtained components that have not yet been explicitly identified by other authors but seem plausible in the context of regional climate observations discussed in the literature.
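
    The following sketch illustrates the general idea under simplifying assumptions (AR(1) surrogates, plain PCA on toy data): components are retained only if their eigenvalues exceed the 95th percentile of eigenvalues obtained from independent surrogate series that preserve each series' lag-1 autocorrelation. It is not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(0)

        def ar1_surrogate(x, rng):
            # Independent AR(1) surrogate matching the series' lag-1
            # autocorrelation and variance (a simplified surrogate model).
            x = x - x.mean()
            r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
            noise = rng.normal(0.0, np.sqrt(1.0 - r1 ** 2) * x.std(), size=x.size)
            s = np.empty_like(x)
            s[0] = x[0]
            for t in range(1, x.size):
                s[t] = r1 * s[t - 1] + noise[t]
            return s

        def pca_eigenvalues(data):
            return np.sort(np.linalg.eigvalsh(np.cov(data, rowvar=False)))[::-1]

        data = rng.normal(size=(600, 20))       # toy: 600 months x 20 grid points
        ev_data = pca_eigenvalues(data)
        ev_surr = np.array([
            pca_eigenvalues(np.column_stack([ar1_surrogate(data[:, j], rng)
                                             for j in range(data.shape[1])]))
            for _ in range(100)])
        threshold = np.percentile(ev_surr, 95, axis=0)   # per-rank 95th percentile
        print("components kept as non-random:", int(np.sum(ev_data > threshold)))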

  2. A 4-Mb deletion in the region Xq27.3-q28 is associated with non-random inactivation of the non-mutant X chromosome

    SciTech Connect

    Clarke, J.T.R.; Han, L.P.; Michalickova, K.

    1994-09-01

    A girl with severe Hunter disease was found to have a submicroscopic deletion disrupting the IDS locus in the region Xq27.3-q28 together with non-random inactivation of the non-mutant X chromosome. Southern analysis of DNA from the parents and from hamster-patient somatic cell hybrids containing only the mutant X chromosome revealed that the deletion represented a de novo mutation involving the paternal X chromosome. Methylation-sensitive RFLP analysis of DNA from maternal fibroblasts and lymphocytes showed methylation patterns consistent with random X-inactivation, indicating that the non-random X-inactivation in the patient was not inherited and was likely a direct result of the Xq27.3-q28 deletion. A 15 kb EcoRI junction fragment, identified in patient DNA using IDS cDNA probes, was cloned from a size-selected patient DNA library. Clones containing the deletion junction were restriction mapped and fragments were subcloned and used to isolate normal sequence on either side of the deletion from normal X chromosome libraries. Comparison of the sequences from normal and mutant X chromosome clones straddling the deletion breakpoint showed that the mutation had occurred by recombination between Alu repeats. Screening of YAC contigs containing normal X chromosome sequence from the region of the mutation, using probes from either side of the deletion breakpoint, showed that the deletion was approximately 4 Mb in size. Probing of mutant DNA with 16 STSs distributed throughout the region of the deletion confirmed that the mutation is a simple deletion with no complex rearrangements or islands of retained DNA. A search for sequences at Xq27.3-q28 involved in X chromosome inactivation is in progress.

  3. Cell adhesion geometry regulates non-random DNA segregation and asymmetric cell fates in mouse skeletal muscle stem cells.

    PubMed

    Yennek, Siham; Burute, Mithila; Théry, Manuel; Tajbakhsh, Shahragim

    2014-05-22

    Cells of several metazoan species have been shown to non-randomly segregate their DNA such that older template DNA strands segregate to one daughter cell. The mechanisms that regulate this asymmetry remain undefined. Determinants of cell fate are polarized during mitosis and partitioned asymmetrically as the spindle pole orients during cell division. Chromatids align along the pole axis; therefore, it is unclear whether extrinsic cues that determine spindle pole position also promote non-random DNA segregation. To mimic the asymmetric divisions seen in the mouse skeletal muscle stem cell niche, we used micropatterns coated with extracellular matrix in asymmetric and symmetric motifs. We show that the frequency of non-random DNA segregation and transcription factor asymmetry correlates with the shape of the motif and that these events can be uncoupled. Furthermore, regulation of DNA segregation by cell adhesion occurs within a defined time interval. Thus, cell adhesion cues have a major impact on determining both DNA segregation patterns and cell fates. PMID:24836002

  4. Does homeobox-related "positional" genomic information contribute to implantation of metastatic cancer cells at non-random sites?

    PubMed

    Anderson, K M; Darweesh, M; Jajah, A; Tsui, P; Guinan, P; Rubenstein, M

    2007-01-01

    Reasons for the lodgment of metastases from several types of solid cancer at apparently non-random sites have not been established. Recently, a group of genes expressed in human fibroblasts obtained from different anatomic locations was implicated in "positional" genomic information. Essentially, a Cartesian coordinate system identifying fibroblasts originally resident at anterior or more posterior, proximal or distal and dermal or non-dermal (heart, lung, etc.) locations was proposed. The determinants used for these identifications included HOX genes, central to embryonic segmental development, some of which are expressed in differentiated, post-embryonic cells. To the extent that HOX or other homeobox genes are expressed in ectodermal, mesodermal or endodermally-derived, malignantly transformed cells, they might contribute "positional" information to nidation of specific malignant clones at non-random sites. As understood in the past, interdiction of HOX or homeobox-related gene expression might reduce the probability of cancer cell implantation or alter their destinations in complex ways. Ideally, by interfering with HOX or other homeobox gene-related expression of antigenic determinants potentially contributing to their "homing" and nidation, reduced implantation of circulating cancer cells could render them more susceptible to systemic chemotherapy or immunotherapy, as demonstrated in mice. Furthermore, HOX or other homeobox genes or their products could provide novel intra- or extracellular targets for therapy. PMID:17695497

  5. Tables of critical values for examining compositional non-randomness in proteins and nucleic acids

    NASA Technical Reports Server (NTRS)

    Laird, M.; Holmquist, R.

    1975-01-01

    A binomially distributed statistic is defined to show whether or not the proportion of a particular amino acid in a protein deviates from random expectation. An analogous statistic is derived for nucleotides in nucleic acids. These new statistics are simply related to the classical chi-squared test. They explicitly account for the compositional fluctuations imposed by the finite length of proteins, and they are more accurate than previous tables.
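
    For illustration, the exact binomial test below (SciPy is assumed available) asks the same kind of question as the tabulated statistic: whether the observed count of one amino acid in a protein of length n is compatible with the random expectation p = 1/20. The counts are hypothetical, and this is a related standard test rather than the paper's own tables.

        from scipy.stats import binomtest

        n_residues = 300        # protein length (hypothetical)
        count_observed = 45     # observed count of one amino acid (hypothetical)
        result = binomtest(count_observed, n=n_residues, p=1.0 / 20,
                           alternative="two-sided")
        print("observed proportion:", count_observed / n_residues)
        print("two-sided p-value  :", result.pvalue)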

  6. Methodological issues in observational studies and non-randomized controlled trials in oncology in the era of big data.

    PubMed

    Tanaka, Shiro; Tanaka, Sachiko; Kawakami, Koji

    2015-04-01

    Non-randomized controlled trials, cohort studies and database studies are appealing study designs when there are urgent needs for safety data, outcomes of interest are rare, generalizability is a matter of concern, or randomization is not feasible. This paper reviews four typical case studies from a methodological viewpoint and clarifies how to minimize bias in observational studies in oncology. In summary, researchers planning observational studies should be careful about the selection of appropriate databases, the validity of algorithms for identifying outcomes, comparison with incident users or self-controls, rigorous collection of information on potential confounders, and reporting of the details of subject selection. Further, a careful study protocol and statistical analysis plan are also necessary. PMID:25589456

  7. The Origin of Aging: Imperfectness-Driven Non-Random Damage Defines the Aging Process and Control of Lifespan

    PubMed Central

    Gladyshev, Vadim N.

    2013-01-01

    Physico-chemical properties preclude ideal biomolecules and perfect biological functions. This inherent imperfectness leads to the generation of damage by every biological process, at all levels, from small molecules to cells. The damage is too numerous to be repaired, is partially invisible to natural selection and manifests as aging. I propose that it is the inherent imperfectness of biological systems that is the true root of the aging process. As each biomolecule generates specific forms of damage, the cumulative damage is largely non-random and is indirectly encoded in the genome. I consider this concept in light of other proposed theories of aging and integrate these disparate ideas into a single model. I also discuss the evolutionary significance of damage accumulation and strategies for reducing damage. Finally, I suggest ways to test this integrated model of aging. PMID:23769208

  8. Spirals in space - non-random orientation of moss protonemata in microgravity (STS-87)

    NASA Astrophysics Data System (ADS)

    Kern, V.; Sack, F.

    Protonemata of the moss Ceratodon purpureus are an excellent system for studying gravitropism and phototropism in a tip-growing cell. In darkness protonemata express negative gravitropism (they grow up) with high fidelity. When irradiated they accurately align in the light path. When grown in darkness under microgravity conditions (STS-87, Nov./Dec. 1997), 7-day-old cultures displayed a predominantly radial orientation. However, in older (14 d) cultures the protonemata grew in arcs and overall formed clockwise spirals. Cultures grown on a slow-rotating clinostat for 14 days also expressed spirals. Spirals were mostly clockwise and formed regardless of the orientation with respect to the acceleration force (speed of clinostat rotation) or to the direction of rotation. The presence of spirals in 14 d but not 7 d cultures could be due to culture age, stage, or size and/or to the duration of exposure to microgravity or clino-rotation. The phenomenon of protonemal phototropism allowed us to investigate this further. When unilaterally irradiated for 7 days, cultures displayed negative and positive phototropism while gravitropism was suppressed; in these cultures almost all cells were aligned in a straight line along the light path. When such cultures were transferred to darkness for an additional 7 d, clockwise arcs and spirals formed. Thus, spiral formation requires only a 7-day dose of microgravity or clino-rotation, as long as the cultures are of a sufficient age or stage (7 days or less). The presence of coordinated clockwise spiral growth in μg suggests that there is an endogenous growth polarity in Ceratodon that normally is suppressed by gravitropism. A working hypothesis is that the spirals represent a residual spacing mechanism for controlling colony growth and the distribution of side branches. (Supported by NASA: NAG10-017).

  9. Co-occurrence analyses show that non-random community structure is disrupted by fire in two groups of soil arthropods (Isopoda Oniscidea and Collembola)

    NASA Astrophysics Data System (ADS)

    Pitzalis, Monica; Luiselli, Luca; Bologna, Marco A.

    2010-01-01

    In this paper, we tested the hypothesis that natural catastrophes may destroy non-random community structure in natural assemblages of organisms. As a study system, we selected fire as the catastrophic event, and two groups of soil arthropods (Collembola and Isopoda Oniscidea) as target organisms. By co-occurrence analyses and Monte Carlo simulations of niche overlap analysis (C-score, with fixed-equiprobable model; RA2 and RA3 algorithms) we evaluated whether the community structure of each of these two groups was random or non-random at three unburnt sites and at three neighbouring burnt sites that were devastated by a large-scale fire in summer 2000. Both taxa experienced a remarkable reduction in the number of species sampled in burnt versus unburnt sites, but the difference among sites was not statistically significant for Oniscidea. We determined that community structure was clearly non-random at the unburnt sites for both Collembola (according to the RA3 algorithm) and Isopoda Oniscidea (according to co-occurrence analysis) and that, as predicted by theory, the catastrophic event deeply altered the community structure by removing the non-random organization of the species interactions. We also observed a shift from segregation to aggregation/randomness in soil arthropod communities affected by fire, a pattern similar to that observed in natural communities of organisms perturbed by the introduction of alien species, thus indicating that this pattern may be generalizable when alteration of communities occurs.
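
    A minimal sketch of a C-score test under a fixed-equiprobable null model (row totals preserved, sites equiprobable) is given below; it illustrates the general approach rather than the EcoSim implementation used in the study, and the presence-absence matrix is a toy example.

        import itertools
        import numpy as np

        rng = np.random.default_rng(1)

        def c_score(m):
            # Mean number of checkerboard units over all species pairs.
            units = []
            for i, j in itertools.combinations(range(m.shape[0]), 2):
                shared = int(np.sum(m[i] & m[j]))
                units.append((m[i].sum() - shared) * (m[j].sum() - shared))
            return float(np.mean(units))

        def fixed_equiprobable(m, rng):
            # Reassign each species' occurrences to equiprobable sites,
            # keeping the row totals fixed.
            out = np.zeros_like(m)
            for i, row in enumerate(m):
                out[i, rng.choice(m.shape[1], size=int(row.sum()), replace=False)] = 1
            return out

        obs = np.array([[1, 0, 1, 0, 1, 0],     # toy presence-absence matrix:
                        [0, 1, 0, 1, 0, 1],     # rows = species, columns = sites
                        [1, 1, 0, 0, 1, 0]])
        observed = c_score(obs)
        null = np.array([c_score(fixed_equiprobable(obs, rng)) for _ in range(2000)])
        print("observed C-score:", observed,
              " P(null >= observed):", float(np.mean(null >= observed)))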

  10. A school intervention for mental health literacy in adolescents: effects of a non-randomized cluster controlled trial

    PubMed Central

    2013-01-01

    Background “Mental health for everyone” is a school program for mental health literacy and prevention aimed at secondary schools (13–15 yrs). The main aim was to investigate whether mental health literacy could be improved by a 3-day universal education programme by: a) improving naming of symptom profiles of mental disorder, b) reducing prejudiced beliefs, and c) improving knowledge about where to seek help for mental health problems. A secondary aim was to investigate whether adolescent sex and age influenced the above-mentioned variables. A third aim was to investigate whether prejudiced beliefs influenced knowledge about available help. Method This non-randomized cluster controlled trial included 1070 adolescents (53.9% boys, mean age 14 yrs) from three schools in a Norwegian town. One school (n = 520) received the intervention, and two schools (n = 550) formed the control group. Pre-test and follow-up were three months apart. Linear mixed models and generalized estimating equations models were employed for analysis. Results Mental health literacy improved contingent on the intervention, and there was a shift towards suggesting primary health care as a place to seek help. Those with more prejudiced beliefs did not suggest places to seek help for mental health problems. Generally, girls and older adolescents recognized symptom profiles better and had lower levels of prejudiced beliefs. Conclusions A low-cost general school program may improve mental health literacy in adolescents. Gender-specific programs and attention to the age and maturity of the students should be considered when mental health literacy programmes are designed and tried out. Prejudice should be addressed before imparting information about mental health issues. PMID:24053381

  11. Telomere Disruption Results in Non-Random Formation of De Novo Dicentric Chromosomes Involving Acrocentric Human Chromosomes

    PubMed Central

    Stimpson, Kaitlin M.; Song, Ihn Young; Jauch, Anna; Holtgreve-Grez, Heidi; Hayden, Karen E.; Bridger, Joanna M.; Sullivan, Beth A.

    2010-01-01

    Genome rearrangement often produces chromosomes with two centromeres (dicentrics) that are inherently unstable because of bridge formation and breakage during cell division. However, mammalian dicentrics, and particularly those in humans, can be quite stable, usually because one centromere is functionally silenced. Molecular mechanisms of centromere inactivation are poorly understood since there are few systems to experimentally create dicentric human chromosomes. Here, we describe a human cell culture model that enriches for de novo dicentrics. We demonstrate that transient disruption of human telomere structure non-randomly produces dicentric fusions involving acrocentric chromosomes. The induced dicentrics vary in structure near fusion breakpoints and like naturally-occurring dicentrics, exhibit various inter-centromeric distances. Many functional dicentrics persist for months after formation. Even those with distantly spaced centromeres remain functionally dicentric for 20 cell generations. Other dicentrics within the population reflect centromere inactivation. In some cases, centromere inactivation occurs by an apparently epigenetic mechanism. In other dicentrics, the size of the α-satellite DNA array associated with CENP-A is reduced compared to the same array before dicentric formation. Extra-chromosomal fragments that contained CENP-A often appear in the same cells as dicentrics. Some of these fragments are derived from the same α-satellite DNA array as inactivated centromeres. Our results indicate that dicentric human chromosomes undergo alternative fates after formation. Many retain two active centromeres and are stable through multiple cell divisions. Others undergo centromere inactivation. This event occurs within a broad temporal window and can involve deletion of chromatin that marks the locus as a site for CENP-A maintenance/replenishment. PMID:20711355

  12. Effectiveness of a 'Global Postural Reeducation' program for persistent Low Back Pain: a non-randomized controlled trial

    PubMed Central

    2010-01-01

    Background The aim of this non-randomized controlled trial was to evaluate the effectiveness of a Global Postural Reeducation (GPR) program as compared to a Stabilization Exercise (SE) program in subjects with persistent low back pain (LBP) at short- and mid-term follow-up (i.e., 3 and 6 months). Methods According to inclusion and exclusion criteria, 100 patients with a primary complaint of persistent LBP were enrolled in the study: 50 were allocated to the GPR group and 50 to the SE group. Primary outcome measures were the Roland and Morris Disability Questionnaire (RMDQ) and Oswestry Disability Index (ODI). Secondary outcome measures were the lumbar Visual Analogue Scale (VAS) and Fingertip-to-floor test (FFT). Data were collected at baseline and at 3/6 months by health care professionals unaware of the study. An intention-to-treat approach was used to analyze participants according to the group to which they were originally assigned. Results Of the 100 patients initially included in the study, 78 patients completed the study: 42 in the GPR group and 36 in the SE group. At baseline, the two groups did not differ significantly with respect to gender, age, BMI and outcome measures. Comparing the differences between groups at short- and mid-term follow-up, the GPR group revealed a significant reduction (from baseline) in all outcome measures with respect to the SE group. The ordered logistic regression model showed an increased likelihood of definitive improvement (reduction from baseline of at least 30% in RMDQ and VAS scores) for the GPR group compared to the SE group (OR 3.9, 95% CI 2.7 to 5.7). Conclusions Our findings suggest that a GPR intervention in subjects with persistent LBP induces a greater improvement in pain and disability as compared to a SE program. These results must be confirmed by further studies with higher methodological standards, including randomization, larger sample size, longer follow-up and subgrouping of the LBP subjects. Trial registration NCT

  13. Non-random pre-transcriptional evolution in HIV-1. A refutation of the foundational conditions for neutral evolution

    PubMed Central

    2009-01-01

    The complete base sequence of the HIV-1 virus and the GP120 ENV gene were analyzed to establish their distance from the expected neutral random sequence. A special methodology was devised to achieve this aim. Analyses included: a) proportions of dinucleotides (signatures); b) homogeneity in the distribution of dinucleotides and bases (isochores), assessed by dividing the two segments into ten and three sub-segments, respectively; c) probability of runs of bases and No-bases according to the Bose-Einstein distribution. The analyses showed a huge deviation from the random distribution expected under neutral evolution and neutral-neighbor influence of nucleotide sites. The most significant result is the tremendous lack of CG dinucleotides (p < 10^-50), a selective trait of eukaryotic genomes and not of single-stranded RNA virus genomes. The results not only refute neutral evolution and neutral-neighbor influence, but also strongly indicate that any base at any nucleotide site correlates with the whole viral genome or its sub-segments. These results suggest that evolution of HIV-1 is pan-selective rather than neutral or nearly neutral. PMID:21637663
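
    The dinucleotide "signature" analysis mentioned above is commonly expressed as the odds ratio rho_XY = f(XY) / (f(X) f(Y)); values well below 1 indicate depletion, as reported here for CG. The sketch below computes this ratio for a short hypothetical sequence, not the HIV-1 genome.

        from collections import Counter

        seq = "ATGCGATACGTTAGCATTAACGGATCCGATATAGCGC"   # hypothetical toy sequence
        mono = Counter(seq)
        di = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
        n_mono, n_di = len(seq), len(seq) - 1

        for pair in ("CG", "GC", "TA", "AT"):
            f_xy = di[pair] / n_di
            f_x, f_y = mono[pair[0]] / n_mono, mono[pair[1]] / n_mono
            rho = f_xy / (f_x * f_y)                 # rho << 1 indicates depletion
            print(pair, "odds ratio rho =", round(rho, 2))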

  14. Dynamics and Plastic Flow of Vortices Interacting with Strong Columnar Defects and the Enhancement of Jc by a Non-Random Spatial Distribution of Pins

    NASA Astrophysics Data System (ADS)

    Reichhardt, C.; Olson, C. J.; Groth, J.; Field, S.; Nori, F.

    1996-03-01

    We present MD simulations of flux-gradient-driven vortices (i.e., the force on the vortices is not uniform) in the critical state, interacting with randomly-placed strong columnar pins (i.e., the Bose glass regime) as an external field H(t) is quasi-statically swept from zero through the matching field B_φ. We analyze several measurable quantities, both global (e.g., M(H), J_c(H)) and local (e.g., B(x,y,H(t)), M(x,y,H(t)), J_c(x,y,B)). We find a sharp change in the behaviour of these quantities as the local flux density crosses B_φ and quantify it for many microscopic pinning parameters. Further, we find that for a given pin density the critical current can be substantially enhanced by maximizing the distance between the pins for B < B_φ. We also monitor the dynamics of individual vortex flow paths ("vortex streets" surrounded by regions of pinned flux) in samples subject to a controlled change in the pinning parameters (e.g., increasing the pinning strength).

  15. Analysis of emplaced waste data and implications of non-random emplacement for performance assessment for the WIPP

    SciTech Connect

    Allen, Lawrence E.; Channell, James K.

    2003-05-31

    The WIPP Land Withdrawal Act recognized that after the initial certification of the WIPP and start of disposal operations, operating experience and ongoing research would result in new technical and scientific information. The Environmental Evaluation Group (EEG) has previously reported on issues that it considers important as the Department of Energy (DOE) works towards the first recertification. One of these issues involves the assumption of random emplacement of waste used in the performance assessment calculations in support of the initial certification application. As actual waste emplacement data are now available from four years of disposal, the EEG performed an analysis to evaluate the validity of that initial assumption and determine implications for performance assessment. Panel 1 was closed in March 2003. The degree of deviation between actual emplaced waste in Panel 1 and an assumption of random emplacement is apparent, with concentrations of 239Pu being 3.20 times, 240Pu being 2.67 times, and 241Am being 4.13 times the projected repository average for the space occupied by the waste. A spatial statistical analysis was performed using available Panel 1 data retrieved from the WWIS and assigned room coordinates by Sandia National Laboratories. A comparison was made between the waste as emplaced and a randomization of the same waste. The distribution of waste as emplaced is similar to the distribution of waste in the individual containers and can be characterized as bi-modal and skewed with a long high-concentration tail. The distribution of randomized waste is fairly symmetrical, as would be expected from classical statistical theory. In the event of a future drilling intrusion, comparison of these two distributions shows a higher probability of intersecting a high-concentration stack of the actual emplaced waste, over that of the same waste emplaced in a randomized manner as was assumed in the certified
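
    The comparison described above can be mimicked with a simple randomization exercise: compare the per-stack activity distribution under an ordered (clustered) emplacement with that obtained by shuffling the same containers. The sketch below uses synthetic, skewed activities and an assumed stack size of three containers; it is not based on WWIS data.

        import numpy as np

        rng = np.random.default_rng(7)
        STACK = 3                                   # containers per stack (assumed)

        # Synthetic, skewed container activities; sorting before stacking mimics
        # a clustered (non-random) emplacement of the high-activity containers.
        activity = np.sort(rng.lognormal(mean=0.0, sigma=1.5, size=300))

        def stack_totals(containers):
            return containers.reshape(-1, STACK).sum(axis=1)

        emplaced = stack_totals(activity)                       # clustered ordering
        randomized = stack_totals(rng.permutation(activity))    # random ordering
        threshold = np.percentile(randomized, 95)
        print("P(stack activity above the randomized 95th percentile):",
              "emplaced =", round(float(np.mean(emplaced > threshold)), 3),
              " randomized =", round(float(np.mean(randomized > threshold)), 3))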

  16. Identifying Baseline Covariates for Use in Propensity Scores: A Novel Approach Illustrated for a Non-randomized Study of Recovery High Schools

    PubMed Central

    Tanner-Smith, Emily E.; Lipsey, Mark W.

    2014-01-01

    There are many situations where random assignment of participants to treatment and comparison conditions may be unethical or impractical. This article provides an overview of propensity score techniques that can be used for estimating treatment effects in non-randomized quasi-experimental studies. After reviewing the logic of propensity score methods, we call attention to the importance of the strong ignorability assumption and its implications. We then discuss the importance of identifying and measuring a sufficient set of baseline covariates upon which to base the propensity scores and illustrate approaches to that task in the design of a study of recovery high schools for adolescents treated for substance abuse. One novel approach for identifying important covariates that we suggest and demonstrate is to draw on the predictor-outcome correlations compiled in meta-analyses of prospective longitudinal correlations. PMID:25071297
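
    As a generic illustration of the propensity-score machinery discussed above (not the recovery-high-school analysis), the sketch below estimates propensity scores with a logistic model and forms inverse-probability-of-treatment weights on synthetic covariates; scikit-learn is assumed available, and all data are simulated.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        n = 500
        X = rng.normal(size=(n, 4))                   # baseline covariates (toy)
        treated = rng.binomial(1, 1.0 / (1.0 + np.exp(-X[:, 0] + 0.5 * X[:, 1])))
        outcome = (2.0 * treated + X @ np.array([1.0, 0.5, 0.0, 0.0])
                   + rng.normal(size=n))

        # Propensity score = modelled probability of treatment given covariates.
        ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
        weights = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))   # IPTW

        ate = (np.average(outcome[treated == 1], weights=weights[treated == 1])
               - np.average(outcome[treated == 0], weights=weights[treated == 0]))
        print("weighted estimate of the treatment effect (true value 2.0):",
              round(ate, 2))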

  17. Inbreeding and purging at the genomic Level: the Chillingham cattle reveal extensive, non-random SNP heterozygosity.

    PubMed

    Williams, J L; Hall, S J G; Del Corvo, M; Ballingall, K T; Colli, L; Ajmone Marsan, P; Biscarini, F

    2016-02-01

    Local breeds of livestock are of conservation significance as components of global biodiversity and as reservoirs of genetic variation relevant to the future sustainability of agriculture. One such rare historic breed, the Chillingham cattle of northern England, has a 350-year history of isolation and inbreeding yet shows no diminution of viability or fertility. The Chillingham cattle have not been subjected to selective breeding. It has been suggested previously that the herd has minimal genetic variation. In this study, high-density SNP genotyping with the 777K SNP chip showed that 9.1% of loci on the chip are polymorphic in the herd, compared with 62-90% seen in commercial cattle breeds. Instead of being homogeneously distributed along the genome, these loci are clustered at specific chromosomal locations. A high proportion of the Chillingham individuals examined were heterozygous at many of these polymorphic loci, suggesting that some loci are under balancing selection. Some of these frequently heterozygous loci have been implicated as sites of recessive lethal mutations in cattle. Linkage disequilibrium (LD) equal or close to 100% was found to span up to 1350 kb, and LD was above r^2 = 0.25 at distances of more than 5000 kb. This strong LD is consistent with the scarcity of polymorphic loci in the herd. The heterozygous regions in the Chillingham cattle may be the locations of genes relevant to fitness or survival, which may help elucidate the biology of local adaptation in traditional breeds and facilitate selection for such traits in commercial cattle. PMID:26559490
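
    Two of the quantities reported above, per-locus observed heterozygosity and pairwise LD, can be computed from a genotype matrix as in the sketch below, which uses a toy 0/1/2-coded matrix rather than the 777K chip data and approximates r^2 by the squared genotype correlation (phase unknown).

        import numpy as np

        rng = np.random.default_rng(5)
        genotypes = rng.integers(0, 3, size=(20, 6))   # 20 animals x 6 SNPs, coded 0/1/2

        # Observed heterozygosity: fraction of heterozygous (coded 1) animals per SNP.
        heterozygosity = np.mean(genotypes == 1, axis=0)
        print("observed heterozygosity per SNP:", np.round(heterozygosity, 2))

        def r_squared(g1, g2):
            # Squared genotype correlation, a common r^2 approximation without phase.
            r = np.corrcoef(g1, g2)[0, 1]
            return r * r

        print("r^2 between SNP 0 and SNP 1:",
              round(r_squared(genotypes[:, 0], genotypes[:, 1]), 3))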

  18. Estimating fitness consequences of dispersal: a road to 'know-where'? Non-random dispersal and the underestimation of dispersers' fitness.

    PubMed

    Doligez, Blandine; Pärt, Tomas

    2008-11-01

    philopatric and dispersing individuals, others, based on adult and juvenile survival, are open to the alternative explanation of biased fitness estimates. 6. We list three potential ways of reducing the risk of making wrong inferences on biased fitness estimates due to such non-random dispersal behaviour between dispersing and philopatric individuals: (a) diagnosing effects of non-random dispersal, (b) reducing the effects of spatially limited study area and (c) performing controlled experiments. PMID:18808435

  19. Novel gene arrangement in the mitochondrial genome of Bothus myriaster (Pleuronectiformes: Bothidae): evidence for the Dimer-Mitogenome and Non-random Loss model.

    PubMed

    Gong, Li; Shi, Wei; Yang, Min; Li, Donghe; Kong, Xiaoyu

    2016-09-01

    In the present study, the complete mitochondrial genome of the oval flounder Bothus myriaster was determined and a novel gene rearrangement was discovered. A striking finding was that eight genes encoded by the L-strand (ND6 and the tRNA-Q, A, C, Y, S1, E, and P genes) were translocated to a position between tRNA-T and tRNA-F. At the same time, the original order of the rearranged genes, Q-A-C-Y-S1-ND6-E-P, was maintained. Thus, genes with identical transcriptional polarities were clustered in the genome, with the single exception of the tRNA-N gene, which remains in its original position. The other rearrangement involved the H-strand-encoded tRNA-D gene, which translocated from its typical location between COI and COII to a position between S1 and ND6. The Dimer-Mitogenome and Non-random Loss (DMNL) model was adopted to account for the novel rearrangement in the B. myriaster mitogenome, which thus provides evidence supporting the DMNL hypothesis. PMID:25629499

  20. Design and Baseline Findings of a Multi-site Non-randomized Evaluation of the Effect of a Health Programme on Microfinance Clients in India

    PubMed Central

    Saha, Somen

    2014-01-01

    Microfinance is the provision of financial services for the poor. Health programs delivered through microfinance have the potential to address several access barriers to health. We report the design and baseline findings of a multi-site non-randomized evaluation of the effect of a health program on the members of two microfinance organizations in the Karnataka and Gujarat states of India. Villages identified for roll-out of health services with microfinance were pair-matched with microfinance-only villages. A quantitative survey at inception and twelve months after the health intervention compares the primary outcome (incidence of childhood diarrhea) and secondary outcomes (place of last delivery, toilet at home, and out-of-pocket expenditure on treatment). At baseline, the intervention and comparison communities were similar except for out-of-pocket expenditure on health. Low reported use of toilets at home indicates that the areas are heading towards a sanitation crisis; this should be an area of program priority for the microfinance organizations. While respondents rely primarily on their savings to meet treatment expenditure, borrowing from friends, relatives, and money-lenders remains another important source of meeting treatment expenditure in the community. Programs need to prioritize steps to ensure awareness of national health insurance schemes and entitlements to increase service utilization, and to develop additional health financing safety nets for financing outpatient care, which is responsible for the majority of health-related debt. Finally, we discuss the implications of such programs for national policy makers. PMID:24373263

  1. Feasibility and effectiveness of a combined individual and psychoeducational group intervention in psychiatric residential facilities: A controlled, non-randomized study.

    PubMed

    Magliano, Lorenza; Puviani, Marta; Rega, Sonia; Marchesini, Nadia; Rossetti, Marisa; Starace, Fabrizio

    2016-01-30

    This controlled, non-randomized study explored the feasibility of introducing a Combined Individual and Group Intervention (CIGI) for users with mental disorders in residential facilities, and tested whether users who received the CIGI had better functioning than users who received Treatment-As-Usual (TAU) at two-year follow-up. In the CIGI, a structured cognitive-behavioral approach called VADO (in English, Skills Assessment and Definition of Goals) was used to set specific goals with each user, while Falloon's psychoeducational treatment was applied with the users as a group. Thirty-one professionals attended a training course in CIGI, open to users' voluntary participation, and applied it for two years with all users living in 8 residential facilities of the Mental Health Department of Modena, Italy. In the same department, 5 other residential facilities providing TAU were used as controls. Repeated-measures ANOVA showed a significant interaction between assessment time (baseline vs. follow-up) and intervention on users' functioning. In particular, the change in global functioning was greater in the 55 CIGI users than in the 44 TAU users. These results suggest that CIGI can be successfully introduced in residential facilities and may be useful to improve functioning in users with severe mental disorders. PMID:26723137

  2. A non-randomized confirmatory study regarding selection of fertility-sparing surgery for patients with epithelial ovarian cancer: Japan Clinical Oncology Group Study (JCOG1203).

    PubMed

    Satoh, Toyomi; Tsuda, Hitoshi; Kanato, Keisuke; Nakamura, Kenichi; Shibata, Taro; Takano, Masashi; Baba, Tsukasa; Ishikawa, Mitsuya; Ushijima, Kimio; Yaegashi, Nobuo; Yoshikawa, Hiroyuki

    2015-06-01

    Fertility-sparing treatment has been accepted as a standard treatment for epithelial ovarian cancer in stage IA non-clear cell histology grade 1/grade 2. In order to expand the indications for fertility-sparing treatment, we have started a non-randomized confirmatory trial for stage IA clear cell histology and stage IC unilateral non-clear cell histology grade 1/grade 2. The protocol-defined fertility-sparing surgery is optimal staging laparotomy including unilateral salpingo-oophorectomy, omentectomy, peritoneal cytology and pelvic and para-aortic lymph node dissection or biopsy. After fertility-sparing surgery, four to six cycles of adjuvant chemotherapy with paclitaxel and carboplatin are administered. We plan to enroll 250 patients with an indication for fertility-sparing surgery, and the primary analysis is then to be conducted for 63 operated patients with pathologically confirmed stage IA clear cell histology and stage IC unilateral non-clear cell histology grade 1/grade 2. The primary endpoint is 5-year overall survival. Secondary endpoints are other survival endpoints and factors related to reproduction. This trial has been registered at the UMIN Clinical Trials Registry as UMIN000013380. PMID:26059697

  3. Evaluation of the Efficacy and Safety of Short-Course Deep Sedation Therapy for the Treatment of Intracerebral Hemorrhage After Surgery: A Non-Randomized Control Study.

    PubMed

    Hou, Dapeng; Liu, Beibei; Zhang, Juan; Wang, Qiushi; Zheng, Wei

    2016-01-01

    BACKGROUND While mild and moderate sedation have been widely used to reduce sudden agitation in intracerebral hemorrhage (ICH) patients after surgery, agitation is still a frequent problem, which may cause postoperative blood pressure fluctuation. The present study aimed to evaluate the efficacy and safety of short-course deep sedation for the treatment of ICH after surgery. MATERIAL AND METHODS A total of 41 ICH patients who received surgery, including traditional craniotomy hematoma removal and decompressive craniectomy, were included in this non-randomized control study. Patients in the deep sedation group received continuous postoperative sedation for a target course of ≤12 hours and reached SAS scores of 1~2. Patients in the traditional sedation group received continuous light sedation and reached SAS scores of 3~4. Additional therapeutic interventions included antihypertensive treatment, mechanical ventilation, tracheotomy, and re-operation. RESULTS Patients in the deep sedation group had a deeper degree of sedation and lower systolic blood pressure (SBP) and diastolic blood pressure (DBP). Residual hematomas after surgery were smaller in the deep sedation group on the second, seventh, and fourteenth days after surgery (p=0.023, 0.003, 0.004, respectively). The 3-month mortality was lower and quality of life was better in the deep sedation group than in the traditional sedation group (p=0.044, p<0.01, respectively). No significant differences in the incidence of ventilator-associated pneumonia (VAP) or ICU days were observed between the two groups. CONCLUSIONS Short-course deep sedation therapy in ICH patients after surgery is efficient in controlling postoperative blood pressure, reducing re-bleeding, and improving clinical prognosis. PMID:27466863

  4. Effectiveness of a peer-led HIV prevention intervention in secondary schools in Rwanda: results from a non-randomized controlled trial

    PubMed Central

    2012-01-01

    Background While the HIV epidemic is levelling off in sub-Saharan Africa, it remains at an unacceptably high level. Young people aged 15-24 years remain particularly vulnerable, resulting in a regional HIV prevalence of 1.4% in young men and 3.3% in young women. This study assesses the effectiveness of a peer-led HIV prevention intervention in secondary schools in Rwanda on young people’s sexual behavior, HIV knowledge and attitudes. Methods In a non-randomized longitudinal controlled trial, fourteen schools were selected in two neighboring districts in Rwanda: Bugesera (intervention) and Rwamagana (control). Students (n = 1950) in eight intervention and six control schools participated in three surveys (at baseline, and at six and twelve months into the intervention). Analysis was done using linear and logistic regression with generalized estimating equations, adjusted for propensity score. Results The overall retention rate was 72%. Time trends in sexual risk behavior (being sexually active, sex in the last six months, condom use at last sex) were not significantly different between students from intervention and control schools, nor was the intervention associated with increased knowledge, perceived severity or perceived susceptibility. It did significantly reduce reported stigma. Conclusions Analyzing this and other interventions, we identified several reasons for the observed limited effectiveness of peer education: 1) intervention activities (spreading information) are not tuned to objectives (changing behavior); 2) young people prefer receiving HIV information from sources other than peers; 3) outcome indicators are not adequate, and the context of the relationship in which sex occurs is ignored. The effectiveness of peer education may increase through integration in holistic interventions and by redefining peer educators’ role as focal points for sensitization and referral to experts and services. Finally, we argue that a narrow focus on

  5. Rock magnetic evidence of non-random raw material selection criteria in Cerro Toledo Obsidian Artifacts from Valles Caldera, New Mexico

    NASA Astrophysics Data System (ADS)

    Gregovich, A.; Feinberg, J. M.; Steffen, A.; Sternberg, R. S.

    2014-12-01

    Stone tools are one of the most enduring forms of ancient human behavior available to anthropologists. The geologic materials that comprise stone tools are a reflection of the rocks that were available locally or through trade, as are the intended use of the tools and the knapping technology needed to produce them. Investigation of the rock magnetic and geochemical characteristics of the artifacts and the geological source materials provides a baseline to explore these past behaviors. This study uses rock magnetic properties to explore the raw material selection criteria involved in the production of obsidian tools in the region around Valles Caldera in northern New Mexico. Obsidian is locally abundant and was traded by tribes across the central United States. Here we compare the rock magnetic properties of a sample of obsidian projectile points (N = 25) that have been geochemically sourced to the Cerro Toledo obsidian flow with geological samples collected from four sites within the same flow (N = 135). This collection of archaeological artifacts, albeit small, contains representatives of at least 8 different point styles that were used over 6000 years from the Archaic into the Late Prehistoric. Bulk rock hysteresis parameters (Mr, Ms, Bc, and Bcr) and low-field susceptibility (χ) measurements show that the projectile points generally contain a lower concentration of magnetic minerals than the geologic samples. For example, the artifacts' median Ms value is 2.9 × 10⁻³ A m² kg⁻¹, while that of the geological samples is 6.5 × 10⁻³ A m² kg⁻¹. The concentration of magnetic minerals in obsidian is a proxy for the concentration of microlites in general, and this relationship suggests that although obsidian was locally abundant, toolmakers employed non-random selection criteria resulting in generally lower concentrations of microlites in their obsidian tools.
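
    A comparison of the two Ms distributions could be made non-parametrically, for example with a Mann-Whitney U test as sketched below; the values are simulated around the reported medians (SciPy is assumed available) and are not the measured data.

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(11)
        # Simulated Ms values (A m^2/kg) centred on the reported medians.
        ms_artifacts = rng.lognormal(mean=np.log(2.9e-3), sigma=0.4, size=25)
        ms_geologic = rng.lognormal(mean=np.log(6.5e-3), sigma=0.4, size=135)

        stat, p = mannwhitneyu(ms_artifacts, ms_geologic, alternative="less")
        print("median artifact Ms :", float(np.median(ms_artifacts)))
        print("median geologic Ms :", float(np.median(ms_geologic)))
        print("one-sided p-value  :", p)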

  6. Evaluation of the Efficacy and Safety of Short-Course Deep Sedation Therapy for the Treatment of Intracerebral Hemorrhage After Surgery: A Non-Randomized Control Study

    PubMed Central

    Hou, Dapeng; Liu, Beibei; Zhang, Juan; Wang, Qiushi; Zheng, Wei

    2016-01-01

    Background While mild and moderate sedation have been widely used to reduce sudden agitation in intracerebral hemorrhage (ICH) patients after surgery, agitation is still a frequent problem, which may cause postoperative blood pressure fluctuation. The present study aimed to evaluate the efficacy and safety of short-course deep sedation for the treatment of ICH after surgery. Material/Methods A total of 41 ICH patients who received surgery, including traditional craniotomy hematoma removal and decompressive craniectomy, were included in this non-randomized control study. Patients in the deep sedation group received continuous postoperative sedation for a target course of ≤12 hours and reached SAS scores of 1~2. Patients in the traditional sedation group received continuous light sedation and reached SAS scores of 3~4. Additional therapeutic interventions included antihypertensive treatment, mechanical ventilation, tracheotomy, and re-operation. Results Patients in the deep sedation group had a deeper degree of sedation and lower systolic blood pressure (SBP) and diastolic blood pressure (DBP). Residual hematomas after surgery were smaller in the deep sedation group on the second, seventh, and fourteenth days after surgery (p=0.023, 0.003, 0.004, respectively). The 3-month mortality was lower and quality of life was better in the deep sedation group than in the traditional sedation group (p=0.044, p<0.01, respectively). No significant differences in the incidence of ventilator-associated pneumonia (VAP) or ICU days were observed between the two groups. Conclusions Short-course deep sedation therapy in ICH patients after surgery is efficient in controlling postoperative blood pressure, reducing re-bleeding, and improving clinical prognosis. PMID:27466863

  7. The post-pollination ethylene burst and the continuation of floral advertisement are harbingers of non-random mate selection in Nicotiana attenuata.

    PubMed

    Bhattacharya, Samik; Baldwin, Ian T

    2012-08-01

    The self-compatible plant Nicotiana attenuata grows in genetically diverse populations after fires, and produces flowers that remain open for 3 days and are visited by assorted pollinators. To determine whether and when post-pollination non-random mate selection occurs among self and non-self pollen, seed paternity and semi-in vivo pollen tube growth were determined in controlled single/mixed pollinations. Despite all pollen sources being equally proficient in siring seeds in single-genotype pollinations, self pollen was consistently selected in mixed pollinations, irrespective of maternal genotype. However, clear patterns of mate discrimination occurred amongst non-self pollen when mixed pollinations were performed soon after corollas open, including selection against hygromycin B resistance (transformation selectable marker) in wild-type styles and for it in transformed styles. However, mate choice among pollen genotypes was completely shut down in plants transformed to be unable to produce (irACO) or perceive (ETR1) ethylene. The post-pollination ethylene burst, which originates primarily from the stigma and upper style, was strongly correlated with mate selection in single and mixed hand-pollinations using eight pollen donors in two maternal ecotypes. The post-pollination ethylene burst was also negatively correlated with the continuation of emission of benzylacetone, the most abundant pollinator-attracting corolla-derived floral volatile. We conclude that ethylene signaling plays a pivotal role in mate choice, and the post-pollination ethylene burst and the termination of benzylacetone release are accurate predictors, both qualitatively and quantitatively, of pre-zygotic mate selection and seed paternity. PMID:22458597

  8. The treatment of snoring by radiofrequency-assisted uvulopalatoplasty and results after one-session protocol: a prospective, longitudinal, non-randomized study.

    PubMed

    Chiesa Estomba, Carlos Miguel; Rivera Schmitz, Teresa; Ossa Echeverri, Carla Cristina; Betances Reinoso, Frank Alberto; Fariña Conde, José; Alonso Parraga, Dionisio

    2015-10-01

    Snoring is usually caused by the vibration of the walls of the soft palate at the pharyngeal level. Its worldwide prevalence is estimated to range between 2 and 85% depending on age, gender or population group. The aim of this study was to determine the degree of subjective improvement in patients treated for snoring with radiofrequency-assisted uvulopalatoplasty based on a one-session protocol. This is a prospective, longitudinal, non-randomized study. Patients of both sexes, aged over 18 years, who attended the ENT consultation of a tertiary hospital with snoring during the period July 2012-July 2013 were included. Age, body mass index (BMI) and Epworth sleepiness scale (ESS) score were recorded. The volume of snoring of each subject was assessed using a visual analog scale (VAS). A total of 27 patients were included in the study; the average age of the sample was 49 years (±8.7; min 36/max 74); 22 (81.5%) were male and 5 (18.5%) female. The average BMI was 27.07 ± 2.5 (min 23.15/max 29.39) before treatment and 26.75 ± 2.32 (min 23.11/max 29.56) after 1 year, with no statistically significant difference in BMI before and after surgery (p = 0.407). Preoperative snoring intensity was 8.10 ± 0.93 according to the VAS. We found a statistically significant reduction in post-operative intensity at 3 months (3.93 ± 0.88, p ≤ 0.05), at 6 months (4.41 ± 1.08, p ≤ 0.05), and after 1 year (4.90 ± 0.77, p ≤ 0.05). The average ESS score was significantly higher preoperatively than post-operatively, being 8.76 ± 3.1 preoperative and 6.93 ± 1.68 post-operative (p ≤ 0.05). We conclude that simple snorers with an apnea/hypopnea index <15 events per hour and a BMI <30 kg/m², in whom the soft palate has been clinically shown to be the source of snoring, can be treated with a one-session radiofrequency protocol, with improvement of snoring obtained in up to 70% of cases over a short follow-up period. PMID:25837987

  9. Non-randomized studies as a source of complementary, sequential or replacement evidence for randomized controlled trials in systematic reviews on the effects of interventions.

    PubMed

    Schünemann, Holger J; Tugwell, Peter; Reeves, Barnaby C; Akl, Elie A; Santesso, Nancy; Spencer, Frederick A; Shea, Beverley; Wells, George; Helfand, Mark

    2013-03-01

    The terms applicability, generalizability, external validity and transferability are related, sometimes used interchangeably and have in common that they lack a clear and consistent definition in the classic epidemiological literature. However, all of these terms generally describe one overarching theme: whether or not available research evidence can be directly utilized to answer the healthcare questions at hand, ideally supported by a judgment about the degree of confidence for this utilization. This concept has been called directness. The objectives of this paper were to delineate how non-randomized studies (NRS) inform judgments in relation to directness and the concepts that it encompasses in the context of systematic reviews. We will briefly review what is known and describe the theoretical and practical issues as well as offer guidance to those tackling the challenges of judging directness and using research evidence to answer healthcare questions with evidence from NRS. In particular, we suggest a framework in which authors can use NRS as a complement, sequence or replacement for randomized controlled trials (RCTs) by focusing on judgments about the population, intervention, comparison and outcomes. Authors of systematic reviews will use NRS to complement judgments about the inconsistencies, the rationale and credibility of subgroup analysis, the baseline risk estimates for the determination of absolute benefits and downsides, and the directness of surrogate outcomes. This evidence includes contextual or supplementary evidence. Authors of systematic reviews and other summaries of the evidence use NRS as sequential evidence when insufficient evidence is available for an outcome from RCTs but NRS evidence is available (e.g., long-term harms). Use of evidence from NRS may also serve to replace RCT evidence when NRS provide equivalent (or potentially higher) confidence in the evidence (i.e. quality) compared to indirect evidence from RCTs

  10. Eigenvalues distribution for products of independent spherical ensembles

    NASA Astrophysics Data System (ADS)

    Zeng, Xingyuan

    2016-06-01

    We consider the product of independent spherical ensembles. Owing to the special structure of the eigenvalues as a rotation-invariant determinantal point process, we show that the empirical spectral distribution of the product converges, with probability one, to a non-random distribution, and that the limiting eigenvalue distribution is a power of the spherical law. We also present an interesting correspondence between the eigenvalues of three classes of random matrix ensembles and the zeros of random polynomials.
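
    For numerical intuition only (not a proof), the spherical ensemble can be realized as the eigenvalues of A⁻¹B with A and B independent complex Ginibre matrices; the sketch below forms the product of k such matrices and inspects the empirical distribution of the eigenvalue moduli. The matrix size and the value of k are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(2)

        def ginibre(n, rng):
            # Complex Ginibre matrix: i.i.d. standard complex Gaussian entries.
            return (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2.0)

        def spherical_product_eigenvalues(n, k, rng):
            # Product of k independent A^{-1} B factors (one spherical-ensemble
            # realization per factor), followed by its eigenvalues.
            m = np.eye(n, dtype=complex)
            for _ in range(k):
                m = m @ np.linalg.solve(ginibre(n, rng), ginibre(n, rng))
            return np.linalg.eigvals(m)

        moduli = np.abs(spherical_product_eigenvalues(n=200, k=3, rng=rng))
        print("quartiles of |eigenvalue| for k = 3:",
              np.round(np.quantile(moduli, [0.25, 0.5, 0.75]), 3))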

  11. Non-Randomized Confirmatory Trial of Laparoscopy-Assisted Total Gastrectomy and Proximal Gastrectomy with Nodal Dissection for Clinical Stage I Gastric Cancer: Japan Clinical Oncology Group Study JCOG1401.

    PubMed

    Kataoka, Kozo; Katai, Hitoshi; Mizusawa, Junki; Katayama, Hiroshi; Nakamura, Kenichi; Morita, Shinji; Yoshikawa, Takaki; Ito, Seiji; Kinoshita, Takahiro; Fukagawa, Takeo; Sasako, Mitsuru

    2016-06-01

    Several prospective studies on laparoscopy-assisted distal gastrectomy for early gastric cancer have been initiated, but no prospective study evaluating laparoscopy-assisted total gastrectomy or laparoscopy-assisted proximal gastrectomy has been completed to date. A non-randomized confirmatory trial was commenced in April 2015 to evaluate the safety of laparoscopy-assisted total gastrectomy and laparoscopy-assisted proximal gastrectomy for clinical stage I gastric cancer. A total of 245 patients will be accrued from 42 Japanese institutions over 3 years. The primary endpoint is the proportion of patients with anastomotic leakage. The secondary endpoints are overall survival, relapse-free survival, proportion of patients with completed laparoscopy-assisted total gastrectomy or laparoscopy-assisted proximal gastrectomy, proportion of patients with conversion to open surgery, adverse events, and short-term clinical outcomes. The UMIN Clinical Trials Registry number is UMIN000017155. PMID:27433394

  12. Risk of Bias in Systematic Reviews of Non-Randomized Studies of Adverse Cardiovascular Effects of Thiazolidinediones and Cyclooxygenase-2 Inhibitors: Application of a New Cochrane Risk of Bias Tool

    PubMed Central

    Bilandzic, Anja; Fitzpatrick, Tiffany; Rosella, Laura; Henry, David

    2016-01-01

    Background Systematic reviews of the effects of healthcare interventions frequently include non-randomized studies. These are subject to confounding and a range of other biases that are seldom considered in detail when synthesizing and interpreting the results. Our aims were to assess the reliability and usability of a new Cochrane risk of bias (RoB) tool for non-randomized studies of interventions and to determine whether restricting analysis to studies with low or moderate RoB made a material difference to the results of the reviews. Methods and Findings We selected two systematic reviews of population-based, controlled non-randomized studies of the relationship between the use of thiazolidinediones (TZDs) and cyclooxygenase-2 (COX-2) inhibitors and major cardiovascular events. Two epidemiologists applied the Cochrane RoB tool and made assessments across the seven specified domains of bias for each of 37 component studies. Inter-rater agreement was measured using the weighted Kappa statistic. We grouped studies according to overall RoB and performed statistical pooling for (a) all studies and (b) only studies with low or moderate RoB. Kappa scores across the seven bias domains ranged from 0.50 to 1.0. In the COX-2 inhibitor review, two studies had low overall RoB, 14 had moderate RoB, and five had serious RoB. In the TZD review, six studies had low RoB, four had moderate RoB, four had serious RoB, and two had critical RoB. The pooled odds ratios for myocardial infarction, heart failure, and death for rosiglitazone versus pioglitazone remained significantly elevated when analyses were confined to studies with low or moderate RoB. However, the estimate for myocardial infarction declined from 1.14 (95% CI 1.07–1.24) to 1.06 (95% CI 0.99–1.13) when analysis was confined to studies with low RoB. Estimates of pooled relative risks of cardiovascular events with COX-2 inhibitors compared with no nonsteroidal anti-inflammatory drug changed little when analyses were
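
    Inter-rater agreement of the kind reported above is typically computed as a weighted kappa; the sketch below uses scikit-learn's linearly weighted kappa on two hypothetical ordinal risk-of-bias rating vectors, not the reviewers' actual judgements.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical ordinal risk-of-bias ratings from two raters:
        # 0 = low, 1 = moderate, 2 = serious, 3 = critical.
        rater_1 = [0, 1, 1, 2, 3, 1, 0, 2, 1, 1]
        rater_2 = [0, 1, 2, 2, 3, 1, 1, 2, 1, 0]

        kappa = cohen_kappa_score(rater_1, rater_2, weights="linear")
        print("linearly weighted kappa:", round(kappa, 2))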

  13. Non-Randomized Confirmatory Trial of Laparoscopy-Assisted Total Gastrectomy and Proximal Gastrectomy with Nodal Dissection for Clinical Stage I Gastric Cancer: Japan Clinical Oncology Group Study JCOG1401

    PubMed Central

    Kataoka, Kozo; Mizusawa, Junki; Katayama, Hiroshi; Nakamura, Kenichi; Morita, Shinji; Yoshikawa, Takaki; Ito, Seiji; Kinoshita, Takahiro; Fukagawa, Takeo; Sasako, Mitsuru

    2016-01-01

    Several prospective studies on laparoscopy-assisted distal gastrectomy for early gastric cancer have been initiated, but no prospective study evaluating laparoscopy-assisted total gastrectomy or laparoscopy-assisted proximal gastrectomy has been completed to date. A non-randomized confirmatory trial was commenced in April 2015 to evaluate the safety of laparoscopy-assisted total gastrectomy and laparoscopy-assisted proximal gastrectomy for clinical stage I gastric cancer. A total of 245 patients will be accrued from 42 Japanese institutions over 3 years. The primary endpoint is the proportion of patients with anastomotic leakage. The secondary endpoints are overall survival, relapse-free survival, proportion of patients with completed laparoscopy-assisted total gastrectomy or laparoscopy-assisted proximal gastrectomy, proportion of patients with conversion to open surgery, adverse events, and short-term clinical outcomes. The UMIN Clinical Trials Registry number is UMIN000017155. PMID:27433394

  14. Short-term intake of a Japanese-style healthy lunch menu contributes to prevention and/or improvement in metabolic syndrome among middle-aged men: a non-randomized controlled trial

    PubMed Central

    2014-01-01

    Background Metabolic syndrome is now widely appreciated as a cluster of metabolic abnormalities such as visceral obesity, hypertension, hyperglycemia and dyslipidemia. The incidence of metabolic syndrome continues to increase worldwide. In addition, low vegetable consumption has recently become a serious issue in Japan. Furthermore, Japan is facing a shortfall in places offering food that can help prevent metabolic syndrome in the first place. Our study was designed to address these issues. We conducted a non-randomized controlled trial by offering a Japanese-style healthy lunch menu to middle-aged men in a workplace cafeteria. This menu was designed to prevent and reduce metabolic syndrome. Methods This intervention study took the form of a non-randomized controlled trial. Participants chose the control or intervention group. The control group consumed their habitual lunches without restriction and only nutrient contents were assessed. The intervention group received a Japanese-style healthy lunch at a workplace cafeteria for 3 months. The participants worked in offices at a city hall and mostly had low levels of physical activity. Data from 35 males (control group: 7 males, intervention group: 28 males, mean age: 47.2 ± 7.9 years) were collected and analyzed. Results Ongoing intake of a Japanese-style healthy lunch decreased blood pressure and serum lipids and increased plasma ghrelin levels. The results grew more pronounced as intake of Japanese-style healthy lunches increased in frequency. Conclusions This study presents new empirical data from an original intervention program undertaken in Japan. A Japanese-style healthy lunch menu containing many vegetables can help prevent and/or improve metabolic syndrome. PMID:24673894

  15. Non-random distribution of methyl-CpG sites and non-CpG methylation in the human rDNA promoter identified by next generation bisulfite sequencing.

    PubMed

    Pietrzak, Maciej; Rempala, Grzegorz A; Nelson, Peter T; Hetman, Michal

    2016-07-01

    Next-generation bisulfite sequencing (NGBS) was used to study rDNA promoter methylation in human brain using postmortem samples of the parietal cortex. Qualitative analysis of patterns of CpG methylation was performed at the individual rDNA unit level. CpG site-specific differences in methylation frequency were observed, with the core promoter harboring three out of the four most methylated CpGs. Moreover, there was an overall trend towards co-methylation for all possible pairs of 26 CpG sites. The hypermethylated CpGs from the core promoter were also most likely to be co-methylated. Finally, although rare, non-CpG (CpH) methylation was detected at several sites, with one of them confirmed using the PspGI-qPCR assay. Similar trends were observed in samples from control individuals as well as patients suffering from Alzheimer's disease (AD), mild cognitive impairment (MCI) or ataxia telangiectasia (AT). Taken together, while some methyl-CpG sites including those in the core promoter may have a relatively greater inhibitory effect on rRNA transcription, co-methylation at multiple sites may be required for full and/or long-lasting silencing of human rDNA. PMID:27008990

  16. New Statistical Results on the Angular Distribution of Gamma-Ray Bursts

    SciTech Connect

    Balazs, Lajos G.; Horvath, Istvan; Vavrek, Roland

    2008-05-22

    We present the results of several statistical tests of randomness in the angular sky distribution of gamma-ray bursts in the BATSE Catalog. Thirteen different tests were applied, based on Voronoi tessellation, the minimal spanning tree and the multifractal spectrum, to five classes (short1, short2, intermediate, long1, long2) of gamma-ray bursts separately. The long1 and long2 classes are distributed randomly. The intermediate subclass, in accordance with the earlier results of the authors, is distributed non-randomly. Concerning the short subclass, earlier statistical tests also suggested some departure from a random distribution, but not at a high enough confidence level. The new tests presented in this article also suggest non-randomness here.
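
    One of the test families named above (the minimal spanning tree) can be sketched as a Monte Carlo randomness test. The code below is illustrative only: it uses synthetic sky positions rather than BATSE data and compares the mean MST edge length of a sample against isotropic simulations.

```python
# Illustrative MST-based randomness test for points on the sphere (synthetic data).
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)

def isotropic_sky(n):
    """Unit vectors uniformly distributed on the sphere."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def mean_mst_edge(points):
    """Mean edge length (chord distance) of the Euclidean minimal spanning tree."""
    d = distance_matrix(points, points)
    mst = minimum_spanning_tree(d)
    return mst.data.mean()          # nonzero entries are the tree's edge lengths

observed = isotropic_sky(200)       # stand-in for a catalogued burst sample
stat_obs = mean_mst_edge(observed)

# Null distribution of the statistic under isotropy
null = np.array([mean_mst_edge(isotropic_sky(200)) for _ in range(200)])
p_value = np.mean(null <= stat_obs)  # unusually short edges suggest clustering
print(f"observed mean MST edge = {stat_obs:.3f}, one-sided p = {p_value:.3f}")
```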

  17. Non-random sharing of Plantae genes.

    PubMed

    Chan, Cheong Xin; Bhattacharya, Debashish

    2011-05-01

    The power of eukaryote genomics relies strongly on taxon sampling. This point was underlined in a recent analysis of red algal genome evolution in which we tested the Plantae hypothesis that posits the monophyly of red, green (including plants) and glaucophyte algae. The inclusion of novel genome data from two mesophilic red algae enabled us to robustly demonstrate the sisterhood of red and green algae in the tree of life. Perhaps more exciting was the finding that >1,800 putative genes in the unicellular red alga Porphyridium cruentum showed evidence of gene-sharing with diverse lineages of eukaryotes and prokaryotes. Here we assessed the correlation between the putative functions of these shared genes and their susceptibility to transfer. It turns out that genes involved in complex interactive networks such as biological regulation and transcription/translation are less susceptible to endosymbiotic or horizontal gene transfer, when compared to genes with metabolic and transporter functions. PMID:21980581

  18. Interventions for physical activity promotion applied to the primary healthcare settings for people living in regions of low socioeconomic level: study protocol for a non-randomized controlled trial

    PubMed Central

    2014-01-01

    Background Regular physical activity practice has been widely recommended for promoting health, but the physical activity levels remain low in the population. Therefore, the study of interventions to promote physical activity is essential. Objective: To present the methodology of two physical activity interventions from the “Ambiente Ativo” (“Active Environment”) project. Methods 12-month non-randomized controlled intervention trial. 157 healthy and physically inactive individuals were selected: health education (n = 54) supervised exercise (n = 54) and control (n = 49). Intervention based on health education: a multidisciplinary team of health professionals organized the intervention in group discussions, phone calls, SMS and educational material. Intervention based on supervised exercise program: consisted of offering an exercise program in groups supervised by physical education professionals involving strength, endurance and flexibility exercises. The physical activity level was assessed by the International Physical Activity Questionnaire (long version), physical activities recalls, pedometers and accelerometers over a seven-day period. Result This study described two different proposals for promoting physical activity that were applied to adults attended through the public healthcare settings. The participants were living in a region of low socioeconomic level, while respecting the characteristics and organization of the system and its professionals, and also adapting the interventions to the realities of the individuals attended. Conclusion Both interventions are applicable in regions of low socioeconomic level, while respecting the social and economic characteristics of each region. Trial registration ClinicalTrials.gov NCT01852981 PMID:24624930

  19. Effectiveness of Kenya's Community Health Strategy in delivering community-based maternal and newborn health care in Busia County, Kenya: non-randomized pre-test post test study

    PubMed Central

    Wangalwa, Gilbert; Cudjoe, Bennett; Wamalwa, David; Machira, Yvonne; Ofware, Peter; Ndirangu, Meshack; Ilako, Festus

    2012-01-01

    Background Maternal mortality ratio and neonatal mortality rate trends in Kenya have remained unacceptably high for a decade. In 2007, the Ministry of Public Health and Sanitation adopted a community health strategy to reverse the poor health outcomes in order to meet Millennium Development Goals 4 and 5. It aims at strengthening community participation and its ability to take action towards health. The study aimed at evaluating the effectiveness of the strategy in improving maternal and neonatal health outcomes in Kenya. Methods Between 2008 and 2010, the African Medical and Research Foundation implemented a community-based maternal and newborn care intervention package in Busia County using the community health strategy approach. An interventional, non-randomized pre-test post-test study design was used to evaluate change in essential maternal and neonatal care practices among mothers with children aged 0-23 months. Results There was a statistically significant (p < 0.05) increase in attendance of at least four antenatal care visits (39% to 62%), deliveries by skilled birth attendants (31% to 57%), receiving intermittent preventive treatment (23% to 57%), testing for HIV during pregnancy (73% to 90%) and exclusive breastfeeding (20% to 52%). Conclusion The significant increase in essential maternal and neonatal care practices demonstrates that the community health strategy is an appropriate platform to deliver community-based interventions. The findings will be used by actors in the child survival community to improve current approaches, policies and practice in maternal and neonatal care. PMID:23467438

  20. RANDOMNESS of Numbers DEFINITION(QUERY:WHAT? V HOW?) ONLY Via MAXWELL-BOLTZMANN CLASSICAL-Statistics(MBCS) Hot-Plasma VS. Digits-Clumping Log-Law NON-Randomness Inversion ONLY BOSE-EINSTEIN QUANTUM-Statistics(BEQS) .

    NASA Astrophysics Data System (ADS)

    Siegel, Z.; Siegel, Edward Carl-Ludwig

    2011-03-01

    RANDOMNESS of Numbers cognitive-semantics DEFINITION VIA Cognition (QUERY: WHAT???, NOT HOW?) VS. computer-"science" mindLESS number-crunching (Harrel-Sipser-...) algorithmics Goldreich "PSEUDO-randomness"[Not.AMS(02)] mea-culpa is ONLY via MAXWELL-BOLTZMANN CLASSICAL-STATISTICS(NOT FDQS!!!) "hot-plasma" REPULSION VERSUS Newcomb(1881)-Weyl(1914;1916)-Benford(1938) "NeWBe" logarithmic-law digit-CLUMPING/CLUSTERING NON-Randomness simple Siegel[AMS Joint.Mtg.(02)-Abs. # 973-60-124] algebraic-inversion to THE QUANTUM and ONLY BEQS preferentially SEQUENTIALLY lower-DIGITS CLUMPING/CLUSTERING with d = 0 BEC, is ONLY VIA Siegel-Baez FUZZYICS=CATEGORYICS (SON OF TRIZ)/"Category-Semantics"(C-S), latter intersection/union of Lawvere(1964)-Siegel(1964) category-theory (matrix: MORPHISMS V FUNCTORS) "+" cognitive-semantics" (matrix: ANTONYMS V SYNONYMS) yields Siegel-Baez FUZZYICS=CATEGORYICS/C-S tabular list-format matrix truth-table analytics: MBCS RANDOMNESS TRUTH/EMET!!!
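
    For readers unfamiliar with the Newcomb-Weyl-Benford logarithmic digit law invoked above, the following sketch (synthetic data, unrelated to the cited work) compares observed leading-digit frequencies against the Benford expectation P(d) = log10(1 + 1/d) with a chi-square test.

```python
# Illustrative first-digit (Benford) check on synthetic log-normal values.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(1)
values = rng.lognormal(mean=3.0, sigma=2.0, size=5000)

first_digits = np.array([int(f"{v:.6e}"[0]) for v in values])   # leading digit of each value
observed = np.bincount(first_digits, minlength=10)[1:10]

digits = np.arange(1, 10)
benford = np.log10(1.0 + 1.0 / digits)          # Benford probabilities, sum to 1
expected = benford * first_digits.size

chi2, p = chisquare(observed, f_exp=expected)
for d, o, e in zip(digits, observed, expected):
    print(f"digit {d}: observed {o:5d}, Benford-expected {e:7.1f}")
print(f"chi-square = {chi2:.1f}, p = {p:.3f}")
```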

  1. Impact of a Multifaceted and Clinically Integrated Training Program in Evidence-Based Practice on Knowledge, Skills, Beliefs and Behaviour among Clinical Instructors in Physiotherapy: A Non-Randomized Controlled Study

    PubMed Central

    Olsen, Nina Rydland; Bradley, Peter; Espehaug, Birgitte; Nortvedt, Monica Wammen; Lygren, Hildegunn; Frisk, Bente; Bjordal, Jan Magnus

    2015-01-01

    Background and Purpose Physiotherapists practicing at clinical placement sites assigned the role as clinical instructors (CIs), are responsible for supervising physiotherapy students. For CIs to role model evidence-based practice (EBP) they need EBP competence. The aim of this study was to assess the short and long term impact of a six-month multifaceted and clinically integrated training program in EBP on the knowledge, skills, beliefs and behaviour of CIs supervising physiotherapy students. Methods We invited 37 CIs to participate in this non-randomized controlled study. Three self-administered questionnaires were used pre- and post-intervention, and at six-month follow-up: 1) The Adapted Fresno test (AFT), 2) the EBP Belief Scale and 3) the EBP Implementation Scale. The analysis approach was linear regression modeling using Generalized Estimating Equations. Results In total, 29 CIs agreed to participate in the study: 14 were invited to participate in the intervention group and 15 were invited to participate in the control group. One in the intervention group and five in the control group were lost to follow-up. At follow-up, the group difference was statistically significant for the AFT (mean difference = 37, 95% CI 15.9 to 58.1, p < 0.001) and the EBP Beliefs scale (mean difference = 8.1, 95% CI 3.1 to 13.2, p = 0.002), but not for the EBP Implementation scale (mean difference = 1.8, 95% CI -4.5 to 8.1, p = 0.574). Comparing measurements over time, we found a statistically significant increase in mean scores related to all outcome measures for the intervention group only. Conclusions A multifaceted and clinically integrated training program in EBP was successful in improving EBP knowledge, skills and beliefs among CIs. Future studies need to ensure long-term EBP behaviour change, in addition to assessing CIs’ abilities to apply EBP knowledge and skills when supervising students. PMID:25894559

  2. Does school-based physical activity decrease overweight and obesity in children aged 6–9 years? A two-year non-randomized longitudinal intervention study in the Czech Republic

    PubMed Central

    2012-01-01

    Background Globally, efforts aimed at the prevention of childhood obesity have led to the implementation of a range of school-based interventions. This study assessed whether augmenting physical activity (PA) within the school setting resulted in increased daily PA and decreased overweight/obesity levels in 6-9-year-old children. Methods Across the first to third primary school years, the PA of 84 girls and 92 boys was objectively monitored five times (each for seven successive days) using a Yamax pedometer (step counts) and a Caltrac accelerometer (activity energy expenditure, AEE; kcal/kg per day). Four schools were selected to participate in the research (2 intervention, 2 control), comprising intervention (43 girls, 45 boys) and control children (41 girls, 47 boys). The study was non-randomized and the intervention schools were selected on the basis of an existing PA-conducive environment. Analyses of variance (ANOVA) for repeated measures examined the PA programme and gender effects on the step counts and AEE. Logistic regression (Enter method) determined the prospect of overweight and obesity occurrence over the course of implementation of the PA intervention. Results There was a significant increase in school-based PA during schooldays in intervention children (from ≈ 1718 to ≈ 3247 steps per day, and from 2.1 to ≈ 3.6 kcal/kg per day) in comparison with the control children. Increased school-based PA of intervention children during schooldays contributed to them achieving >10,500 steps and >10.5 kcal/kg per school day across the 2 years of the study, and halted the decline in PA levels that is known to be associated with the increasing age of children. Increased school-based PA also had a positive impact on leisure-time PA on schooldays and on PA at weekends in intervention children. One year after the start of the PA intervention, the odds of being overweight or obese in the intervention children was almost three times lower than that of

  3. Analysis of the irregular planar distribution of proteins in membranes.

    PubMed

    Hui, S W; Frank, J

    1985-03-01

    Methods to characterize the irregular but non-random planar distribution of proteins in biological membranes were investigated. The distribution of the proteins constituting the intramembranous particles (IMP) in human erythrocyte membranes was used as an example. The distribution of IMPs was deliberately altered by experimental means. For real space analyses, the IMP positions in freeze-fracture micrographs were determined by the automatic procedure described. Radial distribution and autocorrelation analysis revealed quantitative differences between experimental groups. These methods are more sensitive than the corresponding optical diffraction or Fourier-Bessel analyses of the same IMP distribution data, due to the inability of the diffraction methods to separate contrast and distribution effects. A method to identify IMPs on a non-uniform background is described. PMID:3999133
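
    A radial distribution (pair correlation) analysis of a planar point pattern, as used above for intramembranous particles, can be outlined as follows. The coordinates are synthetic and edge corrections are omitted, so this is only a sketch of the approach rather than the procedure used in the cited work.

```python
# Illustrative radial distribution (pair correlation) estimate for a 2-D point pattern.
# Positions are synthetic; edge effects are ignored for brevity.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(7)
L = 1.0                                           # side of the square field of view
points = rng.uniform(0.0, L, size=(500, 2))       # stand-in for particle coordinates

d = pdist(points)                                 # all pairwise distances
r_edges = np.linspace(0.0, 0.15, 16)
counts, _ = np.histogram(d, bins=r_edges)

n = len(points)
density = n / L**2
r_lo, r_hi = r_edges[:-1], r_edges[1:]
ring_area = np.pi * (r_hi**2 - r_lo**2)
expected = 0.5 * n * density * ring_area          # expected pairs per bin under randomness

g = counts / expected                             # pair correlation estimate g(r)
for r, gr in zip(0.5 * (r_lo + r_hi), g):
    print(f"r = {r:.3f}  g(r) = {gr:.2f}")
# g(r) > 1 at a given separation indicates clustering; g(r) < 1 indicates ordering/repulsion.
```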

  4. Evaluating the Effectiveness of an Antimicrobial Stewardship Program on Reducing the Incidence Rate of Healthcare-Associated Clostridium difficile Infection: A Non-Randomized, Stepped Wedge, Single-Site, Observational Study

    PubMed Central

    McArthur, Leslie

    2016-01-01

    Background The incidence rate of healthcare-associated Clostridium difficile infection (HA-CDI) is estimated at 1 in 100 patients. Antibiotic exposure is the most consistently reported risk factor for HA-CDI. Strategies to reduce the risk of HA-CDI have focused on reducing antibiotic utilization. Prospective audit and feedback is a commonly used antimicrobial stewardship intervention (ASi). The impact of this ASi on risk of HA-CDI is equivocal. This study examines the effectiveness of a prospective audit and feedback ASi on reducing the risk of HA-CDI. Methods Single-site, 339-bed community hospital in Barrie, Ontario, Canada. The primary outcome is HA-CDI incidence rate. Daily prospective audit and feedback ASi is the exposure variable. The ASi was implemented across 6 wards in a non-randomized, stepped wedge design. Criteria for the ASi: any intravenous antibiotic use for ≥ 48 hrs, any oral fluoroquinolone or oral second generation cephalosporin use for ≥ 48 hrs, or any antimicrobial use for ≥ 5 days. HA-CDI cases and model covariates were aggregated by ward, year and month starting September 2008 and ending February 2016. Multi-level mixed-effects negative binomial regression analysis was used to model the primary outcome, with intercept and slope coefficients for ward-level random effects estimated. Other covariates tested for inclusion in the final model were derived from previously published risk factors. Deviance residuals were used to assess the model’s goodness-of-fit. Findings The dataset included 486 observation periods, of which 350 were control periods and 136 were intervention periods. After accounting for all other model covariates, the estimated overall ASi incidence rate ratio (IRR) was 0.48 (95% CI 0.30, 0.79). The ASi effect was independent of antimicrobial utilization. The ASi did not seem to reduce the risk of Clostridium difficile infection on the surgery wards (IRR 0.87, 95% CI 0.45, 1.69) compared to the medicine wards (IRR 0.42, 95% CI 0.28, 0.63). The ward
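
    The count-rate model described above can be approximated with a much simpler single-level sketch. The code below uses synthetic monthly data, omits the ward-level random effects, and fits a negative binomial GLM in statsmodels with patient-days as the exposure; it is not the study's actual model, only an illustration of how an incidence rate ratio for the intervention is estimated.

```python
# Simplified sketch (synthetic data, no ward-level random effects): negative binomial
# regression of monthly HA-CDI counts on an intervention indicator, with patient-days
# as the exposure offset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_periods = 486
intervention = rng.binomial(1, 136 / 486, size=n_periods)   # 0 = control, 1 = ASi period
patient_days = rng.integers(800, 1600, size=n_periods)

# Simulate counts with a true incidence rate ratio of about 0.5 for the intervention,
# with gamma mixing to add over-dispersion (negative binomial counts).
base_rate = 0.004                                            # cases per patient-day
mu = base_rate * patient_days * np.where(intervention == 1, 0.5, 1.0)
dispersion = rng.gamma(shape=2.0, scale=0.5, size=n_periods) # mean 1, adds over-dispersion
cases = rng.poisson(mu * dispersion)

X = sm.add_constant(intervention.astype(float))
model = sm.GLM(cases, X, family=sm.families.NegativeBinomial(alpha=0.5),
               exposure=patient_days)
result = model.fit()

irr = np.exp(result.params[1])
ci_low, ci_high = np.exp(result.conf_int()[1])
print(f"Estimated IRR for the ASi: {irr:.2f} (95% CI {ci_low:.2f}, {ci_high:.2f})")
```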

  5. On The Distribution Of Angular Orbital Elements Of Near-earth Objects

    NASA Astrophysics Data System (ADS)

    JeongAhn, Youngmin; Malhotra, R.

    2012-05-01

    The longitude of ascending node Ω and the argument of periapsis ω are expected to be randomly distributed for near-Earth objects (NEOs). However, the distribution of these angles for the Apollo, Amor and Aten subclasses, considered separately, shows some striking non-random features. We explain how these features arise due to observational biases. The distribution of Ω has maxima near 0 and 180° and is affected by observational difficulty due to the galactic plane at the opposition and other seasonal effects. The ω distributions of Aten and Amor subclasses have minima at 90° and 270° while Apollos have minima at 0 and 180°. This is explained by the greater detectability of NEOs at close approach to Earth. The longitude of perihelion Ω+ω also has a strongly non-random distribution that may be owed to actual dynamical effects. Understanding the distribution of unobserved NEOs will help to improve planning for the next generation of NEO surveys. A better understanding of the intrinsic distribution of NEOs is important for estimating the impact hazard at Earth; it is also important for understanding the impact history of the Moon and the terrestrial planets.
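
    Testing whether an angular orbital element departs from a uniform distribution, as discussed above, can be sketched with a Rayleigh test and a Kolmogorov-Smirnov test. The angles below are synthetic stand-ins for catalogued values, not NEO data.

```python
# Illustrative uniformity tests for an angular element (degrees), on synthetic angles.
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(3)
omega_deg = rng.uniform(0.0, 360.0, size=500)      # stand-in for catalogued angles
theta = np.radians(omega_deg)

# Rayleigh test: a large mean resultant length R indicates a preferred direction.
# Note it is sensitive mainly to unimodal departures; the bimodal patterns described
# above would be better caught by the KS test or by testing 2*theta.
R = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
n = theta.size
p_rayleigh = np.exp(-n * R**2)                     # first-order approximation

# Kolmogorov-Smirnov test against uniformity on [0, 360)
ks_stat, p_ks = kstest(omega_deg / 360.0, "uniform")

print(f"Rayleigh: R = {R:.3f}, approx p = {p_rayleigh:.3f}")
print(f"KS: D = {ks_stat:.3f}, p = {p_ks:.3f}")
```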

  6. Distributed processing; distributed functions?

    PubMed Central

    Fox, Peter T.; Friston, Karl J.

    2016-01-01

    After more than twenty years busily mapping the human brain, what have we learned from neuroimaging? This review (coda) considers this question from the point of view of structure–function relationships and the two cornerstones of functional neuroimaging; functional segregation and integration. Despite remarkable advances and insights into the brain’s functional architecture, the earliest and simplest challenge in human brain mapping remains unresolved: We do not have a principled way to map brain function onto its structure in a way that speaks directly to cognitive neuroscience. Having said this, there are distinct clues about how this might be done: First, there is a growing appreciation of the role of functional integration in the distributed nature of neuronal processing. Second, there is an emerging interest in data-driven cognitive ontologies, i.e., that are internally consistent with functional anatomy. We will focus this review on the growing momentum in the fields of functional connectivity and distributed brain responses and consider this in the light of meta-analyses that use very large data sets to disclose large-scale structure–function mappings in the human brain. PMID:22245638

  7. Fish depth distributions in the Lower Mississippi River

    USGS Publications Warehouse

    Killgore, K. J.; Miranda, Leandro E.

    2014-01-01

    A substantial body of literature exists about depth distribution of fish in oceans, lakes and reservoirs, but less is known about fish depth distribution in large rivers. Most of the emphasis on fish distributions in rivers has focused on longitudinal and latitudinal spatial distributions. Knowledge on depth distribution is necessary to understand species and community habitat needs. Considering this void, our goal was to identify patterns in fish benthic distribution along depth gradients in the Lower Mississippi River. Fish were collected over 14 years in depths down to 27 m. Fish exhibited non-random depth distributions that varied seasonally and according to species. Species richness was highest in shallow water, with about 50% of the 62 species detected no longer collected in water deeper than 8 m and about 75% no longer collected in water deeper than 12 m. Although richness was highest in shallow water, most species were not restricted to shallow water. Rather, most species used a wide range of depths. A weak depth zonation occurred, not as strong as that reported for deep oceans and lakes. Larger fish tended to occur in deeper water during the high-water period of an annual cycle, but no correlation was evident during the low-water period. The advent of landscape ecology has guided river research to search for spatial patterns along the length of the river and associated floodplains. Our results suggest that fish assemblages in large rivers are also structured vertically. 

  8. On Authentication with HMAC and Non-random Properties

    NASA Astrophysics Data System (ADS)

    Rechberger, Christian; Rijmen, Vincent

    MAC algorithms can provide cryptographically secure authentication services. One of the most popular algorithms in commercial applications is HMAC based on the hash functions MD5 or SHA-1. In the light of new collision search methods for members of the MD4 family including SHA-1, the security of HMAC based on these hash functions is reconsidered.
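
    A minimal example of HMAC-based authentication with Python's standard library follows. It uses HMAC-SHA-256 rather than the HMAC-MD5/SHA-1 constructions discussed above (the API is identical), and the key and message are placeholders.

```python
# Minimal sketch of HMAC message authentication with the Python standard library.
import hmac
import hashlib

key = b"shared-secret-key"
message = b"payload to authenticate"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()
print("tag:", tag)

def verify(key: bytes, message: bytes, received_tag: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, received_tag)

print("valid:", verify(key, message, tag))
print("tampered:", verify(key, b"payload to authenticate!", tag))
```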

  9. The Use of Control in Non-Randomized Designs.

    ERIC Educational Resources Information Center

    Halperin, Si; Jorgensen, Randall

    The concept of control is fundamental to comparative research. In research designs where randomization of observational units is not possible, control has been exercised statistically from a single covariate by a process of residualization. The alternative, known as subclassification on the propensity score, was developed primarily for…

  10. From non-random molecular structure to life and mind

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1989-01-01

    The evolutionary hierarchy molecular structure-->macromolecular structure-->protobiological structure-->biological structure-->biological functions has been traced by experiments. The sequence always moves through protein. Extension of the experiments traces the formation of nucleic acids instructed by proteins. The proteins themselves were, in this picture, instructed by the self-sequencing of precursor amino acids. While the sequence indicated explains the thread of the emergence of life, protein in cellular membrane also provides the only known material basis for the emergence of mind in the context of emergence of life.

  11. Evaluating Non-Randomized Educational Interventions: A Graphical Discussion

    ERIC Educational Resources Information Center

    Theobald, Roddy; Richardson, Thomas

    2014-01-01

    A central goal of the education literature is to demonstrate that specific educational interventions--instructional interventions at the student or classroom level, structural interventions at the school level, or funding interventions at the school district level, for example--have a "treatment effect" on student achievement. This paper…

  12. Latitudinal skewness in global EQ distributions

    NASA Astrophysics Data System (ADS)

    Sasorova, E.; Levin, B.

    2009-04-01

    increases step-by-step for middle latitudes. The essential part of EQ sources in the latitudinal belts near the equator (30 deg S - 30 deg N) is located at depths of 100-500 km (deep EQs). In this case, a clearly expressed peak of deep EQs falls in the latitudinal belt 30-20 deg S. In the framework of the second project [Levin, Sasorova, 2005; Sasorova, Levin, Emelyanova, 2006], a statistical verification of the hypothesis of a non-random component in the time distribution of EQs between the northern and southern parts of the Pacific was carried out. A distribution-free test (run test with a significance level of 1%) was used to demonstrate the existence of a non-random component in the time sequences. The time sequences of EQ switching between the northern and southern parts of the Pacific region contain a statistically significant non-random component for events with 4.0 <= Mb < 6.0. The data in each magnitude range were then subdivided into two groups: deep earthquakes (H > 70 km) and shallow earthquakes (H <= 70 km). It was found that the non-random component does not exist for deep earthquakes; on the contrary, it is clearly manifested in the time distribution of the shallow events. A numerical model (a superposition of random and periodic processes) was proposed. It was shown that the manifestation of the non-random component depends on the magnitude of the studied events and on the duration of the observation period. The stability of the distributions with respect to the observation interval size was demonstrated by independent analysis of the data for several eight-year intervals.
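
    The distribution-free run test mentioned above can be sketched directly from its textbook formula. The sequence below is synthetic; a real analysis would use the time-ordered catalogue of events labelled by hemisphere.

```python
# Illustrative Wald-Wolfowitz runs test on a synthetic binary sequence of event
# locations (e.g. N = northern Pacific, S = southern Pacific).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
sequence = rng.choice(["N", "S"], size=200)        # stand-in for the ordered catalogue

x = np.array([1 if s == "N" else 0 for s in sequence])
n1, n2 = int(x.sum()), int((1 - x).sum())
runs = 1 + int(np.sum(x[1:] != x[:-1]))            # number of runs of identical symbols

mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0               # expected runs under randomness
var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
       / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
z = (runs - mu) / np.sqrt(var)
p_two_sided = 2.0 * norm.sf(abs(z))

print(f"runs = {runs}, expected = {mu:.1f}, z = {z:.2f}, p = {p_two_sided:.3f}")
# Too few runs (z << 0) suggests clustering in time; too many suggests alternation.
```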

  13. Class Size, Class Composition, and the Distribution of Student Achievement

    ERIC Educational Resources Information Center

    Bosworth, Ryan

    2014-01-01

    Using richly detailed data on fourth- and fifth-grade students in the North Carolina public school system, I find evidence that students are assigned to classrooms in a non-random manner based on observable characteristics for a substantial portion of classrooms. Moreover, I find that this non-random assignment is statistically related to class…

  14. Distributed computing

    SciTech Connect

    Chambers, F.B.; Duce, D.A.; Jones, G.P.

    1984-01-01

    CONTENTS: The Dataflow Approach: Fundamentals of dataflow. Architecture and performance. Assembler level programming. High level dataflow programming. Declarative systems: Functional programming. Logic programming and prolog. The "language first" approach. Towards a successor to von Neumann. Loosely-coupled systems: Architectures. Communications. Distributed filestores. Mechanisms for distributed control. Distributed operating systems. Programming languages. Closely-coupled systems: Architecture. Programming languages. Run-time support. Development aids. Cyba-M. Polyproc. Modeling and verification: Using algebra for concurrency. Reasoning about concurrent systems. Each chapter includes references. Index.

  15. Distributed Intelligence.

    ERIC Educational Resources Information Center

    McLagan, Patricia A.

    2003-01-01

    Distributed intelligence occurs when people in an organization take responsibility for creating innovations, solving problems, and making decisions. Organizations that have it excel in their markets and the global environment. (Author/JOW)

  16. Genomic distribution of simple sequence repeats in Brassica rapa.

    PubMed

    Hong, Chang Pyo; Piao, Zhong Yun; Kang, Tae Wook; Batley, Jacqueline; Yang, Tae-Jin; Hur, Yoon-Kang; Bhak, Jong; Park, Beom-Seok; Edwards, David; Lim, Yong Pyo

    2007-06-30

    Simple Sequence Repeats (SSRs) represent short tandem duplications found within all eukaryotic organisms. To examine the distribution of SSRs in the genome of Brassica rapa ssp. pekinensis, SSRs from different genomic regions representing 17.7 Mb of genomic sequence were surveyed. SSRs appear more abundant in non-coding regions (86.6%) than in coding regions (13.4%). Comparison of SSR densities in different genomic regions demonstrated that SSR density was greatest within the 5'-flanking regions of the predicted genes. The proportion of different repeat motifs varied between genomic regions, with trinucleotide SSRs more prevalent in predicted coding regions, reflecting the codon structure in these regions. SSRs were also preferentially associated with gene-rich regions, with peri-centromeric heterochromatin SSRs mostly associated with retrotransposons. These results indicate that the distribution of SSRs in the genome is non-random. Comparison of SSR abundance between B. rapa and the closely related species Arabidopsis thaliana suggests a greater abundance of SSRs in B. rapa, which may be due to the proposed genome triplication. Our results provide a comprehensive view of SSR genomic distribution and evolution in Brassica for comparison with the sequenced genomes of A. thaliana and Oryza sativa. PMID:17646709
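
    Locating SSRs in genomic sequence, as surveyed above, is commonly done with tandem-repeat searches; a minimal regular-expression sketch is shown below. The motif-length range, repeat-count threshold, and test sequence are arbitrary illustrative choices, not those of the cited survey.

```python
# Illustrative SSR finder: 1-6 bp motifs repeated in tandem, found with a
# back-referencing regular expression. The sequence below is a toy example.
import re

sequence = "ATGCGATATATATATCCGGAGAGAGAGTTTTTTTTAACAGCAGCAGCAGTTG"

# ([ACGT]{1,6}?)\1{3,} : a 1-6 bp motif (non-greedy, so the shortest motif wins)
# followed by at least three further copies of itself.
ssr_pattern = re.compile(r"([ACGT]{1,6}?)\1{3,}")

for match in ssr_pattern.finditer(sequence):
    motif = match.group(1)
    total_len = match.end() - match.start()
    copies = total_len // len(motif)
    print(f"motif {motif} x{copies} at position {match.start()}")
```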

  17. Distributed Leadership.

    ERIC Educational Resources Information Center

    Lashway, Larry

    2003-01-01

    School-reform efforts in recent years have stressed, and expanded, the leadership role of the principal. But in the view of many analysts, the task of transforming a school is too complex for one person to accomplish alone. Consequently, a new model of leadership is developing: distributed leadership. This Research Roundup summarizes five…

  18. Distributive Justice and Distributive Quality.

    ERIC Educational Resources Information Center

    Wexler, Jacqueline Grennan

    American higher education in this century has been almost schizophrenic in its development. As money and knowledge began to spread more distributively across the population, the population began to demand for its children a more equitable access into the world of the more privileged. Education and privilege were highly correlated. Greater access…

  19. Distributed Optimization

    NASA Technical Reports Server (NTRS)

    Macready, William; Wolpert, David

    2005-01-01

    We demonstrate a new framework for analyzing and controlling distributed systems, by solving constrained optimization problems with an algorithm based on that framework. The framework is an information-theoretic extension of conventional full-rationality game theory to allow bounded rational agents. The associated optimization algorithm is a game in which agents control the variables of the optimization problem. They do this by jointly minimizing a Lagrangian of (the probability distribution of) their joint state. The updating of the Lagrange parameters in that Lagrangian is a form of automated annealing, one that focuses the multi-agent system on the optimal pure strategy. We present computer experiments for the k-sat constraint satisfaction problem and for unconstrained minimization of NK functions.

  20. Distributed Saturation

    NASA Technical Reports Server (NTRS)

    Chung, Ming-Ying; Ciardo, Gianfranco; Siminiceanu, Radu I.

    2007-01-01

    The Saturation algorithm for symbolic state-space generation has been a recent breakthrough in the exhaustive verification of complex systems, in particular globally-asynchronous/locally-synchronous systems. The algorithm uses a very compact Multiway Decision Diagram (MDD) encoding for states and the fastest symbolic exploration algorithm to date. The distributed version of Saturation uses the overall memory available on a network of workstations (NOW) to efficiently spread the memory load during the highly irregular exploration. A crucial factor in limiting the memory consumption during the symbolic state-space generation is the ability to perform garbage collection to free up the memory occupied by dead nodes. However, garbage collection over a NOW requires a nontrivial communication overhead. In addition, operation cache policies become critical while analyzing large-scale systems using the symbolic approach. In this technical report, we develop a garbage collection scheme and several operation cache policies to help solve extremely complex systems. Experiments show that our schemes improve the performance of the original distributed implementation, SmArTNow, in terms of time and memory efficiency.

  1. Distributional assumptions in food and feed commodities- development of fit-for-purpose sampling protocols.

    PubMed

    Paoletti, Claudia; Esbensen, Kim H

    2015-01-01

    Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of non-random distribution within commodity lots, which should be a more realistic prerequisite for definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked as the prime focus is often placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS) and practically tested over 60 years provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints. PMID:25806601

  2. Fish species and community distributions as proxies for seafloor habitat distributions: The Stellwagen Bank National Marine Sanctuary example (Northwest Atlantic, Gulf of Maine)

    USGS Publications Warehouse

    Auster, P.J.; Joy, K.; Valentine, P.C.

    2001-01-01

    Defining the habitats of fishes and associated fauna on outer continental shelves is problematic given the paucity of data on the actual types and distributions of seafloor habitats. However many regions have good data on the distributions of fishes from resource surveys or catch statistics because of the economic importance of the fisheries. Fish distribution data (species or communities) have been used as a proxy for the distribution of habitats to develop precautionary conservation strategies for habitat protection (e.g., marine protected areas, fishing gear restrictions). In this study we assessed the relationships between the distributions of fish communities and species derived from trawl survey data with the spatial distribution of sediment types determined by sampling and acoustic reflectance derived from multibeam sonar surveys in Stellwagen Bank National Marine Sanctuary. Fish communities were correlated with reflectance values but all communities did not occur in unique sediment types. This suggests that use of community distributions as proxies for habitats should include the caveat that a greater number of communities within an area could indicate a greater range of habitat types. Single species distributions showed relationships between abundance and reflectance values. Trawl catches with low abundances had wide variations in reflectance values while those with high abundances had narrower ranges indicating habitat affinities. Significant non-random frequency-dependent relationships were observed for 17 of 20 species although only 12 of 20 species had significant relationships based on rank correlation. These results suggest that species distributions based on trawl survey data can be used as proxies for the distribution of seafloor habitats. Species with known habitat associations can be used to infer habitat requirements of co-occurring species and can be used to identify a range of habitat types.

  3. Fish species and community distributions as proxies for sea-floor habitat distributions: the Stellwagen Bank National Marine Sanctuary example (northwest Atlantic, Gulf Of Maine)

    USGS Publications Warehouse

    Auster, Peter J.; Joy, Kevin; Valentine, Page C.

    2001-01-01

    Defining the habitats of fishes and associated fauna on outer continental shelves is problematic given the paucity of data on the actual types and distributions of seafloor habitats. However many regions have good data on the distributions of fishes from resource surveys or catch statistics because of the economic importance of the fisheries. Fish distribution data (species or communities) have been used as a proxy for the distribution of habitats to develop precautionary conservation strategies for habitat protection (e.g., marine protected areas, fishing gear restrictions). In this study we assessed the relationships between the distributions of fish communities and species derived from trawl survey data with the spatial distribution of sediment types determined by sampling and acoustic reflectance derived from multibeam sonar surveys in Stellwagen Bank National Marine Sanctuary. Fish communities were correlated with reflectance values but all communities did not occur in unique sediment types. This suggests that use of community distributions as proxies for habitats should include the caveat that a greater number of communities within an area could indicate a greater range of habitat types. Single species distributions showed relationships between abundance and reflectance values. Trawl catches with low abundances had wide variations in reflectance values while those with high abundances had narrower ranges indicating habitat affinities. Significant non-random frequency-dependent relationships were observed for 17 of 20 species although only 12 of 20 species had significant relationships based on rank correlation. These results suggest that species distributions based on trawl survey data can be used as proxies for the distribution of seafloor habitats. Species with known habitat associations can be used to infer habitat requirements of co-occurring species and can be used to identify a range of habitat types.

  4. Distributed visualization

    SciTech Connect

    Arnold, T.R.

    1991-12-31

    Within the last half decade or so, two technological evolutions have culminated in mature products of potentially great utility to computer simulation. One is the emergence of low-cost workstations with versatile graphics and substantial local CPU power. The other is the adoption of UNIX as a de facto "standard" operating system on at least some machines offered by virtually all vendors. It is now possible to perform transient simulations in which the number-crunching capability of a supercomputer is harnessed to allow both process control and graphical visualization on a workstation. Such a distributed computer system is described as it now exists: a large FORTRAN application on a CRAY communicates with the balance of the simulation on a SUN-3 or SUN-4 via remote procedure call (RPC) protocol. The hooks to the application and the graphics have been made very flexible. Piping of output from the CRAY to the SUN is nonselective, allowing the user to summon data and draw or plot at will. The ensemble of control, application, data handling, and graphics modules is loosely coupled, which further generalizes the utility of the software design.

  5. Distributed visualization

    SciTech Connect

    Arnold, T.R.

    1991-01-01

    Within the last half decade or so, two technological evolutions have culminated in mature products of potentially great utility to computer simulation. One is the emergence of low-cost workstations with versatile graphics and substantial local CPU power. The other is the adoption of UNIX as a de facto "standard" operating system on at least some machines offered by virtually all vendors. It is now possible to perform transient simulations in which the number-crunching capability of a supercomputer is harnessed to allow both process control and graphical visualization on a workstation. Such a distributed computer system is described as it now exists: a large FORTRAN application on a CRAY communicates with the balance of the simulation on a SUN-3 or SUN-4 via remote procedure call (RPC) protocol. The hooks to the application and the graphics have been made very flexible. Piping of output from the CRAY to the SUN is nonselective, allowing the user to summon data and draw or plot at will. The ensemble of control, application, data handling, and graphics modules is loosely coupled, which further generalizes the utility of the software design.

  6. Progress in characterizing submonolayer island growth: Capture-zone distributions, growth exponents, & hot precursors

    NASA Astrophysics Data System (ADS)

    Einstein, Theodore L.; Pimpinelli, Alberto; González, Diego Luis; Morales-Cifuentes, Josue R.

    2015-09-01

    In studies of epitaxial growth, analysis of the distribution of the areas of capture zones (i.e. proximity polygons or Voronoi tessellations with respect to island centers) is often the best way to extract the critical nucleus size i. For non-random nucleation the normalized areas s of these Voronoi cells are well described by the generalized Wigner distribution (GWD) P_β(s) = a s^β exp(-b s^2), particularly in the central region 0.5 < s < 2 where data are least noisy. Extensive Monte Carlo simulations reveal inadequacies of our earlier mean-field analysis, suggesting β = i + 2 for diffusion-limited aggregation (DLA). Since simulations generate orders of magnitude more data than experiments, they permit close examination of the tails of the distribution, which differ from the simple GWD form. One refinement is based on a fragmentation model. We also compare analyses based on island-size distributions and on the scaling of island density with flux. Modifications appear for attachment-limited aggregation (ALA). We focus on the experimental system para-hexaphenyl on amorphous mica, comparing the three analysis techniques and reconciling their results via a novel rate-equation model of hot precursors, pointing out the existence of intermediate scaling regimes between DLA and ALA.
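
    The generalized Wigner distribution quoted above can be written down explicitly once its two constants are fixed by the standard constraints that P integrates to 1 and that the normalized capture-zone area s has unit mean. The sketch below evaluates P_β(s) under those constraints and checks them numerically; the choice of β values is illustrative.

```python
# Sketch of the generalized Wigner distribution P_beta(s) = a * s**beta * exp(-b * s**2),
# with a and b fixed by normalization and unit mean of s.
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

def gwd(s, beta):
    b = (gamma((beta + 2) / 2) / gamma((beta + 1) / 2)) ** 2
    a = 2.0 * b ** ((beta + 1) / 2) / gamma((beta + 1) / 2)
    return a * s**beta * np.exp(-b * s**2)

for beta in (2, 3, 4):               # e.g. beta = i + 2 for DLA, per the abstract above
    norm, _ = quad(gwd, 0, np.inf, args=(beta,))
    mean, _ = quad(lambda s: s * gwd(s, beta), 0, np.inf)
    print(f"beta = {beta}: integral = {norm:.4f}, mean = {mean:.4f}")   # both should be 1
```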

  7. Alpine bird distributions along elevation gradients: the consistency of climate and habitat effects across geographic regions.

    PubMed

    Chamberlain, Dan; Brambilla, Mattia; Caprio, Enrico; Pedrini, Paolo; Rolando, Antonio

    2016-08-01

    Many species have shown recent shifts in their distributions in response to climate change. Patterns in species occurrence or abundance along altitudinal gradients often serve as the basis for detecting such changes and assessing future sensitivity. Quantifying the distribution of species along altitudinal gradients acts as a fundamental basis for future studies on environmental change impacts, but in order for models of altitudinal distribution to have wide applicability, it is necessary to know the extent to which altitudinal trends in occurrence are consistent across geographically separated areas. This was assessed by fitting models of bird species occurrence across altitudinal gradients in relation to habitat and climate variables in two geographically separated alpine regions, Piedmont and Trentino. The ten species studied showed non-random altitudinal distributions which in most cases were consistent across regions in terms of pattern. Trends in relation to altitude and differences between regions could be explained mostly by habitat or a combination of habitat and climate variables. Variation partitioning showed that most variation explained by the models was attributable to habitat, or habitat and climate together, rather than climate alone or geographic region. The shape and position of the altitudinal distribution curve is important as it can be related to vulnerability where the available space is limited, i.e. where mountains are not of sufficient altitude for expansion. This study therefore suggests that incorporating habitat and climate variables should be sufficient to construct models with high transferability for many alpine species. PMID:27139426

  8. Predictors of the distribution and abundance of a tube sponge and its resident goby

    NASA Astrophysics Data System (ADS)

    D'Aloia, C. C.; Majoris, J. E.; Buston, P. M.

    2011-09-01

    Microhabitat specialists offer tractable systems for studying the role of habitat in determining species' distribution and abundance patterns. While factors underlying the distribution patterns of these specialists have been studied for decades, few papers have considered factors influencing both the microhabitat and the inhabitant. On the Belizean barrier reef, the obligate sponge-dwelling goby Elacatinus lori inhabits the yellow tube sponge Aplysina fistularis. We used field data and multivariate analyses to simultaneously consider factors influencing sponge and goby distributions. Sponges were non-randomly distributed across the reef with density peaking at a depth of 10-20 m. Sponge morphology also varied with depth: sponges tended to be larger and have fewer tubes with increasing depth. Knowing these patterns of sponge distribution and morphology, we considered how they influenced the distribution of two categories of gobies: residents (≥18 mm SL) and settlers (<18 mm SL). Maximum tube length, number of sponge tubes, and depth were significant predictors of resident distribution. Residents were most abundant in large sponges with multiple tubes, and were virtually absent from sponges shallower than 10 m. Similarly, maximum tube length and number of sponge tubes were significant predictors of settler distribution, with settlers most abundant in large sponges with multiple tubes. The presence or absence of residents in a sponge was not a significant predictor of settler distribution. These results provide us with a clear understanding of where sponges and gobies are found on the reef and support the hypothesis that microhabitat characteristics are good predictors of fish abundance for species that are tightly linked to microhabitat.

  9. On S.N. Bernstein's derivation of Mendel's Law and 'rediscovery' of the Hardy-Weinberg distribution.

    PubMed

    Stark, Alan; Seneta, Eugene

    2012-04-01

    Around 1923 the soon-to-be famous Soviet mathematician and probabilist Sergei N. Bernstein started to construct an axiomatic foundation of a theory of heredity. He began from the premise of stationarity (constancy of type proportions) from the first generation of offspring. This led him to derive the Mendelian coefficients of heredity. It appears that he had no direct influence on the subsequent development of population genetics. A basic assumption of Bernstein was that parents coupled randomly to produce offspring. This paper shows that a simple model of non-random mating, which nevertheless embodies a feature of the Hardy-Weinberg Law, can produce Mendelian coefficients of heredity while maintaining the population distribution. How W. Johannsen's monograph influenced Bernstein is discussed. PMID:22888285
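
    The Hardy-Weinberg distribution and the stationarity premise discussed above can be illustrated with a small worked example: one round of random mating drives arbitrary genotype frequencies to the p^2 : 2pq : q^2 proportions, which then remain constant in later generations. The starting frequencies below are arbitrary.

```python
# Worked numerical illustration of the Hardy-Weinberg distribution under random mating.
def random_mating(genotypes):
    """genotypes = (AA, Aa, aa) frequencies; returns the next generation under random mating."""
    AA, Aa, aa = genotypes
    p = AA + 0.5 * Aa          # frequency of allele A
    q = 1.0 - p                # frequency of allele a
    return (p * p, 2.0 * p * q, q * q)

gen0 = (0.50, 0.20, 0.30)      # arbitrary, non-equilibrium starting genotype frequencies
gen1 = random_mating(gen0)
gen2 = random_mating(gen1)

print("generation 0:", gen0)
print("generation 1:", tuple(round(x, 4) for x in gen1))
print("generation 2:", tuple(round(x, 4) for x in gen2))   # identical to generation 1
```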

  10. Distributive Education. Physical Distribution. Instructor's Curriculum.

    ERIC Educational Resources Information Center

    Missouri Univ., Columbia. Instructional Materials Lab.

    This distributive education performance-based instructional unit is designed to help students understand the system of physical distribution and to act as an aid to guiding students in preparing for future careers in the transportation industry dealing with the retail, wholesale, and service occupations. (Physical distribution involves the moving…

  11. A simple model for pollen-parent fecundity distributions in bee-pollinated forage legume polycrosses.

    PubMed

    Riday, Heathcliffe; Smith, Mark A; Peel, Michael D

    2015-09-01

    A simple Weibull distribution-based empirical model that predicts pollen-parent fecundity distributions based on polycross size alone has been developed in outbred forage legume species for incorporation into quantitative genetic theory. Random mating or panmixis is a fundamental assumption in quantitative genetic theory. Random mating is sometimes thought to occur in actual fact, although a large body of empirical work shows that this is often not the case in nature. Models have been developed to explain many non-random mating phenomena. This paper measured pollen-parent fecundity distributions among outbred perennial forage legume species [autotetraploid alfalfa (Medicago sativa L.), autohexaploid kura clover (Trifolium ambiguum M. Bieb.), and diploid red clover (Trifolium pratense L.)] in ten polycrosses ranging in size (N) from 9 to 94 pollinated with bee pollinators [bumble bees (Bombus impatiens Cr.) and leafcutter bees (Megachile rotundata F.)]. A Weibull distribution best fit the observed pollen-parent fecundity distributions. After standardizing data among the 10 polycrosses, a single Weibull distribution-based model was obtained with an R^2 of 0.978. The model is able to predict pollen-parent fecundity distributions based on polycross size alone. The model predicts that the effective polycross size will be approximately 9% smaller than under random mating (i.e., N_e/N ~ 0.91). The model is simple and can easily be incorporated into other models or simulations requiring a pollen-parent fecundity distribution. Further work is needed to determine how widely applicable the model is. PMID:26105686
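
    The kind of fecundity-distribution analysis described above can be sketched as follows. The data are synthetic, the Weibull fit uses SciPy's weibull_min, and the effective-number calculation shown (N_e = 1/Σp_i^2, the inverse Simpson index of paternity shares) is one common convention that may differ from the definition used in the cited study.

```python
# Illustrative sketch: fit a Weibull distribution to synthetic pollen-parent fecundity
# values and compute an "effective number of parents" from the paternity shares.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(5)
N = 50                                              # polycross size
fecundity = weibull_min.rvs(c=1.5, scale=20.0, size=N, random_state=rng)

# Fit the two-parameter Weibull (location fixed at 0)
shape, loc, scale = weibull_min.fit(fecundity, floc=0)
print(f"fitted Weibull shape = {shape:.2f}, scale = {scale:.2f}")

shares = fecundity / fecundity.sum()                # each parent's share of progeny
Ne = 1.0 / np.sum(shares**2)                        # inverse Simpson index
print(f"N = {N}, N_e = {Ne:.1f}, N_e/N = {Ne / N:.2f}")
```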

  12. Amyloplasts That Sediment in Protonemata of the Moss Ceratodon purpureus Are Nonrandomly Distributed in Microgravity1

    PubMed Central

    Kern, Volker D.; Smith, Jeffrey D.; Schwuchow, Jochen M.; Sack, Fred D.

    2001-01-01

    Little is known about whether or how plant cells regulate the position of heavy organelles that sediment toward gravity. Dark-grown protonemata of the moss Ceratodon purpureus displays a complex plastid zonation in that only some amyloplasts sediment along the length of the tip cell. If gravity is the major force determining the position of amyloplasts that sediment, then these plastids should be randomly distributed in space. Instead, amyloplasts were clustered in the subapical region in microgravity. Cells rotated on a clinostat on earth had a roughly similar non-random plastid distribution. Subapical clusters were also found in ground controls that were inverted and kept stationary, but the distribution profile differed considerably due to amyloplast sedimentation. These findings indicate the existence of as yet unknown endogenous forces and mechanisms that influence amyloplast position and that are normally masked in stationary cells grown on earth. It is hypothesized that a microtubule-based mechanism normally compensates for g-induced drag while still allowing for regulated amyloplast sedimentation. PMID:11299388

  13. Distribution and Habitat Specificity of Potentially-Toxic Microcystis across Climate, Land, and Water Use Gradients

    PubMed Central

    Marmen, Sophi; Aharonovich, Dikla; Grossowicz, Michal; Blank, Lior; Yacobi, Yosef Z.; Sher, Daniel J.

    2016-01-01

    Toxic cyanobacterial blooms are a growing threat to freshwater bodies worldwide. In order for a toxic bloom to occur, a population of cells with the genetic capacity to produce toxins must be present together with the appropriate environmental conditions. In this study, we investigated the distribution patterns and phylogeny of potentially-toxic Microcystis (indicated by the presence and/or phylogeny of the mcyD and mcyA genes). Samples were collected from the water column of almost 60 water bodies across widely differing gradients of environmental conditions and land use in Israel. Potentially toxic populations were common but not ubiquitous, detected in ~65% of the studied sites. Local environmental factors, including phosphorus and ammonia concentrations and pH, as well as regional conditions such as the distance from built areas and nature reserves, were correlated with the distribution of the mcyD gene. A specific phylogenetic clade of Microcystis, defined using the sequence of the mcyA gene, was preferentially associated with aquaculture facilities but not irrigation reservoirs. Our results reveal important environmental, geospatial, and land use parameters affecting the geographic distribution of toxinogenic Microcystis, suggesting non-random dispersal of these globally abundant toxic cyanobacteria. PMID:27014200

  14. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1987-01-01

    The use of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon is described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark-gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
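
    Two of the techniques named above, the chi-square null-hypothesis test and discrete Fourier transform analysis, can be sketched on a binned azimuthal distribution. The angles are synthetic and the harmonic normalization is a standard convention, not taken from the cited report.

```python
# Illustrative chi-square and DFT analysis of a binned azimuthal-angle distribution.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(8)
phi = rng.uniform(0.0, 2.0 * np.pi, size=2000)    # stand-in for emission azimuths

n_bins = 36
counts, _ = np.histogram(phi, bins=n_bins, range=(0.0, 2.0 * np.pi))

# Chi-square null-hypothesis test: are the counts consistent with a flat distribution?
chi2, p = chisquare(counts)                       # uniform expectation by default
print(f"chi-square = {chi2:.1f} (dof = {n_bins - 1}), p = {p:.3f}")

# Discrete Fourier transform of the counts: the relative amplitude of harmonic m
# measures the strength of m-fold azimuthal modulation (m = 1 dipole, m = 2 quadrupole).
spectrum = np.fft.rfft(counts)
amplitude = 2.0 * np.abs(spectrum[1:5]) / counts.sum()
for m, a in enumerate(amplitude, start=1):
    print(f"harmonic m = {m}: relative amplitude = {a:.3f}")
```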

  15. Amyloplasts that sediment in protonemata of the moss Ceratodon purpureus are nonrandomly distributed in microgravity

    NASA Technical Reports Server (NTRS)

    Kern, V. D.; Smith, J. D.; Schwuchow, J. M.; Sack, F. D.

    2001-01-01

    Little is known about whether or how plant cells regulate the position of heavy organelles that sediment toward gravity. Dark-grown protonemata of the moss Ceratodon purpureus display a complex plastid zonation in that only some amyloplasts sediment along the length of the tip cell. If gravity is the major force determining the position of amyloplasts that sediment, then these plastids should be randomly distributed in space. Instead, amyloplasts were clustered in the subapical region in microgravity. Cells rotated on a clinostat on earth had a roughly similar non-random plastid distribution. Subapical clusters were also found in ground controls that were inverted and kept stationary, but the distribution profile differed considerably due to amyloplast sedimentation. These findings indicate the existence of as yet unknown endogenous forces and mechanisms that influence amyloplast position and that are normally masked in stationary cells grown on earth. It is hypothesized that a microtubule-based mechanism normally compensates for g-induced drag while still allowing for regulated amyloplast sedimentation.

  16. Virio- and Bacterioplankton Microscale Distributions at the Sediment-Water Interface

    PubMed Central

    Dann, Lisa M.; Mitchell, James G.; Speck, Peter G.; Newton, Kelly; Jeffries, Thomas; Paterson, James

    2014-01-01

    The marine sediment-water interface is an important location for microbially controlled nutrient and gas exchange processes. While microbial distributions on the sediment side of the interface are well established in many locations, the distributions of microbes on the water side of the interface are less well known. Here, we measured that distribution for marine virio- and bacterioplankton with a new two-dimensional technique. Our results revealed higher heterogeneity in sediment-water interface biomass distributions than previously reported, with greater than 45- and 2500-fold changes cm−1 found within bacterial and viral subpopulations, compared to previous maxima of 1.5- and 1.4-fold cm−1 in bacteria and viruses in the same environments. The 45-fold and 2500-fold changes were due to patches of elevated and patches of reduced viral and bacterial abundance. The bacterial and viral hotspots were found over single and multiple sample points and the two groups often coincided, whilst the coldspots only occurred over single sample points and the bacterial and viral abundances showed no correlation. The total mean abundances of viruses strongly correlated with bacteria (r = 0.90, p<0.0001, n = 12) for all three microplates (n = 1350). Spatial autocorrelation analysis via Moran’s I and Geary’s C revealed non-random distributions in bacterial subpopulations and random distributions in viral subpopulations. The variable distributions of viral and bacterial abundance over centimetre-scale distances suggest that competition and the likelihood of viral infection are higher in the small volumes important for individual cell encounters than bulk measurements indicate. We conclude that large-scale measurements do not accurately represent the conditions under which microbial dynamics occur. The high variability we report indicates that few microbes experience the ‘average’ concentrations that are frequently measured. PMID:25057797
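
    For readers unfamiliar with the spatial statistics used here, the sketch below computes global Moran's I by hand for a hypothetical one-dimensional transect of abundance values (the study itself used two-dimensional microplate grids and also Geary's C); values well above the null expectation of -1/(n-1) indicate positive spatial autocorrelation, i.e. non-random clustering:

```python
# Minimal sketch of global Moran's I for a 1-D transect of abundance values.
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x and spatial weight matrix w (zero diagonal)."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    num = (w * np.outer(z, z)).sum()
    den = (z ** 2).sum()
    return (len(x) / w.sum()) * num / den

# Hypothetical bacterial abundances at 10 adjacent sample points (a "hotspot" in the middle)
abundance = np.array([5.1, 5.3, 5.0, 9.8, 10.2, 9.9, 5.2, 5.0, 4.9, 5.1])

n = len(abundance)
w = np.zeros((n, n))
for i in range(n - 1):              # contiguity weights: adjacent points are neighbours
    w[i, i + 1] = w[i + 1, i] = 1.0

print(f"Moran's I = {morans_i(abundance, w):.3f}")   # well above -1/(n-1) => positive autocorrelation
```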

  17. Conserved Noncoding Elements Follow Power-Law-Like Distributions in Several Genomes as a Result of Genome Dynamics

    PubMed Central

    Polychronopoulos, Dimitris; Sellis, Diamantis; Almirantis, Yannis

    2014-01-01

    Conserved, ultraconserved and other classes of constrained elements (collectively referred to as CNEs here), identified by comparative genomics in a wide variety of genomes, are non-randomly distributed across chromosomes. These elements are defined using various degrees of conservation between organisms and several thresholds of minimal length. We here investigate the chromosomal distribution of CNEs by studying the statistical properties of distances between consecutive CNEs. We find widespread power-law-like distributions, i.e. linearity in double logarithmic scale, in the inter-CNE distances, a feature which is connected with fractality and self-similarity. Given that CNEs are often found to be spatially associated with genes, especially with those that regulate developmental processes, we verify by appropriate gene masking that a power-law-like pattern emerges irrespective of whether elements found close to or inside genes are excluded or not. An evolutionary model is put forward for the understanding of these findings that includes segmental or whole genome duplication events and eliminations (loss) of most of the duplicated CNEs. Simulations reproduce the main features of the observed size distributions. Power-law-like patterns in the genomic distributions of CNEs are in accordance with current knowledge about their evolutionary history in several genomes. PMID:24787386
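
    A minimal sketch of the log-log check described above, using synthetic element positions rather than real CNE coordinates: bin the distances between consecutive elements and look for approximate linearity in double logarithmic scale.

```python
# Illustrative check for power-law-like inter-element distances (toy positions only).
import numpy as np

rng = np.random.default_rng(1)
# A Pareto draw stands in for a long-tailed spacing process, purely for illustration.
spacings = (rng.pareto(a=1.5, size=5000) + 1.0) * 1_000
positions = np.cumsum(spacings)

gaps = np.diff(positions)                       # distances between consecutive elements (bp)
counts, edges = np.histogram(gaps, bins=np.logspace(3, 7, 30))
centers = np.sqrt(edges[:-1] * edges[1:])       # geometric bin centres

mask = counts > 0
slope, intercept = np.polyfit(np.log10(centers[mask]), np.log10(counts[mask]), 1)
print(f"log-log slope ~ {slope:.2f}")           # a near-linear fit is consistent with power-law-like decay
```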

  18. Ethnic differences in body fat distribution among Asian pre-pubertal children: A cross-sectional multicenter study

    PubMed Central

    2011-01-01

    Background Ethnic differences in body fat distribution contribute to ethnic differences in cardiovascular morbidities and diabetes. However few data are available on differences in fat distribution in Asian children from various backgrounds. Therefore, the current study aimed to explore ethnic differences in body fat distribution among Asian children from four countries. Methods A total of 758 children aged 8-10 y from China, Lebanon, Malaysia and Thailand were recruited using a non-random purposive sampling approach to enrol children encompassing a wide BMI range. Height, weight, waist circumference (WC), fat mass (FM, derived from total body water [TBW] estimation using the deuterium dilution technique) and skinfold thickness (SFT) at biceps, triceps, subscapular, supraspinale and medial calf were collected. Results After controlling for height and weight, Chinese and Thai children had a significantly higher WC than their Lebanese and Malay counterparts. Chinese and Thais tended to have higher trunk fat deposits than Lebanese and Malays reflected in trunk SFT, trunk/upper extremity ratio or supraspinale/upper extremity ratio after adjustment for age and total body fat. The subscapular/supraspinale skinfold ratio was lower in Chinese and Thais compared with Lebanese and Malays after correcting for trunk SFT. Conclusions Asian pre-pubertal children from different origins vary in body fat distribution. These results indicate the importance of population-specific WC cut-off points or other fat distribution indices to identify the population at risk of obesity-related health problems. PMID:21703012

  19. Annual Coal Distribution

    EIA Publications

    2016-01-01

    The Annual Coal Distribution Report (ACDR) provides detailed information on domestic coal distribution by origin state, destination state, consumer category, and method of transportation. Also provided is a summary of foreign coal distribution by coal-producing state. All data for the report year are final and this report supersedes all data in the quarterly distribution reports.

  20. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2013-10-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold distribution procedure. The fold distribution provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of change in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Distribution, Proposal 13149, as Cycle 20.

  1. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2012-10-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold distribution procedure. The fold distribution provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of change in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Distribution, Proposal 12778, as Cycle 19.

  2. Distributed Data Management and Distributed File Systems

    NASA Astrophysics Data System (ADS)

    Girone, Maria

    2015-12-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  3. Distribution of 45S rDNA sites in chromosomes of plants: Structural and evolutionary implications

    PubMed Central

    2012-01-01

    Background 45S rDNA sites are the most widely documented chromosomal regions in eukaryotes. The analysis of the distribution of these sites along the chromosome in several genera has suggested some bias in their distribution. In order to evaluate whether these loci are in fact non-randomly distributed and what influence certain chromosomal and karyotypic features have on the distribution of these sites, a database was built with the position and number of 45S rDNA sites obtained by FISH together with other karyotypic data from 846 plant species. Results In angiosperms the most frequent numbers of sites per diploid karyotype were two and four, suggesting that in spite of the wide dispersion capacity of these sequences the number of rDNA sites tends to be restricted. The sites showed a preferential distribution on the short arms, mainly in the terminal regions. Curiously, these sites were frequently found on the short arms of acrocentric chromosomes where they usually occupy the whole arm. The trend to occupy the terminal region is especially evident in holokinetic chromosomes, where all of them were terminally located. In polyploids there is a trend towards reduction in the number of sites per monoploid complement. In gymnosperms, however, the distribution of rDNA sites varied strongly among the sampled families. Conclusions The location of 45S rDNA sites does not vary randomly, occurring preferentially on the short arm and in the terminal region of chromosomes in angiosperms. The meaning of this preferential location is not known, but some hypotheses are considered and the observed trends are discussed. PMID:23181612

  4. Exponentiated power Lindley distribution.

    PubMed

    Ashour, Samir K; Eltehiwy, Mahmoud A

    2015-11-01

    A new generalization of the Lindley distribution was recently proposed by Ghitany et al. [1], called the power Lindley distribution. Another generalization of the Lindley distribution was introduced by Nadarajah et al. [2], named the generalized Lindley distribution. This paper proposes a further generalization of the Lindley distribution that encompasses both. We refer to this new generalization as the exponentiated power Lindley distribution. The new distribution is important since it contains as special sub-models some widely known distributions in addition to the above two models, such as the Lindley distribution among many others. It also provides more flexibility to analyze complex real data sets. We study some statistical properties of the new distribution. We discuss maximum likelihood estimation of the distribution parameters. Least squares estimation is used to evaluate the parameters. Three algorithms are proposed for generating random data from the proposed distribution. An application of the model to a real data set is analyzed using the new distribution, which shows that the exponentiated power Lindley distribution can be used quite effectively in analyzing real lifetime data. PMID:26644927
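
    As a hedged sketch of random-variate generation for this model (assuming the exponentiated power Lindley cdf takes the form G(x) = [F(x)]^theta with the power Lindley cdf F(x) = 1 - (1 + beta*x^alpha/(beta+1))*exp(-beta*x^alpha); the paper's own three generation algorithms may differ), inverse-transform sampling with a numerical root finder works as follows:

```python
# Sketch of inverse-transform sampling under the assumed exponentiated power Lindley cdf.
import numpy as np
from scipy.optimize import brentq

def power_lindley_cdf(x, alpha, beta):
    y = beta * x ** alpha
    return 1.0 - (1.0 + y / (beta + 1.0)) * np.exp(-y)

def epl_rvs(n, alpha, beta, theta, seed=None):
    """Solve G(x) = u, i.e. F(x) = u**(1/theta), for each uniform draw u."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n) ** (1.0 / theta)
    samples = np.empty(n)
    for i, ui in enumerate(u):
        hi = 1.0
        while power_lindley_cdf(hi, alpha, beta) < ui:   # widen bracket until it spans the root
            hi *= 2.0
        samples[i] = brentq(lambda x: power_lindley_cdf(x, alpha, beta) - ui, 0.0, hi)
    return samples

x = epl_rvs(1000, alpha=1.5, beta=0.8, theta=2.0, seed=42)
print(x.mean(), x.std())
```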

  5. Exponentiated power Lindley distribution

    PubMed Central

    Ashour, Samir K.; Eltehiwy, Mahmoud A.

    2014-01-01

    A new generalization of the Lindley distribution was recently proposed by Ghitany et al. [1], called the power Lindley distribution. Another generalization of the Lindley distribution was introduced by Nadarajah et al. [2], named the generalized Lindley distribution. This paper proposes a further generalization of the Lindley distribution that encompasses both. We refer to this new generalization as the exponentiated power Lindley distribution. The new distribution is important since it contains as special sub-models some widely known distributions in addition to the above two models, such as the Lindley distribution among many others. It also provides more flexibility to analyze complex real data sets. We study some statistical properties of the new distribution. We discuss maximum likelihood estimation of the distribution parameters. Least squares estimation is used to evaluate the parameters. Three algorithms are proposed for generating random data from the proposed distribution. An application of the model to a real data set is analyzed using the new distribution, which shows that the exponentiated power Lindley distribution can be used quite effectively in analyzing real lifetime data. PMID:26644927

  6. Spatial distribution of psychotic disorders in an urban area of France: an ecological study

    PubMed Central

    Pignon, Baptiste; Schürhoff, Franck; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Saba, Ghassen; Leboyer, Marion; Kirkbride, James B.; Szöke, Andrei

    2016-01-01

    Previous analyses of neighbourhood variations of non-affective psychotic disorders (NAPD) have focused mainly on incidence. However, prevalence studies provide important insights into factors associated with disease evolution as well as for healthcare resource allocation. This study aimed to investigate the distribution of prevalent NAPD cases in an urban area in France. The number of cases in each neighbourhood was modelled as a function of potential confounders and ecological variables, namely: migrant density, economic deprivation and social fragmentation. This was modelled using statistical models of increasing complexity: frequentist models (using Poisson and negative binomial regressions), and several Bayesian models. For each model, the validity of assumptions was checked and goodness of fit to the data was compared, in order to test for possible spatial variation in prevalence. Data showed significant overdispersion (invalidating the Poisson regression model) and residual autocorrelation (suggesting the need to use Bayesian models). The best Bayesian model was Leroux’s model (i.e. a model with both strong correlation between neighbouring areas and weaker correlation between areas further apart), with economic deprivation as an explanatory variable (OR = 1.13, 95% CI [1.02–1.25]). In comparison with frequentist methods, the Bayesian model showed a better fit. The number of cases showed non-random spatial distribution and was linked to economic deprivation. PMID:27189529
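
    The overdispersion check that rules out a plain Poisson model can be illustrated with a short sketch on synthetic neighbourhood counts (not the study's data); a Pearson chi-square per residual degree of freedom well above 1 is the usual warning sign that negative binomial or spatially structured Bayesian models are needed.

```python
# Illustrative overdispersion check for neighbourhood case counts (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_areas = 120
deprivation = rng.normal(size=n_areas)                 # hypothetical ecological covariate
mu = np.exp(1.0 + 0.12 * deprivation)                  # "true" mean case count per area
cases = rng.negative_binomial(5, 5.0 / (5.0 + mu))     # overdispersed counts by construction

X = sm.add_constant(deprivation)
poisson_fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()

dispersion = poisson_fit.pearson_chi2 / poisson_fit.df_resid
print(f"Pearson chi2 / df = {dispersion:.2f}")         # values well above 1 indicate overdispersion
print(f"rate ratio per unit deprivation = {np.exp(poisson_fit.params[1]):.2f}")
```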

  7. Proximity within interphase chromosome contributes to the breakpoint distribution in radiation-induced intrachromosomal exchanges

    NASA Astrophysics Data System (ADS)

    Zhang, Ye; Uhlemeyer, Jimmy; Hada, Megumi; Asaithamby, A.; Chen, David J.; Wu, Honglu

    2014-07-01

    Previously, we reported that breaks involved in chromosome aberrations were clustered in several regions of chromosome 3 in human mammary epithelial cells after exposures to either low- or high-LET radiation. In particular, breaks in certain regions of the chromosome tended to rejoin with each other to form an intrachromosome exchange event. This study tests the hypothesis that proximity within a single chromosome in interphase cell nuclei contributes to the distribution of radiation-induced chromosome breaks. Chromosome 3 in G1 human mammary epithelial cells was hybridized with the multicolor banding in situ hybridization (mBAND) probes that distinguish the chromosome in six differently colored regions, and the location of these regions was measured with a laser confocal microscope. Results of the study indicated that, on a multi-mega base pair scale of the DNA, the arrangement of chromatin was non-random. Both telomere regions tended to be located towards the exterior of the chromosome domain, whereas the centromere region towards the interior. In addition, the interior of the chromosome domain was preferentially occupied by the p-arm of the chromatin, which is consistent with our previous finding of intrachromosome exchanges involving breaks on the p-arm and in the centromere region of chromosome 3. Other factors, such as the fragile sites in the 3p21 band and gene regulation, may also contribute to the breakpoint distribution in radiation-induced chromosome aberrations.

  8. Spatial distribution of psychotic disorders in an urban area of France: an ecological study.

    PubMed

    Pignon, Baptiste; Schürhoff, Franck; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Saba, Ghassen; Leboyer, Marion; Kirkbride, James B; Szöke, Andrei

    2016-01-01

    Previous analyses of neighbourhood variations of non-affective psychotic disorders (NAPD) have focused mainly on incidence. However, prevalence studies provide important insights into factors associated with disease evolution as well as for healthcare resource allocation. This study aimed to investigate the distribution of prevalent NAPD cases in an urban area in France. The number of cases in each neighbourhood was modelled as a function of potential confounders and ecological variables, namely: migrant density, economic deprivation and social fragmentation. This was modelled using statistical models of increasing complexity: frequentist models (using Poisson and negative binomial regressions), and several Bayesian models. For each model, the validity of assumptions was checked and goodness of fit to the data was compared, in order to test for possible spatial variation in prevalence. Data showed significant overdispersion (invalidating the Poisson regression model) and residual autocorrelation (suggesting the need to use Bayesian models). The best Bayesian model was Leroux's model (i.e. a model with both strong correlation between neighbouring areas and weaker correlation between areas further apart), with economic deprivation as an explanatory variable (OR = 1.13, 95% CI [1.02-1.25]). In comparison with frequentist methods, the Bayesian model showed a better fit. The number of cases showed non-random spatial distribution and was linked to economic deprivation. PMID:27189529

  9. Cumulative Poisson Distribution Program

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for chi-square distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
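
    The identities that CUMPOIS exploits can be checked in a few lines (an independent illustration, not the NASA program itself): the cumulative Poisson probability equals a gamma survival function with integer shape and a chi-square survival function with even degrees of freedom.

```python
# Numerical check of the Poisson/gamma/chi-square relationships.
from scipy.stats import poisson, gamma, chi2

mu, k = 4.2, 6   # Poisson mean and count threshold

p_poisson = poisson.cdf(k, mu)            # P(N <= k) for N ~ Poisson(mu)
p_gamma = gamma.sf(mu, a=k + 1)           # P(G > mu) for G ~ Gamma(shape=k+1, scale=1)
p_chi2 = chi2.sf(2 * mu, df=2 * (k + 1))  # P(X > 2*mu) for X ~ chi-square(2k+2)

print(p_poisson, p_gamma, p_chi2)         # all three agree to floating-point precision
```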

  10. Doubly Distributed Transactions

    Energy Science and Technology Software Center (ESTSC)

    2014-08-25

    Doubly Distributed Transactions (D2T) offers a technique for managing operations from a set of parallel clients with a collection of distributed services. It detects and manages faults. Example code with a test harness is also provided

  11. Distribution of Chromosome Breakpoints in Human Epithelial Cells Exposed to Low- and High-LET Radiations

    NASA Technical Reports Server (NTRS)

    Hada, Megumi; Cucinotta, Francis; Wu, Honglu

    2009-01-01

    The advantage of the multicolor banding in situ hybridization (mBAND) technique is not only its ability to identify simultaneously both inter- and intrachromosome exchanges, but also the ability to measure the breakpoint location along the length of the chromosome with a precision that is unmatched by other traditional banding techniques. Breakpoints on specific regions of a chromosome have been known to associate with specific cancers. The breakpoint distribution in cells after low- and high-LET radiation exposures will also provide the data for biophysical modeling of the chromatin structure, as well as the data for modeling the formation of radiation-induced chromosome aberrations. In a series of experiments, we studied low- and high-LET radiation-induced chromosome aberrations using the mBAND technique with chromosome 3 painted in 23 different colored bands. Human epithelial cells (CH1 84B5F5/M10) were exposed in vitro to Cs-137 γ rays at both low and high dose rates, secondary neutrons with a broad energy spectrum at a low dose rate and 600 MeV/u Fe ions at a high dose rate. The data of both inter- and intrachromosome aberrations involving the painted chromosome have been reported previously. Here we present data on the location of the chromosome breaks along the length of chromosome 3 in the cells after exposures to each of the four radiation scenarios. In comparison to the expected breakpoint distribution based on the length of the bands, the observed distribution appeared to be non-random for both the low- and high-LET radiations. In particular, hot spots towards both ends of the chromosome were found after low-LET irradiations of either low or high dose rates. For both high-LET radiation types (Fe ions and neutrons), the breakpoint distributions were similar, and were much smoother than that for low-LET radiation. The dependence of the breakpoint distribution on the radiation quality requires further investigation.

  12. Distributed Learning Metadata Standards

    ERIC Educational Resources Information Center

    McClelland, Marilyn

    2004-01-01

    Significant economies can be achieved in distributed learning systems architected with a focus on interoperability and reuse. The key building blocks of an efficient distributed learning architecture are the use of standards and XML technologies. The goal of plug and play capability among various components of a distributed learning system…

  13. Video Distribution Systems.

    ERIC Educational Resources Information Center

    Davoust, David

    1994-01-01

    Describes video distribution systems as a way of giving control of all monitors in a classroom to the teacher. Examples of their use are given, including distribution in language labs and distribution from a media lab to classrooms throughout a school building; and information about five vendors is included. (LRW)

  14. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
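
    A minimal re-creation of this kind of verification (illustrative only, not Sandia's LHS test suite): draw a sample for a target normal distribution and compare it back against that target with summary statistics, a Kolmogorov-Smirnov test and an Anderson-Darling test.

```python
# Sketch of distribution-correctness checks for a generated sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=500)    # stand-in for an LHS-generated column

print("mean/std:", sample.mean(), sample.std(ddof=1))

ks_stat, ks_p = stats.kstest(sample, "norm", args=(10.0, 2.0))
print(f"KS: D = {ks_stat:.3f}, p = {ks_p:.3f}")       # a large p-value gives no evidence against the target

ad = stats.anderson(sample, dist="norm")
print("AD statistic:", ad.statistic, "5% critical value:", ad.critical_values[2])
```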

  15. FRIB cryogenic distribution system

    NASA Astrophysics Data System (ADS)

    Ganni, V.; Dixon, K.; Laverdure, N.; Knudsen, P.; Arenius, D.; Barrios, M.; Jones, S.; Johnson, M.; Casagrande, F.

    2014-01-01

    The Michigan State University Facility for Rare Isotope Beams (MSU-FRIB) helium distribution system has been revised to include bayonet/warm valve type disconnects between each cryomodule and the transfer line distribution system, similar to the Thomas Jefferson National Accelerator Facility (JLab) and the Spallation Neutron Source (SNS) cryogenic distribution systems. The heat loads at various temperature levels and some of the features in the design of the distribution system are outlined. The present status, the plans for fabrication, and the procurement approach for the helium distribution system are also included.

  16. Cubic-normal distribution

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah; Ho, C. K.

    2015-12-01

    The power-normal distribution given by Yeo and Johnson in 2000 is a unimodal distribution with wide ranges of skewness and kurtosis. A shortcoming of the power-normal distribution is that the negative and positive parts of the underlying random variable have to be specified by two different expressions of the standard normal random variable. In this paper, we construct a new distribution, called the cubic-normal distribution, via a single polynomial expression in the cubic root function. Apart from having properties which are similar to those of the power-normal distribution, this cubic-normal distribution can be developed into a multivariate version which is more attractive from the theoretical and computational points of view.

  17. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.

  18. Distributed generation systems model

    SciTech Connect

    Barklund, C.R.

    1994-12-31

    A slide presentation is given on a distributed generation systems model developed at the Idaho National Engineering Laboratory, and its application to a situation within the Idaho Power Company's service territory. The objectives of the work were to develop a screening model for distributed generation alternatives, to develop a better understanding of distributed generation as a utility resource, and to further INEL's understanding of utility concerns in implementing technological change.

  19. Cooling water distribution system

    DOEpatents

    Orr, Richard

    1994-01-01

    A passive containment cooling system for a nuclear reactor containment vessel. Disclosed is a cooling water distribution system for introducing cooling water by gravity uniformly over the outer surface of a steel containment vessel using an interconnected series of radial guide elements, a plurality of circumferential collector elements and collector boxes to collect and feed the cooling water into distribution channels extending along the curved surface of the steel containment vessel. The cooling water is uniformly distributed over the curved surface by a plurality of weirs in the distribution channels.

  20. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    The determination of the distributions of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  1. Distribution of Chinese names

    NASA Astrophysics Data System (ADS)

    Huang, Ding-wei

    2013-03-01

    We present a statistical model for the distribution of Chinese names. Both family names and given names are studied on the same basis. With naive expectation, the distribution of family names can be very different from that of given names. One is affected mostly by genealogy, while the other can be dominated by cultural effects. However, we find that both distributions can be well described by the same model. Various scaling behaviors can be understood as a result of stochastic processes. The exponents of different power-law distributions are controlled by a single parameter. We also comment on the significance of full-name repetition in Chinese population.

  2. Smart distribution systems

    DOE PAGESBeta

    Jiang, Yazhou; Liu, Chen -Ching; Xu, Yin

    2016-04-19

    The increasing importance of system reliability and resilience is changing the way distribution systems are planned and operated. To achieve a distribution system self-healing against power outages, emerging technologies and devices, such as remote-controlled switches (RCSs) and smart meters, are being deployed. The higher level of automation is transforming traditional distribution systems into the smart distribution systems (SDSs) of the future. The availability of data and remote control capability in SDSs provides distribution operators with an opportunity to optimize system operation and control. In this paper, the development of SDSs and resulting benefits of enhanced system capabilities are discussed. A comprehensive survey is conducted on the state-of-the-art applications of RCSs and smart meters in SDSs. Specifically, a new method, called Temporal Causal Diagram (TCD), is used to incorporate outage notifications from smart meters for enhanced outage management. To fully utilize the fast operation of RCSs, the spanning tree search algorithm is used to develop service restoration strategies. Optimal placement of RCSs and the resulting enhancement of system reliability are discussed. Distribution system resilience with respect to extreme events is presented. Furthermore, test cases are used to demonstrate the benefit of SDSs. Active management of distributed generators (DGs) is introduced. Future research in a smart distribution environment is proposed.

  3. Distribution and Marketing Syllabus.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Secondary Curriculum Development.

    The distributive education program for grades 7 to 12 is organized around three career education phases: the career education phase (grades 7-10), the distributive phase (grade 11), and the competency cluster phase (grade 12). The grade 11 syllabus provides a six-page introduction which covers scheduling, cooperative work experience, the school…

  4. Advanced Distribution Management System

    NASA Astrophysics Data System (ADS)

    Avazov, Artur R.; Sobinova, Liubov A.

    2016-02-01

    This article describes the advisability of using advanced distribution management systems in the electricity distribution networks area and considers premises of implementing ADMS within the Smart Grid era. Also, it gives the big picture of ADMS and discusses the ADMS advantages and functionalities.

  5. Software distribution using xnetlib

    SciTech Connect

    Dongarra, J.J.; Rowan, T.H.; Wade, R.C.

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  6. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  7. Groundwater and Distribution Workbook.

    ERIC Educational Resources Information Center

    Ekman, John E.

    Presented is a student manual designed for the Wisconsin Vocational, Technical and Adult Education Groundwater and Distribution Training Course. This program introduces waterworks operators-in-training to basic skills and knowledge required for the operation of a groundwater distribution waterworks facility. Arranged according to the general order…

  8. The Concept of Distribution

    ERIC Educational Resources Information Center

    Wild, Chris

    2006-01-01

    This paper is a personal exploration of where the ideas of "distribution" that we are trying to develop in students come from and are leading to, how they fit together, and where they are important and why. We need to have such considerations in the back of our minds when designing learning experiences. The notion of "distribution" as a lens…

  9. Electrical Distribution Program Guide.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Dept. of Vocational Education.

    This program guide contains the standard electrical distribution curriculum for technical institutes in Georgia. The curriculum encompasses the minimum competencies required for entry-level workers in the electrical distribution field, and in job skills such as construction, maintenance, and repair of overhead and underground electrical…

  10. Metrics for Food Distribution.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in food distribution, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  11. Asymmetry and non-random orientation of the inflight effective beam pattern in the WMAP data

    SciTech Connect

    Chiang, Lung-Yih

    2014-04-20

    Tentative evidence for statistical anisotropy in the Wilkinson Microwave Anisotropy Probe data was alleged to be due to 'insufficient handling of beam asymmetries'. In this paper, we investigate this issue and develop a method to estimate the shape of the inflight effective beam, particularly the asymmetry and azimuthal orientation. We divide the whole map into square patches and exploit the information in the Fourier space. For patches containing bright extragalactic point sources, we can directly estimate their shapes, from which the inflight effective beam can be estimated. For those without, we estimate the pattern from iso-power contours in two-dimensional Fourier space. We show that the inflight effective beam convolving the signal is indeed non-symmetric for most of the sky, and it is not randomly oriented. Around the ecliptic poles, however, the asymmetry is smaller due to the averaging effect from different orientations of the beam from the scan strategy. The orientations of the effective beam with significant asymmetry are parallel to the lines of ecliptic longitude. In the foreground-cleaned Internal Linear Combination map, however, the systematic effects caused by the beam are significantly lessened.
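
    One simple way to gauge the asymmetry and orientation of such a pattern (a rough sketch under the assumption that second moments of the two-dimensional power spectrum capture the elongation; this is not the paper's pipeline) is shown below for a hypothetical elongated beam:

```python
# Estimate the principal-axis orientation of a patch's 2-D power spectrum.
import numpy as np

ny = nx = 64
y, x = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]

theta_true = np.deg2rad(30.0)                     # elongated "beam" tilted by 30 degrees
xr = x * np.cos(theta_true) + y * np.sin(theta_true)
yr = -x * np.sin(theta_true) + y * np.cos(theta_true)
patch = np.exp(-(xr ** 2 / (2 * 6.0 ** 2) + yr ** 2 / (2 * 2.0 ** 2)))

power = np.abs(np.fft.fftshift(np.fft.fft2(patch))) ** 2
ky, kx = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]

# Power-weighted second moments of the iso-power contours.
w = power / power.sum()
mxx = (w * kx ** 2).sum()
myy = (w * ky ** 2).sum()
mxy = (w * kx * ky).sum()
angle = 0.5 * np.arctan2(2.0 * mxy, mxx - myy)    # principal-axis angle in Fourier space

print(np.rad2deg(angle))                          # k-space elongation is rotated 90 degrees
                                                  # relative to the real-space beam axis
```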

  12. Non-random walk diffusion enhances the sink strength of semicoherent interfaces

    NASA Astrophysics Data System (ADS)

    Vattré, A.; Jourdan, T.; Ding, H.; Marinica, M.-C.; Demkowicz, M. J.

    2016-01-01

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that `super-sink' interfaces may be designed by optimizing interface stress fields. Such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage.
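
    The idea that a drift toward the interface, rather than a stronger thermodynamic pull, boosts absorption can be caricatured with a toy one-dimensional walk (purely illustrative; the paper's simulations are object kinetic Monte Carlo with first-principles and elasticity inputs):

```python
# Toy 1-D walk with an absorbing "interface" at x = 0 and a reflecting wall at x = width.
# A small hop bias toward the interface, standing in for reduced migration barriers,
# shortens absorption times, i.e. behaves like a stronger sink.
import numpy as np

def mean_absorption_steps(bias, n_walkers=1000, start=20, width=40, seed=0):
    """Mean number of hops before a defect reaches the absorbing interface at x = 0."""
    rng = np.random.default_rng(seed)
    steps = np.empty(n_walkers)
    for i in range(n_walkers):
        x, t = start, 0
        while x > 0:
            t += 1
            x += -1 if rng.random() < 0.5 + bias else 1   # hop toward (left) or away from the sink
            x = min(x, width)                             # reflecting wall opposite the interface
        steps[i] = t
    return steps.mean()

print("unbiased hops to absorption:   ", mean_absorption_steps(bias=0.0))
print("drift-biased hops to absorption:", mean_absorption_steps(bias=0.05))
```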

  13. Non-random walk diffusion enhances the sink strength of semicoherent interfaces

    DOE PAGESBeta

    Vattré, A.; Jourdan, T.; Ding, H.; Marinica, M. -C.; Demkowicz, M. J.

    2016-01-29

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that ‘super-sink’ interfaces may be designed by optimizing interface stress fields. Lastly, such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage.

  14. A nonparametric spatial model for periodontal data with non-random missingness.

    PubMed

    Reich, Brian J; Bandyopadhyay, Dipankar; Bondell, Howard D

    2013-09-01

    Periodontal disease progression is often quantified by clinical attachment level (CAL) defined as the distance down a tooth's root that is detached from the surrounding bone. Measured at 6 locations per tooth throughout the mouth (excluding the molars), it gives rise to a dependent data set-up. These data are often reduced to a one-number summary, such as the whole mouth average or the number of observations greater than a threshold, to be used as the response in a regression to identify important covariates related to the current state of a subject's periodontal health. Rather than a simple one-number summary, we set out to analyze all available CAL data for each subject, exploiting the presence of spatial dependence, non-stationarity, and non-normality. Also, many subjects have a considerable proportion of missing teeth which cannot be considered missing at random because periodontal disease is the leading cause of adult tooth loss. Under a Bayesian paradigm, we propose a nonparametric flexible spatial (joint) model of observed CAL and the locations of missing teeth via kernel convolution methods, incorporating the aforementioned features of CAL data under a unified framework. Application of this methodology to a data set recording the periodontal health of an African-American population, as well as simulation studies, reveals the gain in model fit and inference, and provides a new perspective into unraveling covariate-response relationships in the presence of complexities posed by these data. PMID:24288421

  15. Non-random pairing in American kestrels: mate choice versus intra-sexual competition

    USGS Publications Warehouse

    Bortolotti, Gary R.; Iko, William M.

    1992-01-01

    Natural selection may influence the arrangement of individuals into mated pairs through either inter-sexual (mate choice) or intra-sexual selection (competition). A study of the American kestrel, Falco sparverius, in northern Saskatchewan distinguished between these two processes using size as a measure of the bird's competitive ability, and condition (mass scaled to body size) as an index of quality. Both sexes arrive on the study area after spring migration in equal numbers and males establish territories. Males and females that moved among territories at the time of pair formation were not different in size or condition from those that did not move, suggesting that birds were not being displaced by superior competitors, and that females moved to encounter potential mates. Within mated pairs, there was no relationship between a bird's size and the condition of its mate for either sex as would be predicted if intra-sexual competition explained mating patterns. Instead, there was positive assortative mating by condition, suggesting that both sexes used quality as the criterion in choosing mates. There was no correlation between the sizes of males and females in mated pairs. Because there were no differences in size or condition of breeding and non-breeding males, factors other than physical attributes, such as prior experience with the area, may determine a male's success in obtaining a territory. Because females that did not obtain mates were in poorer condition than those that did, males may have rejected poor quality females. The results suggest that intra-sexual competition was not important for pair formation, and that kestrels chose mates on the basis of quality.

  16. Selecting a Sample for Your Experiment: A Non-Random Stratified Sampling Approach

    ERIC Educational Resources Information Center

    Tipton, Elizabeth

    2012-01-01

    The purpose of this paper is to develop a more general method for sample recruitment in experiments that is purposive (not random) and that results in a sample that is compositionally similar to the generalization population. This work builds on Tipton et al. (2011) by offering solutions to a larger class of problems than the non-overlapping…

  17. Non-random localization of ribonucleoprotein (RNP) structures within an adenovirus mRNA precursor.

    PubMed Central

    Ohlsson, R I; van Eekelen, C; Philipson, L

    1982-01-01

    Heterogeneous nuclear protein complexes (hnRNP) containing the precursor RNA from the adenovirus early region 2 were analysed to determine the specificity of protein-RNA interaction. RNA precursor sequences were present in isolated hnRNP complexes and endogenous 30S particles. Fragments at least 20-40 bases long were protected when RNase A was used to remove unprotected RNA sequences in hnRNA complexes. Similarly, around 40 bases of RNA were protected in 30S particles. These sequences represent discrete regions of the adenovirus genome. In particular, sequences complementary to the EcoRI-F fragment, encoding the first leader and the major intron for the DNA binding protein (DBP) RNA precursor, were analysed in detail. Tentatively, sequences resistant to RNase A were located in the middle of the intron and at the splice-donor junction of the first leader of the DBP precursor RNA. The same sequences were identified irrespective of whether hnRNP complexes or 30S particles were used, suggesting that 30S particles originate from hnRNP complexes. A 38,000-dalton protein appears to be in direct contact with RNA sequences complementary to the EcoRI-F fragment. PMID:6285286

  18. Non-Random Variability in Functional Composition of Coral Reef Fish Communities along an Environmental Gradient.

    PubMed

    Plass-Johnson, Jeremiah G; Taylor, Marc H; Husain, Aidah A A; Teichberg, Mirta C; Ferse, Sebastian C A

    2016-01-01

    Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased degradation of coral reefs is associated with increased variability in fish community functional composition resulting from selective impacts on specific traits, thereby affecting the functional response of these communities to increasing perturbations. PMID:27100189

  19. Non-random walk diffusion enhances the sink strength of semicoherent interfaces.

    PubMed

    Vattré, A; Jourdan, T; Ding, H; Marinica, M-C; Demkowicz, M J

    2016-01-01

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that 'super-sink' interfaces may be designed by optimizing interface stress fields. Such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage. PMID:26822632

  20. Communicating the Signal of Climate Change in The Presence of Non-Random Noise

    NASA Astrophysics Data System (ADS)

    Mann, M. E.

    2015-12-01

    The late Stephen Schneider spoke eloquently of the double ethical bind that we face: we must strive to communicate effectively but honestly. This is no simple task given the considerable "noise" generated in our public discourse by vested interests instead working to misinform the public. To do so, we must convey what is known in plainspoken jargon-free language, while acknowledging the real uncertainties that exist. Further, we must explain the implications of those uncertainties, which in many cases imply the possibility of greater, not lesser, risk. Finally, we must not be averse to discussing the policy implications of the science, lest we fail to provide our audience with critical information that can help them make informed choices about their own actions as citizens. I will use examples from my current collaboration with Washington Post editorial cartoonist Tom Toles.

  1. Non-random food-web assembly at habitat edges increases connectivity and functional redundancy

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Habitat fragmentation dramatically alters the spatial configuration of landscapes, with the creation of artificial edges affecting community structure and species interactions. Despite this, it is not known how the different food-webs in adjacent habitats merge at their boundaries, and what the cons...

  2. Non-random walk diffusion enhances the sink strength of semicoherent interfaces

    PubMed Central

    Vattré, A.; Jourdan, T.; Ding, H.; Marinica, M.-C.; Demkowicz, M. J.

    2016-01-01

    Clean, safe and economical nuclear energy requires new materials capable of withstanding severe radiation damage. One strategy of imparting radiation resistance to solids is to incorporate into them a high density of solid-phase interfaces capable of absorbing and annihilating radiation-induced defects. Here we show that elastic interactions between point defects and semicoherent interfaces lead to a marked enhancement in interface sink strength. Our conclusions stem from simulations that integrate first principles, object kinetic Monte Carlo and anisotropic elasticity calculations. Surprisingly, the enhancement in sink strength is not due primarily to increased thermodynamic driving forces, but rather to reduced defect migration barriers, which induce a preferential drift of defects towards interfaces. The sink strength enhancement is highly sensitive to the detailed character of interfacial stresses, suggesting that ‘super-sink' interfaces may be designed by optimizing interface stress fields. Such interfaces may be used to create materials with unprecedented resistance to radiation-induced damage. PMID:26822632

  3. Comparison of vaginal and abdominal hysterectomy: A prospective non-randomized trial

    PubMed Central

    Chen, Bing; Ren, Dong-Ping; Li, Jing-Xuan; Li, Chun-Dong

    2014-01-01

    Objective: To compare outcomes of vaginal and abdominal hysterectomy procedures in women with benign gynaecological diseases. Methods: This was a prospective study of outcomes of consecutive patients who underwent total vaginal hysterectomy (VH) or abdominal hysterectomy (AH) for benign gynaecological diseases. Patient characteristics before, during, and after the operations were reviewed. Patients were followed up for three months to evaluate postoperative complications. Results: This study included a total of 313 patients. 143 patients underwent AH and 170 patients underwent VH. Baseline characteristics were similar between the two groups. There were no intraoperative complications in either group. Operation time, intraoperative blood loss, first postoperative flatus time, time to out-of-bed activity, mean maximum postoperative body temperature, and duration of fever were all significantly shorter and less severe in the VH group compared with the AH group. In addition, vaginal length in the VH group was significantly shorter than in the AH group. Conclusions: Vaginal hysterectomy has advantages over AH in the treatment of benign gynaecological diseases, providing greater efficacy and safety with minimal invasiveness. PMID:25097536

  4. Non-Random Sibling Cannibalism in the Marine Gastropod Crepidula coquimbensis

    PubMed Central

    Brante, Antonio; Fernández, Miriam; Viard, Frédérique

    2013-01-01

    Sibling cannibalism is commonly observed in marine species. For instance, intrabrood cannibalism has been documented in marine gastropods with direct development, suggesting a relationship between embryo behavior and the evolution of life history strategies. However, there has been little effort to document the factors driving sibling cannibalism in marine species. The kin selection theory suggests that the level of relatedness plays an important role in cannibalism patterns. We examined Crepidula coquimbensis, a marine gastropod that broods and encloses its brooded offspring in capsules. Encapsulated embryos show sibling cannibalism and high levels of intracapsular multiple paternity. Given these features, cannibalistic behavior may be driven by kin-relatedness. To test this hypothesis, we constructed artificial aggregations of embryos to mimic three levels of relatedness: high, medium and low. For each category of aggregation, the cannibalism rate and benefits (i.e. size at hatching of surviving offspring) were estimated. In addition, at the end of embryo development, we performed parentage analyses to determine if cannibalism was associated with the relatedness between cannibal and victim embryos. Our results show that the intensity of sibling cannibalism increased in aggregations characterized by the lowest level of relatedness. There were important benefits of cannibalism in terms of hatching cannibal size. In addition, cannibalism between embryos was not random: the variation in reproductive success between males increased over the course of the experiment and the effective number of fathers decreased. Altogether, these results suggest that polyandry may play an important role in the evolution of sibling cannibalism in C. coquimbensis and that kin selection may operate during early embryonic stages in this species. PMID:23805291

  5. Non-Random Sibling Cannibalism in the Marine Gastropod Crepidula coquimbensis.

    PubMed

    Brante, Antonio; Fernández, Miriam; Viard, Frédérique

    2013-01-01

    Sibling cannibalism is commonly observed in marine species. For instance, intrabrood cannibalism has been documented in marine gastropods with direct development, suggesting a relationship between embryo behavior and the evolution of life history strategies. However, there has been little effort to document the factors driving sibling cannibalism in marine species. The kin selection theory suggests that the level of relatedness plays an important role in cannibalism patterns. We examined Crepidula coquimbensis, a marine gastropod that broods and encloses its brooded offspring in capsules. Encapsulated embryos show sibling cannibalism and high levels of intracapsular multiple paternity. Given these features, cannibalistic behavior may be driven by kin-relatedness. To test this hypothesis, we constructed artificial aggregations of embryos to mimic three levels of relatedness: high, medium and low. For each category of aggregation, the cannibalism rate and benefits (i.e. size at hatching of surviving offspring) were estimated. In addition, at the end of embryo development, we performed parentage analyses to determine if cannibalism was associated with the relatedness between cannibal and victim embryos. Our results show that the intensity of sibling cannibalism increased in aggregations characterized by the lowest level of relatedness. There were important benefits of cannibalism in terms of hatching cannibal size. In addition, cannibalism between embryos was not random: the variation in reproductive success between males increased over the course of the experiment and the effective number of fathers decreased. Altogether, these results suggest that polyandry may play an important role in the evolution of sibling cannibalism in C. coquimbensis and that kin selection may operate during early embryonic stages in this species. PMID:23805291

  6. Skin-impedance in Fabry Disease: A prospective, controlled, non-randomized clinical study

    PubMed Central

    Gupta, Surya N; Ries, Markus; Murray, Gary J; Quirk, Jane M; Brady, Roscoe O; Lidicker, Jeffrey R; Schiffmann, Raphael; Moore, David F

    2008-01-01

    Background We previously demonstrated improved sweating after enzyme replacement therapy (ERT) in Fabry disease using the thermoregulatory sweat and quantitative sudomotor axon reflex tests. Skin-impedance, a measure of skin-moisture (sweating), has been used in the clinical evaluation of burns and pressure ulcers using the portable dynamic dermal impedance monitor (DDIM) system. Methods We compared skin-impedance measurements in hemizygous patients with Fabry disease (22 after 3 years of bi-weekly ERT and 5 ERT-naive) and 22 healthy controls. Force-compensated skin-moisture values were used for statistical analysis. Outcome measures included 1) the 100th repetitive moisture reading, 2) the rate of change, 3) the average of the 60–110th readings and 4) the overall average of all readings. Results All outcome measures showed a significant difference in skin-moisture between Fabry patients and control subjects (p < 0.0001). There was no difference between Fabry patients on ERT and patients naïve to ERT. Increased skin-impedance values for the four skin-impedance outcome measures were found in a small number of dermatome test-sites two days post-enzyme infusions. Conclusion The instrument's portability, ease of use, the relatively short time required for the assessment, and the fact that the DDIM system was able to detect the difference in skin-moisture render it a useful clinical tool. PMID:18990229

  7. On the non-randomness of maximum Lempel Ziv complexity sequences of finite size

    NASA Astrophysics Data System (ADS)

    Estevez-Rams, E.; Lora Serrano, R.; Aragón Fernández, B.; Brito Reyes, I.

    2013-06-01

    Random sequences attain the highest entropy rate. The entropy rate of an ergodic source can be estimated using the Lempel-Ziv complexity measure, yet the exact entropy rate value is only reached in the infinite limit. We prove that typical random sequences of finite length fall short of the maximum Lempel-Ziv complexity, contrary to common belief. We discuss that, for a finite length, maximum Lempel-Ziv sequences can be built from a well-defined generating algorithm, which makes them of low Kolmogorov-Chaitin complexity, quite the opposite of randomness. We also discuss that the Lempel-Ziv measure is, in this sense, less general than Kolmogorov-Chaitin complexity, as it can be fooled by a sufficiently intelligent agent; this is shown to be the case for the binary expansion of certain irrational numbers. Maximum Lempel-Ziv sequences induce a normalization that gives good estimates of entropy rate for several sources, while keeping bounded values for all sequence lengths, making it an alternative to other normalization schemes in use.
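
    As a rough illustration of the kind of estimator discussed here, the sketch below (a toy under stated assumptions, not the authors' code) counts phrases in an LZ76-style exhaustive parsing and forms the standard c(n)·log2(n)/n entropy-rate estimate; the example sequences are synthetic.

```python
import math
import random

def lz76_phrase_count(s: str) -> int:
    """Number of phrases in an LZ76-style exhaustive parsing of s:
    each phrase is grown while it can still be copied from earlier text."""
    n, i, c = len(s), 0, 0
    while i < n:
        j = i + 1
        while j < n and s[i:j + 1] in s[:j]:
            j += 1
        c += 1
        i = j
    return c

def entropy_rate_estimate(s: str) -> float:
    """Crude entropy-rate estimate in bits/symbol: c(n) * log2(n) / n."""
    n = len(s)
    return lz76_phrase_count(s) * math.log2(n) / n

random.seed(0)
coin = "".join(random.choice("01") for _ in range(5000))   # i.i.d. fair-coin source
periodic = "01" * 2500                                     # zero-entropy source
print("random  :", round(entropy_rate_estimate(coin), 3))      # roughly 1 bit/symbol
print("periodic:", round(entropy_rate_estimate(periodic), 3))  # close to 0
```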

  8. Non-randomized mtDNA damage after ionizing radiation via charge transport

    NASA Astrophysics Data System (ADS)

    Zhou, Xin; Liu, Xinguo; Zhang, Xin; Zhou, Rong; He, Yang; Li, Qiang; Wang, Zhenhua; Zhang, Hong

    2012-10-01

    Although it is well known that there are mutation hot spots in mtDNA, whether there are damage hot spots remains elusive. In this study, the regional DNA damage of the mitochondrial genome after ionizing radiation was determined by real-time quantitative PCR. The mtDNA damage level was found to be dose-dependent and regionally unequal. The control region was the region most susceptible to oxidative damage. GGG, a typical hole trap during charge transport, was found to be disproportionately enriched in the control region. A total of 107 vertebrate mitochondrial genomes were then analyzed to test whether the GGG enrichment in the control region was evolutionarily conserved. Surprisingly, triple G enrichment was observed in most of the homeothermal animals, while the majority of heterothermic animals showed no triple G enrichment. These results indicate that the triple G enrichment in the control region is related to mitochondrial metabolism during evolution.
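
    The "triple G" comparison can be illustrated with a simple density calculation; the sketch below is hypothetical (placeholder sequence and control-region coordinates), not the authors' analysis pipeline.

```python
def ggg_density(seq: str) -> float:
    """Overlapping 'GGG' occurrences per kilobase of sequence."""
    count = sum(1 for i in range(len(seq) - 2) if seq[i:i + 3] == "GGG")
    return 1000.0 * count / len(seq)

# Hypothetical mitochondrial sequence and control-region coordinates (placeholders only).
genome = "ATGGGCGGGTTA" * 500
ctrl_start, ctrl_end = 0, 1200

control = genome[ctrl_start:ctrl_end]
rest = genome[:ctrl_start] + genome[ctrl_end:]
print(f"GGG per kb: control={ggg_density(control):.1f}, rest={ggg_density(rest):.1f}, "
      f"ratio={ggg_density(control) / ggg_density(rest):.2f}")
```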

  9. Non-Random Variability in Functional Composition of Coral Reef Fish Communities along an Environmental Gradient

    PubMed Central

    Plass-Johnson, Jeremiah G.; Taylor, Marc H.; Husain, Aidah A. A.; Teichberg, Mirta C.; Ferse, Sebastian C. A.

    2016-01-01

    Changes in the coral reef complex can affect predator-prey relationships, resource availability and niche utilisation in the associated fish community, which may be reflected in decreased stability of the functional traits present in a community. This is because particular traits may be favoured by a changing environment, or by habitat degradation. Furthermore, other traits can be selected against because degradation can relax the association between fishes and benthic habitat. We characterised six important ecological traits for fish species occurring at seven sites across a disturbed coral reef archipelago in Indonesia, where reefs have been exposed to eutrophication and destructive fishing practices for decades. Functional diversity was assessed using two complementary indices (FRic and RaoQ) and correlated to important environmental factors (live coral cover and rugosity, representing local reef health, and distance from shore, representing a cross-shelf environmental gradient). Indices were examined for both a change in their mean, as well as temporal (short-term; hours) and spatial (cross-shelf) variability, to assess whether fish-habitat association became relaxed along with habitat degradation. Furthermore, variability in individual traits was examined to identify the traits that are most affected by habitat change. Increases in the general reef health indicators, live coral cover and rugosity (correlated with distance from the mainland), were associated with decreases in the variability of functional diversity and with community-level changes in the abundance of several traits (notably home range size, maximum length, microalgae, detritus and small invertebrate feeding and reproductive turnover). A decrease in coral cover increased variability of RaoQ while rugosity and distance both inversely affected variability of FRic; however, averages for these indices did not reveal patterns associated with the environment. These results suggest that increased degradation of coral reefs is associated with increased variability in fish community functional composition resulting from selective impacts on specific traits, thereby affecting the functional response of these communities to increasing perturbations. PMID:27100189
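
    One of the two indices used here, Rao's quadratic entropy (RaoQ), has a simple closed form, RaoQ = Σ_ij d_ij p_i p_j. The sketch below computes it for a toy community using Euclidean distances in standardized trait space; the study presumably uses its own trait set and distance measure (for mixed traits, often a Gower distance), so this is only an illustration.

```python
import numpy as np

def rao_q(traits: np.ndarray, abundance: np.ndarray) -> float:
    """Rao's quadratic entropy: sum_ij d_ij * p_i * p_j, with d_ij a pairwise
    trait distance and p the relative abundances."""
    p = abundance / abundance.sum()
    z = (traits - traits.mean(axis=0)) / traits.std(axis=0)   # standardise traits
    d = np.sqrt(((z[:, None, :] - z[None, :, :]) ** 2).sum(axis=-1))
    return float(p @ d @ p)

# Toy community: 4 species x 3 traits, with abundances from one site.
traits = np.array([[1.0, 0.2, 3.0],
                   [1.1, 0.3, 2.8],
                   [4.0, 2.0, 0.5],
                   [3.8, 2.2, 0.7]])
abundance = np.array([10.0, 8.0, 1.0, 1.0])
print(round(rao_q(traits, abundance), 3))
```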

  10. Combining randomized and non-randomized evidence in clinical research: a review of methods and applications.

    PubMed

    Verde, Pablo E; Ohmann, Christian

    2015-03-01

    Researchers may have multiple motivations for combining disparate pieces of evidence in a meta-analysis, such as generalizing experimental results or increasing the power to detect an effect that a single study is not able to detect. However, while the main question of a meta-analysis may be simple, the structure of the evidence available to answer it may be complex. As a consequence, combining disparate pieces of evidence becomes a challenge. In this review, we cover statistical methods that have been used for the evidence synthesis of different study types with the same outcome and similar interventions. For the methodological review, a literature retrieval in the area of generalized evidence synthesis was performed, and publications were identified, assessed, grouped and classified. Furthermore, real applications of these methods in medicine were identified and described; 39 real clinical applications could be identified. A new classification of methods is provided, which takes into account the inferential approach, the bias modeling, the hierarchical structure, and the use of graphical modeling. We conclude with a discussion of the pros and cons of our approach and give some practical advice. PMID:26035469
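
    As a minimal illustration of one simple device used in generalized evidence synthesis (deliberately simpler than the bias and hierarchical models covered in the review), the sketch below pools randomized and non-randomized estimates by inverse-variance weighting while inflating the variances of non-randomized studies; the numbers are hypothetical.

```python
import numpy as np

def pooled_estimate(effects, variances, randomized, var_inflation=2.0):
    """Inverse-variance pooling in which non-randomized studies are down-weighted
    by inflating their variances; a toy stand-in for richer bias models."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    adjusted = np.where(randomized, variances, variances * var_inflation)
    w = 1.0 / adjusted
    estimate = np.sum(w * effects) / np.sum(w)
    std_error = np.sqrt(1.0 / np.sum(w))
    return estimate, std_error

# Hypothetical log-odds-ratio estimates: two RCTs followed by two observational studies.
effects = [-0.30, -0.25, -0.45, -0.50]
variances = [0.020, 0.030, 0.010, 0.015]
randomized = np.array([True, True, False, False])
print(pooled_estimate(effects, variances, randomized))
```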

  11. Brief Report: Non-Random X Chromosome Inactivation in Females with Autism

    ERIC Educational Resources Information Center

    Talebizadeh, Z.; Bittel, D. C.; Veatch, O. J.; Kibiryeva, N.; Butler, M. G.

    2005-01-01

    Autism is a heterogeneous neurodevelopmental disorder with a 3-4 times higher sex ratio in males than females. X chromosome genes may contribute to this higher sex ratio through unusual skewing of X chromosome inactivation. We studied X chromosome skewness in 30 females with classical autism and 35 similarly aged unaffected female siblings as…

  12. 24 CFR 203.423 - Distribution of distributive shares.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Distribution of distributive shares... and Distributive Shares § 203.423 Distribution of distributive shares. (a) The Commissioner may provide for the distribution to the mortgagor of a share of the participating reserve account if...

  13. 24 CFR 203.423 - Distribution of distributive shares.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 2 2011-04-01 2011-04-01 false Distribution of distributive shares... and Distributive Shares § 203.423 Distribution of distributive shares. (a) The Commissioner may provide for the distribution to the mortgagor of a share of the participating reserve account if...

  14. Spatial and Seasonal Dynamic of Abundance and Distribution of Guanaco and Livestock: Insights from Using Density Surface and Null Models

    PubMed Central

    Schroeder, Natalia M.; Matteucci, Silvia D.; Moreno, Pablo G.; Gregorio, Pablo; Ovejero, Ramiro; Taraborelli, Paula; Carmanchahi, Pablo D.

    2014-01-01

    Monitoring species abundance and distribution is a prerequisite when assessing species status and population viability, a difficult task to achieve for large herbivores at ecologically meaningful scales. Co-occurrence patterns can be used to infer mechanisms of community organization (such as biotic interactions), although this approach has traditionally been applied to binary presence/absence data. Here, we combine density surface and null models of abundance data as a novel approach to analyze the spatial and seasonal dynamics of abundance and distribution of guanacos (Lama guanicoe) and domestic herbivores in northern Patagonia, in order to visually and analytically compare the dispersion and co-occurrence pattern of ungulates. We found a marked seasonal pattern in abundance and spatial distribution of L. guanicoe. The guanaco population reached its maximum annual size and spatial dispersion in spring-summer, decreasing up to 6.5 times in size and occupying few sites of the study area in fall-winter. These results are evidence of the seasonal migration process of guanaco populations, an increasingly rare event for terrestrial mammals worldwide. The maximum number of guanacos estimated for spring (25,951) is higher than the total population size (10,000) 20 years ago, probably due to both counting methodology and population growth. Livestock were mostly distributed near human settlements, as expected from the sedentary management practiced by local people. Herbivore distribution was non-random; i.e., guanaco and livestock abundances co-varied negatively in all seasons, more than expected by chance. The degree of segregation between guanacos and small livestock (goats and sheep) was comparatively stronger than that between guanacos and large livestock, suggesting a competition mechanism between ecologically similar herbivores, although various environmental factors could also contribute to habitat segregation. The new and compelling combination of methods used here is highly useful for researchers
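
    The co-occurrence part of such an analysis can be sketched with a basic permutation null model; the abundances below are hypothetical, and the test is far simpler than the density-surface and null-model machinery used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical seasonal abundance counts at 8 survey sites.
guanaco   = np.array([40, 35,  0,  2, 55,  1, 30,  0])
livestock = np.array([ 0,  2, 60, 45,  1, 50,  3, 38])

observed = np.corrcoef(guanaco, livestock)[0, 1]

# Null model: shuffle livestock abundances across sites and recompute the correlation.
null = np.array([np.corrcoef(guanaco, rng.permutation(livestock))[0, 1]
                 for _ in range(10000)])
p_value = np.mean(null <= observed)   # chance of segregation at least this strong
print(f"observed r = {observed:.2f}, one-sided p = {p_value:.4f}")
```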

  15. Distributed Propulsion Vehicles

    NASA Technical Reports Server (NTRS)

    Kim, Hyun Dae

    2010-01-01

    Since the introduction of large jet-powered transport aircraft, the majority of these vehicles have been designed by placing thrust-generating engines either under the wings or on the fuselage to minimize aerodynamic interactions on the vehicle operation. However, advances in computational and experimental tools along with new technologies in materials, structures, and aircraft controls, etc. are enabling a high degree of integration of the airframe and propulsion system in aircraft design. The National Aeronautics and Space Administration (NASA) has been investigating a number of revolutionary distributed propulsion vehicle concepts to increase aircraft performance. The concept of distributed propulsion is to fully integrate a propulsion system within an airframe such that the aircraft takes full synergistic benefits of coupling of airframe aerodynamics and the propulsion thrust stream by distributing thrust using many propulsors on the airframe. Some of the concepts are based on the use of distributed jet flaps, distributed small multiple engines, gas-driven multi-fans, mechanically driven multifans, cross-flow fans, and electric fans driven by turboelectric generators. This paper describes some early concepts of the distributed propulsion vehicles and the current turboelectric distributed propulsion (TeDP) vehicle concepts being studied under NASA's Subsonic Fixed Wing (SFW) Project to drastically reduce aircraft-related fuel burn, emissions, and noise by the year 2030 to 2035.

  16. Distributional Learning of Appearance

    PubMed Central

    Griffin, Lewis D.; Wahab, M. Husni; Newell, Andrew J.

    2013-01-01

    Opportunities for associationist learning of word meaning, where a word is heard or read contemporaneously with information being available on its meaning, are considered too infrequent to account for the rate of language acquisition in children. It has been suggested that additional learning could occur in a distributional mode, where information is gleaned from the distributional statistics (word co-occurrence etc.) of natural language. Such statistics are relevant to meaning because of the Distributional Principle that ‘words of similar meaning tend to occur in similar contexts’. Computational systems, such as Latent Semantic Analysis, have substantiated the viability of distributional learning of word meaning, by showing that semantic similarities between words can be accurately estimated from analysis of the distributional statistics of a natural language corpus. We consider whether appearance similarities can also be learnt in a distributional mode. As grounds for such a mode we advance the Appearance Hypothesis that ‘words with referents of similar appearance tend to occur in similar contexts’. We assess the viability of such learning by looking at the performance of a computer system that interpolates, on the basis of distributional and appearance similarity, from words that it has been explicitly taught the appearance of, in order to identify and name objects that it has not been taught about. Our experiment uses a test set of 660 simple concrete nouns. Appearance information on words is modelled using sets of images of examples of the word. Distributional similarity is computed from a standard natural language corpus. Our computational results support the viability of distributional learning of appearance. PMID:23460927
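
    The Distributional Principle is commonly operationalized as vector similarity over co-occurrence counts. The sketch below (toy counts, not the study's corpus statistics) shows the basic cosine-similarity computation on which such systems typically rest.

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical co-occurrence counts of three target words with five context words.
contexts = ["eat", "red", "tree", "sweet", "drive"]   # column meaning of the vectors
counts = {
    "apple":  np.array([30, 25,  8, 20,  0]),
    "cherry": np.array([22, 30, 12, 25,  0]),
    "truck":  np.array([ 0,  5,  0,  0, 40]),
}

# Words with similar distributions (and, per the Appearance Hypothesis, often
# similar-looking referents) end up close; dissimilar words do not.
print("apple~cherry:", round(cosine(counts["apple"], counts["cherry"]), 2))
print("apple~truck :", round(cosine(counts["apple"], counts["truck"]), 2))
```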

  17. Multiple distal basin plains reveal a common distribution for large volume turbidity current recurrence intervals

    NASA Astrophysics Data System (ADS)

    Clare, M. A.; Talling, P. J.; Hunt, J.; Challenor, P. G.

    2013-12-01

    independent of the time since the last. Surprisingly, this frequency distribution suggests that non-random processes such as sea level change are not a dominant control on slide frequency. This conclusion is validated by the results of a suite of statistical analyses, including regression analysis, generalised linear modelling, rescaled range analysis and a Cox proportional hazards model. Finally, we show that the same statistical analyses indicate that there does appear to be a strong relationship between sea level and event magnitude (and hence deposit volume). This is the first study to show that turbidity currents triggered by large submarine landslides may have a common frequency distribution on a basin scale, and that this distribution is temporally random; however, sea level may exert a control on the magnitude of events that travel to distal basinal settings. These findings have important implications for understanding global sediment flux, sequence stratigraphic modelling, and assessment of geohazards posed by submarine landslides and turbidity currents.
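
    The memoryless (exponential) recurrence model invoked here is straightforward to check against data. The sketch below fits the single-parameter exponential by maximum likelihood and compares empirical and model CDFs; the intervals are synthetic stand-ins for dated turbidite recurrence times.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic recurrence intervals (kyr) standing in for dated turbidite ages.
intervals = rng.exponential(scale=10.0, size=60)

# Maximum-likelihood exponential fit: rate = 1 / mean interval.
mean_ri = intervals.mean()
print(f"mean recurrence ~ {mean_ri:.1f} kyr, rate ~ {1 / mean_ri:.3f} per kyr")

# Crude goodness-of-fit check: maximum gap between empirical and model CDFs.
x = np.sort(intervals)
ecdf = np.arange(1, x.size + 1) / x.size
model_cdf = 1.0 - np.exp(-x / mean_ri)
print("max |ECDF - exponential CDF| =", round(np.abs(ecdf - model_cdf).max(), 3))
```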

  18. Technologies for Distributed Defense

    SciTech Connect

    Seiders, Barbara AB; Rybka, Anthony J.

    2002-07-01

    For Americans, the nature of warfare changed on September 11, 2001. Our national security henceforth will require distributed defense. One extreme of distributed defense is represented by fully deployed military troops responding to a threat from a hostile nation state. At the other extreme is a country of "citizen soldiers," with families and communities securing their common defense through heightened awareness, engagement as good neighbors, and local support of and cooperation with local law enforcement, emergency and health care providers. Technologies - for information exploitation, biological agent detection, health care surveillance, and security - will be critical to ensuring success in distributed defense.

  19. Distributed analysis at LHCb

    NASA Astrophysics Data System (ADS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart; LHCb Collaboration

    2011-12-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  20. Ticks: Geographic Distribution

    MedlinePlus

    ... Atlas. Lone star tick ( Amblyomma americanum ) Where found: Widely distributed in ... is distinguished by a white dot or “lone star” on her back. Lone star tick saliva can ...

  1. DOLIB: Distributed Object Library

    SciTech Connect

    D'Azevedo, E.F.; Romine, C.H.

    1994-10-01

    This report describes the use and implementation of DOLIB (Distributed Object Library), a library of routines that emulates global or virtual shared memory on Intel multiprocessor systems. Access to a distributed global array is through explicit calls to gather and scatter. Advantages of using DOLIB include: dynamic allocation and freeing of huge (gigabyte) distributed arrays, both C and FORTRAN callable interfaces, and the ability to mix shared-memory and message-passing programming models for ease of use and optimal performance. DOLIB is independent of language and compiler extensions and requires no special operating system support. DOLIB also supports automatic caching of read-only data for high performance. The virtual shared memory support provided in DOLIB is well suited for implementing Lagrangian particle tracking techniques. We have also used DOLIB to create DONIO (Distributed Object Network I/O Library), which obtains over a 10-fold improvement in disk I/O performance on the Intel Paragon.
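
    The gather/scatter idea behind a distributed global array can be illustrated with a toy model; the class below is emphatically not DOLIB's interface (DOLIB is a C- and FORTRAN-callable library for Intel multiprocessors), just a sketch of routing reads and writes of arbitrary global indices to the processes that own them.

```python
# Toy illustration of the gather/scatter idea behind a distributed global array.
# This is NOT DOLIB's interface; "process-local" stores are simulated with dicts.

class ToyGlobalArray:
    def __init__(self, n, nprocs):
        self.block = (n + nprocs - 1) // nprocs        # contiguous block per process
        self.local = [dict() for _ in range(nprocs)]   # one local store per process

    def _owner(self, i):
        return i // self.block

    def scatter(self, indices, values):
        """Write values to arbitrary global indices, routed to their owners."""
        for i, v in zip(indices, values):
            self.local[self._owner(i)][i] = v

    def gather(self, indices):
        """Read arbitrary global indices from whichever process owns them."""
        return [self.local[self._owner(i)].get(i, 0.0) for i in indices]

ga = ToyGlobalArray(n=1000, nprocs=4)
ga.scatter([3, 400, 999], [1.5, 2.5, 3.5])
print(ga.gather([999, 3, 400]))   # [3.5, 1.5, 2.5]
```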

  2. Estimating Bias Error Distributions

    NASA Technical Reports Server (NTRS)

    Liu, Tian-Shu; Finley, Tom D.

    2001-01-01

    This paper formulates the general methodology for estimating the bias error distribution of a device in a measuring domain from less accurate measurements when a minimal number of standard values (typically two values) are available. A new perspective is that the bias error distribution can be found as a solution of an intrinsic functional equation in a domain. Based on this theory, the scaling- and translation-based methods for determining the bias error distribution are developed. These methods are virtually applicable to any device as long as the bias error distribution of the device can be sufficiently described by a power series (a polynomial) or a Fourier series in a domain. These methods have been validated through computational simulations and laboratory calibration experiments for a number of different devices.

  3. Polygamy of distributed entanglement

    NASA Astrophysics Data System (ADS)

    Buscemi, Francesco; Gour, Gilad; Kim, Jeong San

    2009-07-01

    While quantum entanglement is known to be monogamous (i.e., shared entanglement is restricted in multipartite settings), here we show that distributed entanglement (or the potential for entanglement) is by nature polygamous. By establishing the concept of one-way unlocalizable entanglement (UE) and investigating its properties, we provide a polygamy inequality of distributed entanglement in tripartite quantum systems of arbitrary dimension. We also provide a polygamy inequality in multiqubit systems and several trade-offs between UE and other correlation measures.

  4. Generic Distributed Simulation Architecture

    SciTech Connect

    Booker, C.P.

    1999-05-14

    A Generic Distributed Simulation Architecture is described that allows a simulation to be automatically distributed over a heterogeneous network of computers and executed with very little human direction. A prototype Framework is presented that implements the elements of the Architecture and demonstrates the feasibility of the concepts. It provides a basis for a future, improved Framework that will support legacy models. Because the Framework is implemented in Java, it may be installed on almost any modern computer system.

  5. Sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Kanerva, Pentti

    1988-01-01

    Theoretical models of the human brain and proposed neural-network computers are developed analytically. Chapters are devoted to the mathematical foundations, background material from computer science, the theory of idealized neurons, neurons as address decoders, and the search of memory for the best match. Consideration is given to sparse memory, distributed storage, the storage and retrieval of sequences, the construction of distributed memory, and the organization of an autonomous learning system.
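
    A minimal numerical sketch of Kanerva's sparse distributed memory (random hard locations acting as address decoders, Hamming-radius activation, counter updates, thresholded read-out) is given below; parameters are illustrative and not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, R = 256, 2000, 111   # address length, number of hard locations, activation radius

hard_addresses = rng.integers(0, 2, size=(M, N))   # fixed random hard locations
counters = np.zeros((M, N), dtype=int)             # one counter vector per location

def active(address):
    """Hard locations within Hamming distance R of the address."""
    return np.count_nonzero(hard_addresses != address, axis=1) <= R

def write(address, data):
    """Add +1/-1 votes for the data bits into every activated location."""
    counters[active(address)] += np.where(data == 1, 1, -1)

def read(address):
    """Sum counters over activated locations and threshold at zero."""
    return (counters[active(address)].sum(axis=0) > 0).astype(int)

pattern = rng.integers(0, 2, size=N)
write(pattern, pattern)                             # autoassociative store
noisy = pattern.copy()
noisy[rng.choice(N, size=20, replace=False)] ^= 1   # cue with 20 flipped bits
recovered = read(noisy)
print("bits recovered correctly:", int((recovered == pattern).sum()), "/", N)
```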

  6. Sparse distributed memory

    SciTech Connect

    Kanerva, P.

    1988-01-01

    Theoretical models of the human brain and proposed neural-network computers are developed analytically. Chapters are devoted to the mathematical foundations, background material from computer science, the theory of idealized neurons, neurons as address decoders, and the search of memory for the best match. Consideration is given to sparse memory, distributed storage, the storage and retrieval of sequences, the construction of distributed memory, and the organization of an autonomous learning system. 63 refs.

  7. Distribution system simulator

    NASA Technical Reports Server (NTRS)

    Bahrami, K. A.; Kirkham, H.; Rahman, S.

    1986-01-01

    In a series of tests performed under the Department of Energy auspices, power line carrier propagation was observed to be anomalous under certain circumstances. To investigate the cause, a distribution system simulator was constructed. The simulator was a physical simulator that accurately represented the distribution system from below power frequency to above 50 kHz. Effects such as phase-to-phase coupling and skin effect were modeled. Construction details of the simulator, and experimental results from its use are presented.

  8. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2010-09-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Analysis {11863} during Cycle 17.

  9. STIS MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2011-10-01

    The performance of MAMA microchannel plates can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the STIS MAMA Fold Analysis, Proposal 12416, during Cycle 18.

  10. Polygamy of distributed entanglement

    SciTech Connect

    Buscemi, Francesco; Gour, Gilad; Kim, Jeong San

    2009-07-15

    While quantum entanglement is known to be monogamous (i.e., shared entanglement is restricted in multipartite settings), here we show that distributed entanglement (or the potential for entanglement) is by nature polygamous. By establishing the concept of one-way unlocalizable entanglement (UE) and investigating its properties, we provide a polygamy inequality of distributed entanglement in tripartite quantum systems of arbitrary dimension. We also provide a polygamy inequality in multiqubit systems and several trade-offs between UE and other correlation measures.

  11. Distributed generation hits market

    SciTech Connect

    1997-10-01

    The pace at which vendors are developing and marketing gas turbines and reciprocating engines for small-scale applications may signal the widespread growth of distributed generation. Loosely defined to refer to applications in which power generation equipment is located close to end users who have near-term power capacity needs, distributed generation encompasses a broad range of technologies and load requirements. Disagreement is inevitable, but many industry observers associate distributed generation with applications anywhere from 25 kW to 25 MW. Ten years ago, distributed generation users only represented about 2% of the world market. Today, that figure has increased to about 4 or 5%, and probably could settle in the 20% range within a 3-to-5-year period, according to Michael Jones, San Diego, Calif.-based Solar Turbines Inc. power generation marketing manager. The US Energy Information Administration predicts about 175 GW of generation capacity will be added domestically by 2010. If 20% comes from smaller plants, distributed generation could account for about 35 GW. Even with more competition, it's highly unlikely distributed generation will totally replace current market structures and central stations. Distributed generation may be best suited for making market inroads when and where central systems need upgrading, and should prove its worth when the system can't handle peak demands. Typical applications include small reciprocating engine generators at remote customer sites or larger gas turbines to boost the grid. Additional market opportunities include standby capacity, peak shaving, power quality, cogeneration and capacity rental for immediate demand requirements. Integration of distributed generation systems--using gas-fueled engines, gas-fired combustion engines and fuel cells--can upgrade power quality for customers and reduce operating costs for electric utilities.

  12. Distribution of perfusion.

    PubMed

    Glenny, Robb; Robertson, H Thomas

    2011-01-01

    Local driving pressures and resistances within the pulmonary vascular tree determine the distribution of perfusion in the lung. Unlike other organs, these local determinants are significantly influenced by regional hydrostatic and alveolar pressures. Those effects on blood flow distribution are further magnified by the large vertical height of the human lung and the relatively low intravascular pressures in the pulmonary circulation. While the distribution of perfusion is largely due to passive determinants such as vascular geometry and hydrostatic pressures, active mechanisms such as vasoconstriction induced by local hypoxia can also redistribute blood flow. This chapter reviews the determinants of regional lung perfusion with a focus on vascular tree geometry, vertical gradients induced by gravity, the interactions between vascular and surrounding alveolar pressures, and hypoxic pulmonary vasoconstriction. While each of these determinants of perfusion distribution can be examined in isolation, the distribution of blood flow is dynamically determined and each component interacts with the others so that a change in one region of the lung influences the distribution of blood flow in other lung regions. PMID:23737171

  13. Break Point Distribution on Chromosome 3 of Human Epithelial Cells exposed to Gamma Rays, Neutrons and Fe Ions

    NASA Technical Reports Server (NTRS)

    Hada, M.; Saganti, P. B.; Gersey, B.; Wilkins, R.; Cucinotta, F. A.; Wu, H.

    2007-01-01

    Most of the reported studies of break point distribution on the damaged chromosomes from radiation exposure were carried out with the G-banding technique or determined based on the relative length of the broken chromosomal fragments. However, these techniques lack accuracy in comparison with the later-developed multicolor banding in situ hybridization (mBAND) technique that is generally used for analysis of intrachromosomal aberrations such as inversions. Using mBAND, we studied chromosome aberrations in human epithelial cells exposed in vitro to either low or high dose rate gamma rays in Houston, low dose rate secondary neutrons at Los Alamos National Laboratory and high dose rate 600 MeV/u Fe ions at NASA Space Radiation Laboratory. Detailed analysis of the inversion type revealed that all of the three radiation types induced a low incidence of simple inversions. Half of the inversions observed after neutron or Fe ion exposure, and the majority of inversions in gamma-irradiated samples were accompanied by other types of intrachromosomal aberrations. In addition, neutrons and Fe ions induced a significant fraction of inversions that involved complex rearrangements of both inter- and intrachromosome exchanges. We further compared the distribution of break points on chromosome 3 for the three radiation types. The break points were found to be randomly distributed on chromosome 3 after neutron or Fe ion exposure, whereas a non-random distribution with clustered break points was observed for gamma-rays. The break point distribution may serve as a potential fingerprint of high-LET radiation exposure.

  14. 24 CFR 213.278 - Distribution of distributive share.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Distribution of distributive share. 213.278 Section 213.278 Housing and Urban Development Regulations Relating to Housing and Urban... Management Housing Insurance and Distributive Shares § 213.278 Distribution of distributive share. When...

  15. 24 CFR 213.278 - Distribution of distributive share.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 2 2011-04-01 2011-04-01 false Distribution of distributive share. 213.278 Section 213.278 Housing and Urban Development Regulations Relating to Housing and Urban... Management Housing Insurance and Distributive Shares § 213.278 Distribution of distributive share. When...

  16. The isotopic distribution conundrum.

    PubMed

    Valkenborg, Dirk; Mertens, Inge; Lemière, Filip; Witters, Erwin; Burzykowski, Tomasz

    2012-01-01

    Although access to high-resolution mass spectrometry (MS), especially in the field of biomolecular MS, is becoming readily available due to recent advances in MS technology, the accompanied information on isotopic distribution in high-resolution spectra is not used at its full potential, mainly because of lack of knowledge and/or awareness. In this review, we give an insight into the practical problems related to calculating the isotopic distribution for large biomolecules, and present an overview of methods for the calculation of the isotopic distribution. We discuss the key events that triggered the development of various algorithms and explain the rationale of how and why the various isotopic-distribution calculations were performed. The review is focused around the developmental stages as briefly outlined below, starting with the first observation of an isotopic distribution. The observations of Beynon in the field of organic MS that chlorine appeared in a mass spectrum as two variants with odds 3:1 lie at the basis of the first wave of algorithms for the calculation of the isotopic distribution, based on the atomic composition of a molecule. From here on, we explain why more complex biomolecules such as peptides exhibit a highly complex isotope pattern when assayed by MS, and we discuss how combinatorial difficulties complicate the calculation of the isotopic distribution on computers. For this purpose, we highlight three methods, which were introduced in the 1980s. These are the stepwise procedure introduced by Kubinyi, the polynomial expansion from Brownawell and Fillippo, and the multinomial expansion from Yergey. The next development was instigated by Rockwood, who suggested to decompose the isotopic distribution in terms of their nucleon count instead of the exact mass. In this respect, we could claim that the term "aggregated" isotopic distribution is more appropriate. Due to the simplification of the isotopic distribution to its aggregated counterpart
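
    The "polynomial expansion" idea for the aggregated (nucleon-count) isotopic distribution amounts to convolving per-atom isotope abundance vectors. The sketch below does this for an illustrative elemental formula; the abundance values are approximate round numbers, not authoritative constants, and the formula is made up.

```python
import numpy as np

# Approximate natural isotope abundances, indexed by extra neutrons (0, +1, +2, ...).
# Illustrative round numbers, not authoritative constants.
ISOTOPES = {
    "C": [0.9893, 0.0107],
    "H": [0.99988, 0.00012],
    "N": [0.99636, 0.00364],
    "O": [0.99757, 0.00038, 0.00205],
    "S": [0.9499, 0.0075, 0.0425, 0.0, 0.0001],
}

def aggregated_distribution(formula: dict) -> np.ndarray:
    """Convolve per-atom isotope vectors ('polynomial expansion') to obtain the
    aggregated isotopic distribution, indexed by nucleon shift from the monoisotopic peak."""
    dist = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):
            dist = np.convolve(dist, ISOTOPES[element])
    return dist / dist.sum()

# Roughly the elemental composition of a small peptide (illustrative formula only).
peptide = {"C": 50, "H": 80, "N": 14, "O": 16, "S": 1}
for k, p in enumerate(aggregated_distribution(peptide)[:6]):
    print(f"M+{k}: {p:.4f}")
```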

  17. Vaginal drug distribution modeling.

    PubMed

    Katz, David F; Yuan, Andrew; Gao, Yajing

    2015-09-15

    This review presents and applies fundamental mass transport theory describing the diffusion and convection driven mass transport of drugs to the vaginal environment. It considers sources of variability in the predictions of the models. It illustrates use of model predictions of microbicide drug concentration distribution (pharmacokinetics) to gain insights about drug effectiveness in preventing HIV infection (pharmacodynamics). The modeling compares vaginal drug distributions after different gel dosage regimens, and it evaluates consequences of changes in gel viscosity due to aging. It compares vaginal mucosal concentration distributions of drugs delivered by gels vs. intravaginal rings. Finally, the modeling approach is used to compare vaginal drug distributions across species with differing vaginal dimensions. Deterministic models of drug mass transport into and throughout the vaginal environment can provide critical insights about the mechanisms and determinants of such transport. This knowledge, and the methodology that obtains it, can be applied and translated to multiple applications, involving the scientific underpinnings of vaginal drug distribution and the performance evaluation and design of products, and their dosage regimens, that achieve it. PMID:25933938
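
    The deterministic transport models referred to here are typically built on a convection-diffusion-loss balance; a generic one-dimensional form (an assumption for illustration, not the specific model of this review) is:

```latex
\frac{\partial C}{\partial t}
  \;=\; D\,\frac{\partial^{2} C}{\partial x^{2}}
  \;-\; v\,\frac{\partial C}{\partial x}
  \;-\; k\,C
```

    where C is the drug concentration, D an effective diffusion coefficient, v a convective velocity (for example, gel spreading or fluid flow), and k a first-order loss term (dilution, metabolism, or tissue uptake).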

  18. Distributed replica dynamics

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Chill, Samuel T.; Henkelman, Graeme

    2015-11-01

    A distributed replica dynamics (DRD) method is proposed to calculate rare-event molecular dynamics using distributed computational resources. Similar to Voter's parallel replica dynamics (PRD) method, the dynamics of independent replicas of the system are calculated on different computational clients. In DRD, each replica runs molecular dynamics from an initial state for a fixed simulation time and then reports information about the trajectory back to the server. A simulation clock on the server accumulates the simulation time of each replica until one reports a transition to a new state. Subsequent calculations are initiated from within this new state and the process is repeated to follow the state-to-state evolution of the system. DRD is designed to work with asynchronous and distributed computing resources in which the clients may not be able to communicate with each other. Additionally, clients can be added or removed from the simulation at any point in the calculation. Even with heterogeneous computing clients, we prove that the DRD method reproduces the correct probability distribution of escape times. We also show this correspondence numerically; molecular dynamics simulations of Al(100) adatom diffusion using PRD and DRD give consistent exponential distributions of escape times. Finally, we discuss guidelines for choosing the optimal number of replicas and replica trajectory length for the DRD method.
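
    The server-side bookkeeping described here can be mimicked with a toy memoryless state; the sketch below accumulates replica segments on a simulation clock until one segment contains a transition (replicas are polled sequentially purely for simplicity, and the rate, segment length, and replica count are made-up parameters) and confirms that the resulting escape times keep the expected exponential mean.

```python
import numpy as np

rng = np.random.default_rng(42)

RATE = 0.02      # escape rate of a single (memoryless) state, per unit simulation time
SEGMENT = 5.0    # simulation time each replica runs before reporting back
REPLICAS = 16

def escape_time_drd():
    """Accumulate replica segments on a server clock until one segment holds a transition."""
    clock = 0.0
    while True:
        for _ in range(REPLICAS):          # replicas polled sequentially for simplicity
            t = rng.exponential(1.0 / RATE)
            if t <= SEGMENT:
                return clock + t           # banked time plus time within this segment
            clock += SEGMENT               # no transition: bank the whole segment

samples = np.array([escape_time_drd() for _ in range(5000)])
print("mean escape time:", round(samples.mean(), 1), "(expected 1/RATE =", 1 / RATE, ")")
print("std of escape times:", round(samples.std(), 1), "(exponential => std ~ mean)")
```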

  19. Distributed replica dynamics.

    PubMed

    Zhang, Liang; Chill, Samuel T; Henkelman, Graeme

    2015-11-01

    A distributed replica dynamics (DRD) method is proposed to calculate rare-event molecular dynamics using distributed computational resources. Similar to Voter's parallel replica dynamics (PRD) method, the dynamics of independent replicas of the system are calculated on different computational clients. In DRD, each replica runs molecular dynamics from an initial state for a fixed simulation time and then reports information about the trajectory back to the server. A simulation clock on the server accumulates the simulation time of each replica until one reports a transition to a new state. Subsequent calculations are initiated from within this new state and the process is repeated to follow the state-to-state evolution of the system. DRD is designed to work with asynchronous and distributed computing resources in which the clients may not be able to communicate with each other. Additionally, clients can be added or removed from the simulation at any point in the calculation. Even with heterogeneous computing clients, we prove that the DRD method reproduces the correct probability distribution of escape times. We also show this correspondence numerically; molecular dynamics simulations of Al(100) adatom diffusion using PRD and DRD give consistent exponential distributions of escape times. Finally, we discuss guidelines for choosing the optimal number of replicas and replica trajectory length for the DRD method. PMID:26547163

  20. Sparse distributed memory overview

    NASA Technical Reports Server (NTRS)

    Raugh, Mike

    1990-01-01

    The Sparse Distributed Memory (SDM) project is investigating the theory and applications of a massively parallel computing architecture, called sparse distributed memory, that will support the storage and retrieval of sensory and motor patterns characteristic of autonomous systems. The immediate objectives of the project are centered on studies of the memory itself and on the use of the memory to solve problems in speech, vision, and robotics. Investigation of methods for encoding sensory data is an important part of the research. Examples of NASA missions that may benefit from this work are Space Station, planetary rovers, and solar exploration. Sparse distributed memory offers promising technology for systems that must learn through experience and be capable of adapting to new circumstances, and for operating any large complex system requiring automatic monitoring and control. Sparse distributed memory is a massively parallel architecture motivated by efforts to understand how the human brain works. Sparse distributed memory is an associative memory, able to retrieve information from cues that only partially match patterns stored in the memory. It is able to store long temporal sequences derived from the behavior of a complex system, such as progressive records of the system's sensory data and correlated records of the system's motor controls.

  1. Distributed Wind Market Applications

    SciTech Connect

    Forsyth, T.; Baring-Gould, I.

    2007-11-01

    Distributed wind energy systems provide clean, renewable power for on-site use and help relieve pressure on the power grid while providing jobs and contributing to energy security for homes, farms, schools, factories, private and public facilities, distribution utilities, and remote locations. America pioneered small wind technology in the 1920s, and it is the only renewable energy industry segment that the United States still dominates in technology, manufacturing, and world market share. The series of analyses covered by this report was conducted to assess some of the most likely ways that advanced wind turbines could be utilized apart from large, central station power systems. Each chapter represents a final report on specific market segments written by leading experts in this field. As such, this document does not speak with one voice but is rather a compendium of different perspectives documented by a variety of people in the U.S. distributed wind field.

  2. Distributed data transmitter

    DOEpatents

    Brown, Kenneth Dewayne; Dunson, David

    2006-08-08

    A distributed data transmitter (DTXR) which is an adaptive data communication microwave transmitter having a distributable architecture of modular components, and which incorporates both digital and microwave technology to provide substantial improvements in physical and operational flexibility. The DTXR has application in, for example, remote data acquisition involving the transmission of telemetry data across a wireless link, wherein the DTXR is integrated into and utilizes available space within a system (e.g., a flight vehicle). In a preferred embodiment, the DTXR broadly comprises a plurality of input interfaces; a data modulator; a power amplifier; and a power converter, all of which are modularly separate and distinct so as to be substantially independently physically distributable and positionable throughout the system wherever sufficient space is available.

  3. Distributed data transmitter

    DOEpatents

    Brown, Kenneth Dewayne; Dunson, David

    2008-06-03

    A distributed data transmitter (DTXR) which is an adaptive data communication microwave transmitter having a distributable architecture of modular components, and which incorporates both digital and microwave technology to provide substantial improvements in physical and operational flexibility. The DTXR has application in, for example, remote data acquisition involving the transmission of telemetry data across a wireless link, wherein the DTXR is integrated into and utilizes available space within a system (e.g., a flight vehicle). In a preferred embodiment, the DTXR broadly comprises a plurality of input interfaces; a data modulator; a power amplifier; and a power converter, all of which are modularly separate and distinct so as to be substantially independently physically distributable and positionable throughout the system wherever sufficient space is available.

  4. On exchangeable multinomial distributions

    PubMed Central

    George, E. Olusegun; Cheon, Kyeongmi; Yuan, Yilian; Szabo, Aniko

    2016-01-01

    We derive an expression for the joint distribution of exchangeable multinomial random variables, which generalizes the multinomial distribution based on independent trials while retaining some of its important properties. Unlike de Finetti's representation theorem for a binary sequence, the exchangeable multinomial distribution derived here does not require that the finite set of random variables under consideration be a subset of an infinite sequence. Using expressions for higher moments and correlations, we show that the covariance matrix for exchangeable multinomial data has a different form from that usually assumed in the literature, and we analyse data from developmental toxicology studies. The proposed analyses have been implemented in R and are available on CRAN in the CorrBin package.
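
    For intuition only (this is not the construction derived in the paper, whose analyses live in the CorrBin R package), one standard way to obtain exchangeable but dependent categorical trials is to mix i.i.d. trials over a Dirichlet-distributed probability vector; the sketch below does this and shows the resulting positive pairwise correlation.

```python
import numpy as np

rng = np.random.default_rng(3)

def exchangeable_trials(n_trials, alpha, n_clusters):
    """Exchangeable (but dependent) categorical trials via a Dirichlet mixture:
    draw one latent probability vector per cluster, then i.i.d. trials given it."""
    latent = rng.dirichlet(alpha, size=n_clusters)
    return np.array([rng.choice(len(alpha), size=n_trials, p=p) for p in latent])

draws = exchangeable_trials(n_trials=10, alpha=[2.0, 1.0, 1.0], n_clusters=20000)
indicator = (draws == 0).astype(float)          # "trial falls in category 0"
corr = np.corrcoef(indicator[:, 0], indicator[:, 1])[0, 1]
print("pairwise correlation of category-0 indicators:", round(corr, 3))
# Independent trials would give ~0; the positive value reflects exchangeable dependence
# (theoretically 1 / (1 + sum(alpha)) = 0.2 for this particular mixture).
```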

  5. Partonic Transverse Momentum Distributions

    SciTech Connect

    Rossi, Patrizia

    2010-08-04

    In recent years parton distributions have been generalized to account also for transverse degrees of freedom, and new sets of more general distributions, Transverse Momentum Dependent (TMD) parton distributions and fragmentation functions, were introduced. Different experiments worldwide (HERMES, COMPASS, CLAS, JLab-Hall A) have measurements of TMDs in semi-inclusive DIS processes as one of their main focuses of research. TMD studies are also an important part of the present and future Drell-Yan experiments at RHIC, J-PARC and GSI, respectively. Studies of TMDs are also one of the main driving forces of the Jefferson Lab (JLab) 12 GeV upgrade project. Progress in phenomenology and theory is flourishing as well. In this talk an overview of the latest developments in studies of TMDs will be given and newly released results, ongoing activities, as well as planned near-term and future measurements will be discussed.

  6. Mars elevation distribution

    NASA Technical Reports Server (NTRS)

    Wu, Sherman S. C.; Howington-Kraus, Annie E.; Ablin, Karyn K.

    1991-01-01

    A Digital Terrain Model (DTM) of Mars was derived with both Mercator and Sinusoidal Equal-Area projections from the global topographic map of Mars (scale 1:15 million, contour interval 1 km). Elevations on the map are referred to Mars' topographic datum that is defined by the gravity field at a 6.1-millibar pressure surface with respect to the center of mass of Mars. The DTM has a resolution at the equator of 1/59.226 degrees (exactly 1 km) per pixel. By using the DTM, the volumetric distribution of Mars topography above and below the datum has previously been calculated. Three types of elevation distributions of Mars' topography were calculated from the same DTM: (1) the frequency distribution of elevations at the pixel resolution; (2) average elevations in increments of 6 degrees in both longitude and latitude; and (3) average elevations in 36 separate blocks, each covering 30 degrees of latitude and 60 degrees of longitude.

  7. Discrete Pearson distributions

    SciTech Connect

    Bowman, K.O.; Shenton, L.R.; Kastenbaum, M.A.

    1991-11-01

    These distributions are generated by a first-order recursive scheme which equates the ratio of successive probabilities to the ratio of two corresponding quadratics. The use of a linearized form of this model will produce equations in the unknowns matched by an appropriate set of moments (assumed to exist). Given the moments, we may find valid solutions. There are two cases: (1) distributions defined on the non-negative integers (finite or infinite) and (2) distributions defined on negative integers as well. For (1), given the first four moments, it is possible to set this up as equations of finite or infinite degree in the probability of a zero occurrence, the sth component being a product of s ratios of linear forms in this probability in general. For (2) the equation for the zero probability is purely linear but may involve slowly converging series; here a particular case is the discrete normal. Regions of validity are being studied. 11 refs.
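
    The recursive scheme can be sketched directly: choose two quadratics, build probabilities from the ratio of successive terms, and normalise. The quadratics below are illustrative (they reproduce a Poisson law), not coefficients matched to moments as in the report.

```python
import numpy as np

def discrete_pearson_pmf(a, b, n_max=200):
    """Build probabilities from the first-order recursion
    p(x+1)/p(x) = Q1(x)/Q2(x), with Q1, Q2 quadratics given by coefficient
    triples a and b, then normalise."""
    q = lambda c, x: c[0] + c[1] * x + c[2] * x * x
    p = [1.0]
    for x in range(n_max):
        ratio = q(a, x) / q(b, x)
        if ratio <= 0:                 # stop where the ratio ceases to be positive
            break
        p.append(p[-1] * ratio)
    p = np.array(p)
    return p / p.sum()

# Example: Q1(x) = 2 and Q2(x) = x + 1 reproduce a Poisson(2) law,
# since a Poisson(lambda) pmf satisfies p(x+1)/p(x) = lambda / (x + 1).
pmf = discrete_pearson_pmf(a=(2.0, 0.0, 0.0), b=(1.0, 1.0, 0.0))
print("mean =", round(float((np.arange(pmf.size) * pmf).sum()), 3))   # ~2
```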

  8. High voltage distributed amplifier

    NASA Astrophysics Data System (ADS)

    Willems, D.; Bahl, I.; Wirsing, K.

    1991-12-01

    A high-voltage distributed amplifier implemented in GaAs MMIC technology has demonstrated good circuit performance over at least a two-octave bandwidth. This technique allows for very broadband amplifier operation with good efficiency in satellite, active-aperture radar, and battery-powered systems. Also, by increasing the number of FETs, the amplifier can be designed to match different voltage rails. The circuit does require a small amount of additional chip size over conventional distributed amplifiers but does not require power dividers or additional matching networks. This circuit configuration should find great use in broadband power amplifier design.

  9. Distributed Sensors Simulator

    SciTech Connect

    Brennan, Sean M.

    2003-08-30

    The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for distributed sensor networks without the commitment inherent in using hardware. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness, and scaling issues; explore arbitrary algorithms for DSNs; and is particularly useful as a proof-of-concept tool. The user provides data on node location and specifications, defines event phenomena, and plugs in the application(s) to run. DSS in turn provides the virtual environmental embedding — but exposed to the user like no true embedding could ever be.

  10. A distributed Tier-1

    NASA Astrophysics Data System (ADS)

    Fischer, L.; Grønager, M.; Kleist, J.; Smirnova, O.

    2008-07-01

    The Tier-1 facility operated by the Nordic DataGrid Facility (NDGF) differs significantly from other Tier-1s in several aspects: firstly, it is not located at one or a few premises, but instead is distributed throughout the Nordic countries; secondly, it is not under the governance of a single organization but instead is a meta-center built of resources under the control of a number of different national organizations. We present some technical implications of these aspects as well as the high-level design of this distributed Tier-1. The focus will be on computing services, storage and monitoring.

  11. Input distributions for VISA

    SciTech Connect

    Liebetrau, A.M.

    1983-10-01

    Work is underway at Pacific Northwest Laboratory (PNL) to improve the probabilistic analysis used to model pressurized thermal shock (PTS) incidents in reactor pressure vessels, and, further, to incorporate these improvements into the existing Vessel Integrity Simulation Analysis (VISA) code. Two topics related to work on input distributions in VISA are discussed in this paper. The first involves the treatment of flaw size distributions and the second concerns errors in the parameters in the (Guthrie) equation which is used to compute ΔRT_NDT, the shift in reference temperature for nil ductility transition.

  12. Distributed Sensors Simulator

    Energy Science and Technology Software Center (ESTSC)

    2003-08-30

    The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for distributed sensor networks without the commitment inherent in using hardware. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness, and scaling issues; explore arbitrary algorithms for DSNs; and is particularly useful as a proof-of-concept tool. The user provides data on node location and specifications, defines event phenomena, and plugs in the application(s) to run. DSS in turn provides the virtual environmental embedding — but exposed to the user like no true embedding could ever be.

  13. THERMAL DISTRIBUTION SYSTEM EXPERIMENT

    SciTech Connect

    KRAJEWSKI,R.F.; ANDREWS,J.W.; WEI,G.

    1999-09-01

    A laboratory experiment has been conducted which tests for the effects of distribution system purging on system Delivery Effectiveness (DE) as defined in ASHRAE 152P. The experiment is described in its configuration, instrumentation, and data acquisition system. Data gathered in the experiment is given and discussed. The results show that purging of the distribution system alone does not offer any improvement of the system DE. Additional supporting tests were conducted regarding experimental simulations of buffer zones and bare pipe and are also discussed.

  14. Galactic distribution of pulsars

    NASA Technical Reports Server (NTRS)

    Seiradakis, J. H.

    1977-01-01

    The density distributions of pulsars in luminosity, period, Z-distance, and galactocentric distance were derived, using a uniform sample of pulsars detected during a 408-MHz pulsar survey at Jodrell Bank. There are indications of a fine-scale structure in the spatial distributions and evidence that there is a general correlation with other galactic populations and the overall spiral structure. The electron layer in our galaxy is shown to be wider than the pulsar layer and uniform on a large scale. The number of pulsars in the galaxy has been estimated and used to derive the pulsar birthrate.

  15. Galactic distribution of pulsars

    NASA Technical Reports Server (NTRS)

    Seiradakis, J. H.

    1976-01-01

    The density distributions of pulsars in luminosity, period, Z-distance, and galactocentric distance were derived using a uniform sample of pulsars detected during a 408 MHz pulsar survey at Jodrell Bank. There are indications of a fine scale structure in the spatial distribution and evidence that there is a general correlation with other galactic populations and the overall spiral structure. The electron layer in the galaxy is shown to be wider than the pulsar layer and uniform on a large scale. The number of pulsars in the galaxy was estimated and used to derive the pulsar birthrate.

  16. Proximity Within Interphase Chromosome Contributes to the Breakpoint Distribution in Radiation-Induced Intrachromosomal Exchanges

    NASA Technical Reports Server (NTRS)

    Zhang, Ye; Uhlemeyer, Jimmy; Hada, Megumi; Asaithamby, A.; Chen, David J.; Wu, Honglu

    2015-01-01

    Previously, we reported that breaks involved in chromosome aberrations were clustered in several regions of chromosome 3 in human mammary epithelial cells after exposures to either low- or high-LET radiation. In particular, breaks in certain regions of the chromosome tended to rejoin with each other to form an intrachromosome exchange event. This study tests the hypothesis that proximity within a single chromosome in interphase cell nuclei contributes to the distribution of radiation-induced chromosome breaks. Chromosome 3 in G1 human mammary epithelial cells was hybridized with the multicolor banding in situ hybridization (mBAND) probes that distinguish the chromosome in six differently colored regions, and the location of these regions was measured with a laser confocal microscope. Results of the study indicated that, on a multi-mega base pair scale of the DNA, the arrangement of chromatin was non-random. Both telomere regions tended to be located towards the exterior of the chromosome domain, whereas the centromere region towards the interior. In addition, the interior of the chromosome domain was preferentially occupied by the p-arm of the chromatin, which is consistent with our previous finding of intrachromosome exchanges involving breaks on the p-arm and in the centromere region of chromosome 3. Other factors, such as the fragile sites in the 3p21 band and gene regulation, may also contribute to the breakpoint distribution in radiation-induced chromosome aberrations. Further investigations suggest that the 3D chromosome folding is cell type and culture condition dependent.

  17. Program for standard statistical distributions

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1972-01-01

    Development of procedure to describe frequency distributions involved in statistical theory is discussed. Representation of frequency distributions by first order differential equation is presented. Classification of various types of distributions based on Pearson parameters is analyzed.

  18. Relationship between the distribution pattern of right whales, Eubalaena glacialis, and satellite-derived sea surface thermal structure in the Great South Channel

    NASA Astrophysics Data System (ADS)

    Brown, C. W.; Winn, H. E.

    1989-03-01

    Right whales (Eubalaena glacialis) were sighted during random aerial transects over the Great South Channel region located between Georges Bank and Cape Cod in April to July in 1979-1981, 1984 and 1985. Sightings were superimposed on satellite AVHRR (Advanced Very High Resolution Radiometer) imagery of the same or approximate date to describe the whale's distribution pattern in relation to the thermal front, the 100 m isobath and sea surface temperature (SST) characteristics within the region. The majority of whales occurred north of the thermal front in the warmer, more stratified waters of the Gulf of Maine. Within the Great South Channel region, whales are not limited to a given surface isotherm. As would be expected from a stratified water mass, SST at whale sightings in the Gulf of Maine did not differ significantly from the median SST of those waters, and the horizontal SST gradient at whale sightings was not higher than background values. The SST did differ significantly from the median SST of the entire area sampled. Whales are distributed non-randomly about, and are in close proximity to, the 100 m isobath and the thermal front. The results indicate that whales were not found in areas where surface signatures of upwelling are present at spatial scales greater than 1 km². The proximity of whale sightings to the isobath and the front suggests that frontal features and/or associated phenomena play an important role in the distribution pattern of right whales in the Great South Channel region.

  19. Distributed Continuous Registration.

    ERIC Educational Resources Information Center

    Myers, Donald L.

    1981-01-01

    The development, implementation, and features of Northern Colorado's continuous registration system are described. The system is an online distributed processing system, written in COBOL for an IBM Series I under the CPS operating system. Course selection, permit to enroll, and drop/add forms are provided. (Author/MLW)

  20. Enabling distributed petascale science.

    SciTech Connect

    Baranovski, A.; Bharathi, S.; Bresnahan, J.; chervenak, A.; Foster, I.; Fraser, D.; Freeman, T.; Gunter, D.; Jackson, K.; Keahey, K.; Kesselman, C.; Konerding, D. E.; Leroy, N.; Link, M.; Livny, M.; Miller, N.; Miller, R.; Oleynik, G.; Pearlman, L.; Schopf, J. M.; Schuler, R.; Tierney, B.; Mathematics and Computer Science; FNL; Univ. of Southern California; Univ. of Chicago; LBNL; Univ. of Wisconsin

    2007-01-01

    Petascale science is an end-to-end endeavour, involving not only the creation of massive datasets at supercomputers or experimental facilities, but the subsequent analysis of that data by a user community that may be distributed across many laboratories and universities. The new SciDAC Center for Enabling Distributed Petascale Science (CEDPS) is developing tools to support this end-to-end process. These tools include data placement services for the reliable, high-performance, secure, and policy-driven placement of data within a distributed science environment; tools and techniques for the construction, operation, and provisioning of scalable science services; and tools for the detection and diagnosis of failures in end-to-end data placement and distributed application hosting configurations. In each area, we build on a strong base of existing technology and have made useful progress in the first year of the project. For example, we have recently achieved order-of-magnitude improvements in transfer times (for lots of small files) and implemented asynchronous data staging capabilities; demonstrated dynamic deployment of complex application stacks for the STAR experiment; and designed and deployed end-to-end troubleshooting services. We look forward to working with SciDAC application and technology projects to realize the promise of petascale science.

  1. Aerosol distribution apparatus

    DOEpatents

    Hanson, W.D.

    An apparatus for uniformly distributing an aerosol to a plurality of filters mounted in a plenum, wherein the aerosol and air are forced through a manifold system by means of a jet pump and released into the plenum through orifices in the manifold. The apparatus allows for the simultaneous aerosol-testing of all the filters in the plenum.

  2. Distribution of Childrearing Demands.

    ERIC Educational Resources Information Center

    Zimmerman, Judith D.; And Others

    The tools of economic analysis were applied to demographic data in order to develop a social indicator measuring the extent of inequality in the distribution of childrearing responsibility in households from 1940 to 1980. With data drawn from the Current Population Survey of the Bureau of the Census, a "demand intensity" measure was developed.…

  3. Age Distribution of Groundwater

    NASA Astrophysics Data System (ADS)

    Morgenstern, U.; Daughney, C. J.

    2012-04-01

    Groundwater at the discharge point comprises a mixture of water from different flow lines with different travel times and therefore has no discrete age but an age distribution. The age distribution can be assessed by measuring how a pulse-shaped tracer moves through the groundwater system. Detection of the time delay and the dispersion of the peak in the groundwater, compared to the tracer input, reveals the mean residence time and the mixing parameter. Tritium from nuclear weapons testing in the early 1960s resulted in a peak-shaped tritium input to the whole hydrologic system on Earth. Tritium is the ideal tracer for groundwater because it is an isotope of hydrogen and therefore part of the water molecule. Tritium time series data that encompass the passage of the bomb tritium pulse through the groundwater system in all common hydrogeologic situations in New Zealand demonstrate a semi-systematic pattern between age distribution parameters and hydrologic situation. The data in general indicate a high fraction of mixing, but in some cases also indicate a high degree of piston flow. We will show that, 45 years after the peak of the bomb tritium, it is still possible to accurately assess the parameters of age distributions by measuring the tail of the bomb tritium.
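
    For intuition only, the sketch below shows the lumped-parameter convolution that underlies this kind of tracer interpretation: a purely synthetic, assumed tritium input series is convolved with an exponential age distribution and tritium decay to produce the concentration expected at the discharge point. The input values and the mean residence time are illustrative assumptions, not data from the study.

      import numpy as np

      HALF_LIFE = 12.32                      # tritium half-life in years
      LAMBDA = np.log(2.0) / HALF_LIFE       # decay constant

      def simulated_output(c_in, mean_age):
          """Convolve a yearly tracer input series with an exponential age
          distribution g(tau) = exp(-tau/T)/T, including radioactive decay.
          Discrete yearly steps; input before the first year is ignored."""
          n = len(c_in)
          tau = np.arange(n)                                  # ages in years
          g = np.exp(-tau / mean_age) / mean_age              # exponential mixing model
          weights = g * np.exp(-LAMBDA * tau)                 # age distribution x decay
          c_out = np.empty(n)
          for t in range(n):
              # water discharging in year t entered the system tau years earlier
              c_out[t] = np.sum(c_in[t - tau[: t + 1]] * weights[: t + 1])
          return c_out

      # synthetic bomb-pulse-like input: low background plus a sharp peak (illustrative only)
      years = np.arange(1950, 2011)
      c_in = 2.0 + 100.0 * np.exp(-((years - 1964) ** 2) / 8.0)
      print(simulated_output(c_in, mean_age=30.0)[-5:])       # tail of the simulated output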

  4. Distributive Education. Selling. Curriculum.

    ERIC Educational Resources Information Center

    Lankford, Dave; Comte, Don

    Nineteen lesson plans on selling are presented in this performance-based curriculum unit for distributive education. This unit is self-contained and consists of the following components: introduction (provides overview of unit content and describes why mastery of the objectives is important); performance objectives; pre-assessment instrument…

  5. Prototyping distributed simulation networks

    NASA Technical Reports Server (NTRS)

    Doubleday, Dennis L.

    1990-01-01

    Durra is a declarative language designed to support application-level programming. The use of Durra is illustrated to describe a simple distributed application: a simulation of a collection of networked vehicle simulators. It is shown how the language is used to describe the application, its components and structure, and how the runtime executive provides for the execution of the application.

  6. Distributed analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Dewhurst, A.; Legger, F.

    2015-12-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data is a challenging task for the distributed physics community. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are running daily on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We report on the impact such changes have on the DA infrastructure, describe the new DA components, and include recent performance measurements.

  7. Schooling and Income Distribution

    ERIC Educational Resources Information Center

    Marin, Alan; Psacharopoulos, George

    1976-01-01

    Analyzes the relationship between years of schooling and income distribution, based on human capital theory. (Available from North-Holland Publishing Company, P.O. Box 211, Amsterdam, the Netherlands; $13.50 annually, plus $4.00 postage and handling) (JG)

  8. Industrial power distribution

    SciTech Connect

    Sorrells, M.A.

    1990-01-01

    This paper is a broad overview of industrial power distribution. Primary focus will be on selection of the various low voltage components to achieve the end product. Emphasis will be on the use of national standards to ensure a safe and well designed installation.

  9. Distributed Information Management.

    ERIC Educational Resources Information Center

    Pottenger, William M.; Callahan, Miranda R.; Padgett, Michael A.

    2001-01-01

    Reviews the scope and effects of distributed information management. Discusses cultural and social influences, including library and Internet culture, information and knowledge, electronic libraries, and social aspects of libraries; digital collections; indexing; permanent link systems; metadata; the Open Archives initiative; digital object…

  10. Small School Distributive Education.

    ERIC Educational Resources Information Center

    Barnes, Bill

    Information on an atypical 1966-67 Distributive Education pilot program in New Mexico was given. The program was unique since one instructor conducted this program in two schools which were in separate rural districts (Dexter and Hagerman). Since both communities were primarily agricultural, with small student populations, the cost of such a…

  11. Multiagent distributed watershed management

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Castelletti, A.; Amigoni, F.; Cai, X.

    2012-04-01

    Deregulation and democratization of water, along with increasing environmental awareness, are challenging integrated water resources planning and management worldwide. The traditional centralized approach to water management, as described in much of the water resources literature, is often unfeasible in most modern social and institutional contexts. It should therefore be reconsidered from a more realistic and distributed perspective, in order to account for the presence of multiple and often independent Decision Makers (DMs) and many conflicting stakeholders. Game theory based approaches are often used to study these situations of conflict (Madani, 2010), but they are limited to a descriptive perspective. Multiagent systems (see Wooldridge, 2009), instead, seem to be a more suitable paradigm because they naturally allow the representation of a set of self-interested agents (DMs and/or stakeholders) acting in a distributed decision process at the agent level, resulting in a promising compromise between the ideal centralized solution and the actual uncoordinated practices. Casting a water management problem in a multiagent framework makes it possible to exploit the techniques and methods that are already available in this field for solving distributed optimization problems. In particular, in Distributed Constraint Satisfaction Problems (DCSP, see Yokoo et al., 2000), each agent controls some variables according to his own utility function but has to satisfy inter-agent constraints; while in Distributed Constraint Optimization Problems (DCOP, see Modi et al., 2005), the problem is generalized by introducing a global objective function to be optimized that requires a coordination mechanism between the agents. In this work, we apply a DCSP-DCOP based approach to model a steady state hypothetical watershed management problem (Yang et al., 2009), involving several active human agents (i.e. agents who make decisions) and reactive ecological agents (i.e. agents representing
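
    As a toy illustration of the DCOP formulation mentioned above (a sketch with assumed numbers, not the authors' model), two water-using agents each control a discrete withdrawal level, a shared constraint caps total withdrawal, and the global objective sums their individual utilities. The coordination is collapsed here into a brute-force search over the tiny joint space purely to show the problem structure; real DCOP solvers such as ADOPT-style algorithms distribute this search among the agents.

      from itertools import product

      LEVELS = range(0, 11)              # each agent may withdraw 0..10 units (assumed)
      CAPACITY = 14                      # inter-agent constraint: total withdrawal <= 14

      def utility_a(x):                  # diminishing returns for agent A (illustrative)
          return 10 * x - 0.5 * x * x

      def utility_b(y):                  # agent B values water slightly less
          return 8 * y - 0.4 * y * y

      def solve_dcop():
          """Maximize the global objective subject to the shared capacity constraint."""
          best, best_value = None, float("-inf")
          for x, y in product(LEVELS, LEVELS):
              if x + y > CAPACITY:       # constraint violated -> infeasible assignment
                  continue
              value = utility_a(x) + utility_b(y)
              if value > best_value:
                  best, best_value = (x, y), value
          return best, best_value

      print(solve_dcop())                # best feasible joint withdrawal and its value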

  12. Photovoltaics support distribution feeder

    SciTech Connect

    Barker, P.P.; Bailey, B.; Peterson, A.J. Jr.

    1997-03-01

    The concept of supporting the transmission and distribution (T&D) system with a photovoltaic (PV) distributed energy source has gained increasing attention as the cost of PV energy has declined. Locating a PV system at a strategic point on the distribution feeder can enhance the overall T&D system performance and provide a source of renewable power generation. In such applications, the PV system peak output ranges from a few percent up to about 20 percent of the peak feeder load. A good example is one such project on a line supplied by the Pacific Gas & Electric Co.'s Kerman Substation near Fresno, California. Given the success of this and other projects, Niagara Mohawk Power Corp. (NMPC) will be testing a 100 kW ac output system interconnected with a 13.2 kV distribution feeder to demonstrate PV T&D support concepts in its service territory. The demonstration system construction and operation are to be funded by NMPC, the Utility Photovoltaics Group (UPVG) and the New York State Energy Research and Development Authority (NYSERDA). AWS Scientific will manage the site construction and be responsible for maintaining, operating and monitoring the performance of the system. As a prerequisite to construction of the system, the NMPC research and development department funded AWS Scientific Inc. (Albany, N.Y.) and Power Technologies Inc. (Schenectady, N.Y.) to investigate the use of PV energy for T&D support applications on its system. The study involved reviewing a large number of distribution circuits throughout NMPC's service territory to find candidate locations for the 100 kW demonstration project. A key focus of the study was to find a feeder where the injection of PV energy provided maximum dispersed generation benefits.

  13. Periodicity in the spatial-temporal earthquake distributions for the Pacific region: observation and modeling.

    NASA Astrophysics Data System (ADS)

    Sasorova, Elena; Levin, Boris

    2014-05-01

    Over the course of the last century, cyclic increases and decreases of the Earth's seismic activity (SA) have been observed. Variations of the SA for events with M>=7.0 from 1900 to the present were studied. Two subsets of the worldwide NEIC (USGS) catalog were used: USGS/NEIC from 1973 to 2012, and the catalog of significant worldwide earthquakes (2150 B.C. - 1994 A.D.) compiled by USGS/NEIC from the NOAA agency. A preliminary standardization of magnitudes and the elimination of aftershocks from the list of events were performed. The entire period of observations was subdivided into 5-year intervals. The temporal distributions of the earthquake (EQ) density and released energy density were calculated separately for the Southern hemisphere (SH), for the Northern hemisphere (NH) and for eighteen latitudinal belts: 90°-80°N, 80°-70°N, 70°-60°N, 60°-50°N and so on (each belt spanning 10°). The periods of the SA were compared for different latitudinal belts of the Earth. The peaks and decays of the seismicity do not coincide in time for different latitudinal belts, especially for belts located in the NH and the SH. Peaks and decays of the SA for events with M>=8 were marked in the temporal distributions of the EQ for all studied latitudinal belts. The two-dimensional distributions (over latitude and over time) of the EQ density and released energy density highlighted that the periods of amplification of the SA are approximately 30-35 years. Next, we check for the existence of a non-random component in the EQ occurrence between the NH and the SH. All events were placed on the time axis according to their origin time. We treat the set of EQs in the studied catalog as a sequence of events in which each event has one of two possible outcomes (occurrence in the NH or in the SH). A nonparametric runs test was used to test the hypothesis that a nonrandom component exists in the examined sequence of
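
    The nonparametric runs test referred to here (Wald-Wolfowitz) can be sketched in a few lines; the hemisphere sequence below is randomly generated for illustration only, not catalog data, and the z statistic uses the standard closed forms for the expected number of runs and its variance.

      import numpy as np

      def runs_test(seq):
          """Wald-Wolfowitz runs test for a binary sequence (e.g. N/S hemisphere labels).
          Returns the observed number of runs and the approximate z statistic."""
          seq = np.asarray(seq)
          n1 = int(np.sum(seq == seq[0]))
          n2 = len(seq) - n1
          runs = 1 + int(np.sum(seq[1:] != seq[:-1]))          # count changes of symbol
          n = n1 + n2
          expected = 2.0 * n1 * n2 / n + 1.0
          variance = 2.0 * n1 * n2 * (2.0 * n1 * n2 - n) / (n ** 2 * (n - 1))
          z = (runs - expected) / np.sqrt(variance)
          return runs, z

      rng = np.random.default_rng(1)
      hemispheres = rng.choice(["N", "S"], size=500)           # synthetic event sequence
      print(runs_test(hemispheres))                            # |z| >~ 2 would suggest non-randomness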

  14. Distributed Experiment Automation System

    NASA Astrophysics Data System (ADS)

    Lebedev, Gennadi

    2003-03-01

    A module-based distributed system for controlling and automating scientific experiments was developed. The system is divided into five main layers: 1. Data processing and presentation modules; 2. Controllers, which support primary command evaluation, data analysis and synchronization between device drivers; 3. Data server, providing real-time data storage and management; 4. Device drivers, supporting communication, preliminary signal acquisition and control of peripheral devices; 5. Utilities, covering batch processing, logging, handling of execution errors, persistent storage and management of experimental data, monitoring of modules and devices, alarm states, and remote component messaging and notification processing. The system uses networking (the DCOM protocol) for communication between distributed modules. Configuration, module parameters, and data and command links are defined in a scripting file (XML format). This modular structure allows great flexibility and extensibility, as modules can be added and configured as required without extensive programming.

  15. Bounding Species Distribution Models

    NASA Technical Reports Server (NTRS)

    Stohlgren, Thomas J.; Jarnevich, Catherine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].

  16. Bounding species distribution models

    USGS Publications Warehouse

    Stohlgren, T.J.; Jarnevich, C.S.; Esaias, W.E.; Morisette, J.T.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used. © 2011 Current Zoology.
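
    The "bounding" idea described in these two records amounts to clamping each environmental predictor to the range seen in the training data before the fitted model is asked to extrapolate. The sketch below shows that step in isolation; the array names and synthetic data are assumptions, not the CART/Maxent workflow used by the authors.

      import numpy as np

      def clamp_to_training_bounds(x_new, x_train):
          """Clip each predictor column of x_new to the min/max observed in x_train,
          so a model is never evaluated outside its environmental envelope."""
          lo = x_train.min(axis=0)
          hi = x_train.max(axis=0)
          return np.clip(x_new, lo, hi)

      # illustrative arrays: rows are locations, columns are environmental predictors
      rng = np.random.default_rng(0)
      x_train = rng.normal(size=(200, 3))
      x_new = rng.normal(scale=3.0, size=(50, 3))              # deliberately wider range

      x_bounded = clamp_to_training_bounds(x_new, x_train)
      # a fitted model (CART, Maxent, etc.) would then be applied to x_bounded
      print(x_bounded.min(axis=0) >= x_train.min(axis=0))      # True for every predictor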

  17. Multipartite secure state distribution

    SciTech Connect

    Duer, W.; Briegel, H.-J.; Calsamiglia, J.

    2005-04-01

    We introduce the distribution of a secret multipartite entangled state in a real-world scenario as a quantum primitive. We show that in the presence of noisy quantum channels (and noisy control operations), any state chosen from the set of two-colorable graph states (Calderbank-Shor-Steane codewords) can be created with high fidelity while it remains unknown to all parties. This is accomplished by either blind multipartite entanglement purification, which we introduce in this paper, or by multipartite entanglement purification of enlarged states, which offers advantages over an alternative scheme based on standard channel purification and teleportation. The parties are thus provided with a secret resource of their choice for distributed secure applications.

  18. Distributed Agents for Autonomy

    NASA Astrophysics Data System (ADS)

    Blake, Rick; Amigoni, Francesco; Brambilla, Andrea; de la Rosa Steinz, Sonia; Lavagna, Michele; le Duc, Ian; Page, Jonathan; Page, Oliver; Steel, Robin; Wijnands, Quirien

    2010-08-01

    The Distributed Agents for Autonomy (DAFA) Study has been performed for ESA by SciSys UK Ltd, Vega GmbH and Politecnico di Milano. An analysis of past, present and future space missions has been conducted, structured around a set of three pre-defined mission scenarios: Formation Flying, Earth Observation and Planetary Exploration. This analysis led to the definition of a framework of use cases where the application of distributed autonomy seems necessary or appropriate, and a set of metrics that may be used to assess such deployments. Agent technology and architectures were extensively surveyed and the results used to elaborate each of the mission scenarios to the point where a software prototype could be constructed. Such a prototype was developed for a scenario based on the ExoMars mission and this has been used to highlight the advantages of a DAFA approach to the mission architecture.

  19. Towards heterogeneous distributed debugging

    SciTech Connect

    Damodaran-Kamal, S.K.

    1995-04-01

    Several years of research and development in parallel debugger design have given us several techniques, though these are implemented in a wide range of tools for an equally wide range of systems. This paper is an evaluation of these myriad techniques as applied to the design of a heterogeneous distributed debugger. The evaluation is based on what features users perceive as useful, as well as on the ease of implementing those features using the available technology. A preliminary architecture for such a heterogeneous tool is proposed. Our effort in this paper differs significantly from other efforts at creating portable and heterogeneous distributed debuggers in that we concentrate on support for all the important issues in parallel debugging, instead of concentrating simply on portability and heterogeneity.

  20. Business size distributions

    NASA Astrophysics Data System (ADS)

    D'Hulst, R.; Rodgers, G. J.

    2001-10-01

    In a recent work, we introduced two models for the dynamics of customers trying to find the business that best corresponds to their expectation for the price of a commodity. In agreement with the empirical data, a power-law distribution for the business sizes was obtained, taking the number of customers of a business as a proxy for its size. Here, we extend one of our previous models in two different ways. First, we introduce a business aggregation rate that is fitness dependent, which allows us to reproduce a spread in empirical data from one country to another. Second, we allow the bankruptcy rate to take a different functional form, to be able to obtain a log-normal distribution with power-law tails for the size of the businesses.
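
    Power-law size distributions of this kind are usually checked by estimating the tail exponent; as a hedged aside (not the authors' procedure), the sketch below applies the standard maximum-likelihood estimator alpha = 1 + n / sum(ln(x_i / x_min)) to synthetic size data.

      import numpy as np

      def powerlaw_exponent(sizes, x_min):
          """Maximum-likelihood estimate of the exponent alpha for a power-law tail
          p(x) ~ x^(-alpha), using only observations with x >= x_min."""
          tail = np.asarray(sizes, dtype=float)
          tail = tail[tail >= x_min]
          return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

      # synthetic "business sizes" drawn from a Pareto tail with alpha = 2.5 (illustrative)
      rng = np.random.default_rng(0)
      u = rng.random(100_000)
      sizes = (1.0 - u) ** (-1.0 / (2.5 - 1.0))                # inverse-CDF sampling, x_min = 1
      print(round(powerlaw_exponent(sizes, x_min=1.0), 2))     # should be close to 2.5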

  1. Properly Understanding the Impacts of Distributed Resources on Distribution Systems

    SciTech Connect

    Rizy, D Tom; Li, Fangxing; Li, Huijuan; Adhikari, Sarina; Kueck, John D

    2010-01-01

    This paper discusses important impacts of distributed resources on distribution networks and feeders. These include capacity, line losses, voltage regulation, and central system support (such as volt/var via central generators and the substation) as the number, placement and penetration levels of distributed resources are varied. Typically, the impacts of distributed resources on the distribution system are studied using steady-state rather than dynamic analysis tools. However, the response time and transient impacts of both system equipment (such as substation/feeder capacitors) and distributed resources need to be taken into account, and only dynamic analysis will provide the full impact results. ORNL is wrapping up a study of distributed resources interconnected to a large distribution system considering the above variables. A report of the study and its results will be condensed into a paper for this panel session. The impact of distributed resources will vary as the penetration level reaches the capacity of the distribution feeder/system. The question is how high a penetration of distributed resources can be accommodated on the distribution feeder/system without any major changes to system operation, design and protection. The impacts will surely vary depending upon load composition, distribution and level. It is also expected that different placements of distributed resources will affect the distribution system differently.

  2. Distributed Computerized Catalog System

    NASA Technical Reports Server (NTRS)

    Borgen, Richard L.; Wagner, David A.

    1995-01-01

    The DarkStar Distributed Catalog System describes arbitrary data objects in a unified manner, providing end users with a versatile yet simple search mechanism for locating and identifying objects. It provides built-in generic and dynamic graphical user interfaces. The design of the system avoids some of the problems of standard DBMSs, and the system provides more flexibility than conventional relational databases or object-oriented databases. The data-collection lattice is a partly hierarchical representation of the relationships among collections, subcollections, and data objects.

  3. Distributed array radar

    NASA Astrophysics Data System (ADS)

    Heimiller, R. C.; Belyea, J. E.; Tomlinson, P. G.

    1983-11-01

    Distributed array radar (DAR) is a concept for efficiently accomplishing surveillance and tracking using coherently internetted mini-radars. They form a long baseline, very thinned array and are capable of very accurate location of targets. This paper describes the DAR concept. Factors involving two-way effective gain patterns for deterministic and random DAR arrays are analyzed and discussed. An analysis of factors affecting signal-to-noise ratio is presented and key technical and performance issues are briefly summarized.

  4. Symmetric generalized binomial distributions

    SciTech Connect

    Bergeron, H.; Curado, E. M. F.; Gazeau, J. P.; Rodrigues, Ligia M. C. S. E-mail: evaldo@cbpf.br E-mail: ligia@cbpf.br

    2013-12-15

    In two recent articles, we examined a generalization of the binomial distribution associated with a sequence of positive numbers, involving asymmetric expressions of probabilities that break the win-loss symmetry. In this article we present another generalization (again associated with a sequence of positive numbers) that preserves the win-loss symmetry. This approach is also based on generating functions and presents non-negativity constraints similar to those encountered in our previous articles.
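
    For orientation, the win-loss symmetry being preserved is, in the ordinary binomial case, the familiar identity below; according to the abstract, the generalized distributions replace the powers of p and 1-p by expressions built from the given sequence of positive numbers while keeping an analogue of this symmetry (this is a sketch of the baseline case, not reproduced from the article itself).

      \[
        \mathcal{B}(k \mid n, p) = \binom{n}{k}\, p^{k} (1-p)^{\,n-k},
        \qquad
        \mathcal{B}(k \mid n, p) = \mathcal{B}(n-k \mid n, 1-p),
        \qquad
        \sum_{k=0}^{n} \mathcal{B}(k \mid n, p) = 1 .
      \]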

  5. Distributed proximity sensor system

    NASA Technical Reports Server (NTRS)

    Lee, Sukhan (Inventor)

    1988-01-01

    The invention relates to sensors embedded on the surface of a robot hand, or other moving member. By distributing proximity sensors capable of detecting distances and angles to points on the surface of an object, information is obtained for achieving noncontacting shape and distance perception, i.e., for automatic determination of the object's shape, direction, and distance, as well as the orientation of the object relative to the robot hand or other moving member.

  6. Fiber distributed feedback laser

    NASA Technical Reports Server (NTRS)

    Elachi, C.; Evans, G. A.; Yeh, C. (Inventor)

    1976-01-01

    Utilizing round optical fibers as communication channels in optical communication networks presents the problem of obtaining a high efficiency coupling between the optical fiber and the laser. A laser is made an integral part of the optical fiber channel by either diffusing active material into the optical fiber or surrounding the optical fiber with the active material. Oscillation within the active medium to produce lasing action is established by grating the optical fiber so that distributed feedback occurs.

  7. A distributable APSE

    NASA Technical Reports Server (NTRS)

    Taft, S. Tucker

    1986-01-01

    A distributed Ada program library is a key element in a distributed Ada Program Support Environment (APSE). To implement this successfully, the program library universe as defined by the Ada Reference Manual must be broken up into independently manageable pieces. This in turn requires the support of a distributed database system, as well as a mechanism for identifying compilation units, linkable subprograms, and Ada types in a decentralized way, to avoid falling victim to the bottlenecks of a global database and/or global unique-identifier manager. It was found that the ability to decentralize Ada program library activity is a major advantage in the management of large Ada programs. Currently, there are 18 resource-catalog revision sets, each in its own Host Interface (HIF) partition, plus 18 partitions for testing each of these, plus 11 partitions for the top-level compiler/linker/program library manager components. Compiling and other development work can proceed in parallel in each of these partitions, without suffering the performance bottlenecks of global locks or global unique-identifier generation.

  8. INFERRING THE ECCENTRICITY DISTRIBUTION

    SciTech Connect

    Hogg, David W.; Bovy, Jo; Myers, Adam D.

    2010-12-20

    Standard maximum-likelihood estimators for binary-star and exoplanet eccentricities are biased high, in the sense that the estimated eccentricity tends to be larger than the true eccentricity. As with most non-trivial observables, a simple histogram of estimated eccentricities is not a good estimate of the true eccentricity distribution. Here, we develop and test a hierarchical probabilistic method for performing the relevant meta-analysis, that is, inferring the true eccentricity distribution, taking as input the likelihood functions for the individual star eccentricities, or samplings of the posterior probability distributions for the eccentricities (under a given, uninformative prior). The method is a simple implementation of a hierarchical Bayesian model; it can also be seen as a kind of heteroscedastic deconvolution. It can be applied to any quantity measured with finite precision (other orbital parameters, or indeed any astronomical measurements of any kind, including magnitudes, distances, or photometric redshifts), so long as the measurements have been communicated as a likelihood function or a posterior sampling.

  9. Inferring the Eccentricity Distribution

    NASA Astrophysics Data System (ADS)

    Hogg, David W.; Myers, Adam D.; Bovy, Jo

    2010-12-01

    Standard maximum-likelihood estimators for binary-star and exoplanet eccentricities are biased high, in the sense that the estimated eccentricity tends to be larger than the true eccentricity. As with most non-trivial observables, a simple histogram of estimated eccentricities is not a good estimate of the true eccentricity distribution. Here, we develop and test a hierarchical probabilistic method for performing the relevant meta-analysis, that is, inferring the true eccentricity distribution, taking as input the likelihood functions for the individual star eccentricities, or samplings of the posterior probability distributions for the eccentricities (under a given, uninformative prior). The method is a simple implementation of a hierarchical Bayesian model; it can also be seen as a kind of heteroscedastic deconvolution. It can be applied to any quantity measured with finite precision—other orbital parameters, or indeed any astronomical measurements of any kind, including magnitudes, distances, or photometric redshifts—so long as the measurements have been communicated as a likelihood function or a posterior sampling.
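
    The hierarchical step described in these two records can be sketched as follows: if each star's eccentricity posterior (under a flat interim prior on [0, 1]) is summarized by samples, a parametric population distribution, assumed here to be a Beta(a, b) density purely for illustration, can be fit by importance-weighting those samples. The sample arrays and the Beta choice are assumptions made for the sketch, not the paper's exact implementation.

      import numpy as np
      from scipy import stats, optimize

      def neg_log_marginal(params, posterior_samples):
          """Hierarchical (marginal) negative log-likelihood for a Beta(a, b) population
          distribution, given per-star posterior samplings drawn under a flat interim prior."""
          a, b = params
          if a <= 0 or b <= 0:
              return np.inf
          total = 0.0
          for samples in posterior_samples:
              # the flat interim prior on [0, 1] has density 1, so the importance weight
              # reduces to the population density evaluated at each posterior sample
              weights = stats.beta.pdf(samples, a, b)
              total += np.log(np.mean(weights) + 1e-300)
          return -total

      # synthetic stand-in: each "star" gets noisy posterior samples around a true eccentricity
      rng = np.random.default_rng(0)
      true_e = rng.beta(1.5, 4.0, size=60)
      posterior_samples = [np.clip(e + rng.normal(0.0, 0.05, size=200), 1e-4, 1 - 1e-4)
                           for e in true_e]

      result = optimize.minimize(neg_log_marginal, x0=[2.0, 2.0],
                                 args=(posterior_samples,), method="Nelder-Mead")
      print(result.x)      # recovered (a, b) should land roughly near (1.5, 4.0)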

  10. Distributed instruction set computer

    SciTech Connect

    Wang, L.

    1989-01-01

    The Distributed Instruction Set Computer, or DISC for short, is an experimental computer system for fine-grained parallel processing. DISC employs a new parallel instruction set, an Early Binding and Scheduling data tagging scheme, and a distributed control mechanism to explore a software dataflow control method in a multiple-functional-unit system. With zero system control overhead, multiple instructions are executed in parallel and/or out of order at a peak rate of n instructions/cycle, where n is the number of functional units. Quantitative simulation results indicate that a DISC system with 16 functional units can deliver a maximum 7.7X performance speedup over a single-functional-unit system at the same clock speed. Exploring a new parallel instruction set and distributed control mechanism, DISC represents three major breakthroughs in the domain of fine-grained parallel processing: (1) a fast multiple-instruction-issuing mechanism; (2) parallel and/or out-of-order execution; (3) a software dataflow control scheme.

  11. GASIFICATION FOR DISTRIBUTED GENERATION

    SciTech Connect

    Ronald C. Timpe; Michael D. Mann; Darren D. Schmidt

    2000-05-01

    A recent emphasis in gasification technology development has been directed toward reduced-scale gasifier systems for distributed generation at remote sites. The domestic distributed power generation market over the next decade is expected to be 5-6 gigawatts per year. The global increase is expected at 20 gigawatts over the next decade. The economics of gasification for distributed power generation are significantly improved when fuel transport is minimized. Until recently, gasification technology has been synonymous with coal conversion. Presently, however, interest centers on providing clean-burning fuel to remote sites that are not necessarily near coal supplies but have sufficient alternative carbonaceous material to feed a small gasifier. Gasifiers up to 50 MW are of current interest, with emphasis on those of 5-MW generating capacity. Internal combustion engines offer a more robust system for utilizing the fuel gas, while fuel cells and microturbines offer higher electric conversion efficiencies. The initial focus of this multiyear effort was on internal combustion engines and microturbines as more realistic near-term options for distributed generation. In this project, we studied emerging gasification technologies that can provide gas from regionally available feedstock as fuel to power generators under 30 MW in a distributed generation setting. Larger-scale gasification, primarily coal-fed, has been used commercially for more than 50 years to produce clean synthesis gas for the refining, chemical, and power industries. Commercial-scale gasification activities are under way at 113 sites in 22 countries in North and South America, Europe, Asia, Africa, and Australia, according to the Gasification Technologies Council. Gasification studies were carried out on alfalfa, black liquor (a high-sodium waste from the pulp industry), cow manure, and willow on the laboratory scale and on alfalfa, black liquor, and willow on the bench scale. Initial parametric tests

  12. Planning Systems for Distributed Operations

    NASA Technical Reports Server (NTRS)

    Maxwell, Theresa G.

    2002-01-01

    This viewgraph representation presents an overview of the mission planning process involving distributed operations (such as the International Space Station (ISS)) and the computer hardware and software systems needed to support such an effort. Topics considered include: evolution of distributed planning systems, ISS distributed planning, the Payload Planning System (PPS), future developments in distributed planning systems, Request Oriented Scheduling Engine (ROSE) and Next Generation distributed planning systems.

  13. Estimating Dark Matter Distributions

    NASA Astrophysics Data System (ADS)

    Wang, Xiao; Woodroofe, Michael; Walker, Matthew G.; Mateo, Mario; Olszewski, Edward

    2005-06-01

    Thanks to instrumental advances, new, very large kinematic data sets for nearby dwarf spheroidal (dSph) galaxies are on the horizon. A key aim of these data sets is to help determine the distribution of dark matter in these galaxies. Past analyses have generally relied on specific dynamical models or highly restrictive dynamical assumptions. We describe a new, nonparametric analysis of the kinematics of nearby dSph galaxies designed to take full advantage of the future large data sets. The method takes as input the projected positions and radial velocities of stars known to be members of the galaxies but does not use any parametric dynamical model or the assumption that the mass distribution follows that of the visible matter. The problem of estimating the radial mass distribution M(r) (the mass within the true radius r) is converted into a problem of estimating a regression function nonparametrically. From the Jeans equation we show that the unknown regression function is subject to fundamental shape restrictions, which we exploit in our analysis using statistical techniques borrowed from isotonic estimation and spline smoothing. Simulations indicate that M(r) can be estimated to within a factor of 2 or better with samples as small as 1000 stars over almost the entire radial range sampled by the kinematic data. The technique is applied to a sample of 181 stars in the Fornax dSph galaxy. We show that the galaxy contains a significant, extended dark halo some 10 times more massive than its baryonic component. Although applied here to dSph kinematics, this approach can be used in the analysis of any kinematically hot stellar system in which the radial velocity field is discretely sampled.
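
    The shape restriction exploited here, that the enclosed mass M(r) cannot decrease with radius, is exactly what isotonic regression enforces. The sketch below applies it to synthetic noisy estimates of M(r); the data and the use of scikit-learn are assumptions for illustration, not the authors' estimator.

      import numpy as np
      from sklearn.isotonic import IsotonicRegression

      # synthetic "raw" mass-profile estimates: a rising truth plus noise (illustrative only)
      rng = np.random.default_rng(0)
      r = np.linspace(0.1, 2.0, 60)                       # radius in arbitrary units
      true_mass = 5.0 * r / (1.0 + r)                     # monotonically increasing profile
      noisy_mass = true_mass + rng.normal(0.0, 0.3, size=r.size)

      # isotonic fit: the estimated M(r) is constrained to be non-decreasing in r
      iso = IsotonicRegression(increasing=True)
      mass_hat = iso.fit_transform(r, noisy_mass)

      print(np.all(np.diff(mass_hat) >= 0))               # True: monotonicity is enforced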

  14. Representation of orientation distributions

    SciTech Connect

    Wenk, H.R.; Kocks, U.F.

    1985-01-01

    This paper illustrates the principles presented with a particular experimental texture: from the surface layer of a copper polycrystal cold-rolled to 60% reduction in thickness. Four incomplete pole figures (200, 220, 222, and 113) were determined by x-ray diffraction in reflection geometry. The measured pole figures nearly exhibited orthorhombic symmetry (as expected), which was then strictly enforced by averaging the four quadrants of the pole figure. The orientation distribution function was obtained using the expansion in spherical harmonics (with only even-order coefficients up to l = 18).

  15. Distributed Optimization System

    DOEpatents

    Hurtado, John E.; Dohrmann, Clark R.; Robinett, III, Rush D.

    2004-11-30

    A search system and method for controlling multiple agents to optimize an objective using distributed sensing and cooperative control. The search agent can be one or more physical agents, such as a robot, and can be software agents for searching cyberspace. The objective can be: chemical sources, temperature sources, radiation sources, light sources, evaders, trespassers, explosive sources, time dependent sources, time independent sources, function surfaces, maximization points, minimization points, and optimal control of a system such as a communication system, an economy, a crane, and a multi-processor computer.

  16. Distribution of autumn-staging Lesser Snow Geese on the northeast coastal plain of Alaska [Distribución de Chen caerulescens a través de su congregación otoñal]

    USGS Publications Warehouse

    Robertson, Donna G.; Brackney, Alan W.; Spindler, Michael A.; Hupp, Jerry W.

    1997-01-01

    We conducted aerial surveys of Lesser Snow Geese (Chen caerulescens caerulescens) during autumn staging on the coastal plain of the Arctic National Wildlife Refuge (ANWR) in northeast Alaska from late August through September, 1982 - 1993. We evaluated numbers and distribution of Snow Geese that staged on the ANWR, compared abundance of birds among 5 x 5-km cells used frequently (5 - 8 yr), periodically (3 - 4 yr), or infrequently (1 - 2 yr), and examined distribution changes within years. Maximum numbers of Snow Geese observed annually were highly variable (range 12,828 - 309,225). Snow Goose flocks occurred across 605,000 ha of the coastal plain, but used some areas more frequently than others. Frequently used cells (38 of 363 cells in the study area) were non-randomly distributed and primarily occurred on the central coastal plain between the wet coastal and steep foothills regions. Abundance of geese was greatest in frequently used, intermediate in periodically used, and lowest in infrequently used cells. Within years, Snow Goose numbers and flock locations varied between surveys, possibly because geese moved to different foraging areas during staging. The widespread distribution and annual variability in numbers of Snow Geese on the coastal plain was likely because birds used foraging habitats that were spatially and temporally heterogeneous. The ANWR coastal plain is an important component of the fall-staging area used by Snow Geese that nest in the western Canadian Arctic. Management decisions that affect the region should reflect its value to migrating Snow Geese.

  17. Distributed System Design Checklist

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin

    2014-01-01

    This report describes a design checklist targeted to fault-tolerant distributed electronic systems. Many of the questions and discussions in this checklist may be generally applicable to the development of any safety-critical system. However, the primary focus of this report covers the issues relating to distributed electronic system design. The questions that comprise this design checklist were created with the intent to stimulate system designers' thought processes in a way that hopefully helps them to establish a broader perspective from which they can assess the system's dependability and fault-tolerance mechanisms. While best effort was expended to make this checklist as comprehensive as possible, it is not (and cannot be) complete. Instead, we expect that this list of questions and the associated rationale for the questions will continue to evolve as lessons are learned and further knowledge is established. In this regard, it is our intent to post the questions of this checklist on a suitable public web-forum, such as the NASA DASHLink AFCS repository. From there, we hope that it can be updated, extended, and maintained after our initial research has been completed.

  18. PULSE AMPLITUDE DISTRIBUTION RECORDER

    DOEpatents

    Cowper, G.

    1958-08-12

    A device is described for automatically recording pulse amplitude distribution received from a counter. The novelty of the device consists of the over-all arrangement of conventional circuit elements to provide an easy to read permanent record of the pulse amplitude distribution during a certain time period. In the device a pulse analyzer separates the pulses according to amplitude into several channels. A scaler in each channel counts the pulses and operates a pen marker positioned over a drivable recorder sheet. Since the scalers in each channel have the same capacity, the control circuitry permits counting of the incoming pulses until one scaler reaches capacity, whereupon the input is removed and an internal oscillator supplies the necessary pulses to fill up the other scalers. Movement of the chart sheet is initiated when the first scaler reaches capacity to thereby give a series of marks at spacings proportional to the time required to fill the remaining scalers, and accessory equipment marks calibration points on the recorder sheet to facilitate direct reading of the number of external pulses supplied to each scaler.

  19. Vascular Distribution of Nanomaterials

    PubMed Central

    Stapleton, Phoebe A.; Nurkiewicz, Timothy R.

    2014-01-01

    Once considered primarily occupational, novel nanotechnology innovation and application have led to widespread domestic use and intentional biomedical exposures. With these exciting advances, the breadth and depth of toxicological considerations must also be expanded. The vascular system interacts with every tissue in the body, striving toward homeostasis. Engineered nanomaterials (ENM) have been reported to distribute to many different organs and tissues. However, these observations have tended to use approaches requiring tissue homogenization and/or gross organ analyses. These techniques, while effective in establishing presence, preclude an exact determination of where ENM are deposited within a tissue. If nanotechnology is to achieve its full potential, it is necessary to identify the exact distribution and deposition of ENM throughout the cardiovascular system, taking vascular hemodynamics and in vivo/in vitro ENM modifications into account. Distinct levels of the vasculature will first be described as individual compartments. Then the vasculature will be considered as a whole. These unique compartments and biophysical conditions will be discussed in terms of their propensity to favor ENM deposition. An understanding of these levels of the vasculature will also be discussed. Ultimately, future studies must verify the mechanisms speculated on and presented herein. PMID:24777845

  20. Coping with distributed computing

    SciTech Connect

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  1. Automated Gas Distribution System

    NASA Astrophysics Data System (ADS)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few, prized cyclotrons in the country. Behind the scenes of the cyclotron is a confusing and dangerous setup of the ion sources that supply the cyclotron with particles for acceleration. Using this machine involves a time-consuming, and even wasteful, step-by-step process of switching gases, purging, and other important tasks that must be done manually to keep the system functioning properly while also maintaining the safety of the working environment. Developing a new gas distribution system for the ion source prevents many of the problems generated by the older manual setup process. The new system can be controlled manually more easily than before but, like most of the technology and machines in the cyclotron now, is mainly operated through software developed in the graphical coding environment LabVIEW. The automated gas distribution system provides multiple ports for a selection of different gases, to decrease the amount of gas wasted when switching gases, and a port for the vacuum, to decrease the amount of time spent purging the manifold. The LabVIEW software makes the operation of the cyclotron and ion sources easier and safer for anyone to use.

  2. Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Bodden, Lee; Pease, Phil; Bedet, Jean-Jacques; Rosen, Wayne

    1993-01-01

    The Goddard Space Flight Center Version 0 Distributed Active Archive Center (GSFC V0 DAAC) is being developed to enhance and improve scientific research and productivity by consolidating access to remote sensor earth science data in the pre-EOS time frame. In cooperation with scientists from the science labs at GSFC, other NASA facilities, universities, and other government agencies, the DAAC will support data acquisition, validation, archive and distribution. The DAAC is being developed in response to EOSDIS Project Functional Requirements as well as from requirements originating from individual science projects such as SeaWiFS, Meteor3/TOMS2, AVHRR Pathfinder, TOVS Pathfinder, and UARS. The GSFC V0 DAAC has begun operational support for the AVHRR Pathfinder (as of April, 1993), TOVS Pathfinder (as of July, 1993) and the UARS (September, 1993) Projects, and is preparing to provide operational support for SeaWiFS (August, 1994) data. The GSFC V0 DAAC has also incorporated the existing data, services, and functionality of the DAAC/Climate, DAAC/Land, and the Coastal Zone Color Scanner (CZCS) Systems.

  3. Distributed Operations Planning

    NASA Technical Reports Server (NTRS)

    Fox, Jason; Norris, Jeffrey; Powell, Mark; Rabe, Kenneth; Shams, Khawaja

    2007-01-01

    Maestro software provides a secure and distributed mission planning system for long-term missions in general, and the Mars Exploration Rover Mission (MER) specifically. Maestro, the successor to the Science Activity Planner, has a heavy emphasis on portability and distributed operations, and requires no data replication or expensive hardware, instead relying on a set of services functioning on JPL institutional servers. Maestro works on most current computers with network connections, including laptops. When browsing down-link data from a spacecraft, Maestro functions similarly to being on a Web browser. After authenticating the user, it connects to a database server to query an index of data products. It then contacts a Web server to download and display the actual data products. The software also includes collaboration support based upon a highly reliable messaging system. Modifications made to targets in one instance are quickly and securely transmitted to other instances of Maestro. The back end that has been developed for Maestro could benefit many future missions by reducing the cost of centralized operations system architecture.

  4. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  5. [Multiple-scale analysis on spatial distribution changes of forest carbon storage in Heilongjiang Province, Northeast China based on local statistics].

    PubMed

    Liu, Chang; Li, Feng-Ri; Jia, Wei-Wei; Zhen, Zhen

    2014-09-01

    Taking 4163 permanent sample plots from the Chinese National Forest Inventory (CNFI) and key ecological benefit forest monitoring plots in Heilongjiang Province as basic data, and using local Moran's I and local statistics (local mean and local standard deviation), the spatial pattern, spatial variation and spatial autocorrelation of forest carbon storage in Heilongjiang Province were investigated at four bandwidths of 25, 50, 100 and 150 km, and the change in forest carbon storage from 2005 to 2010 was studied. The results showed that the spatial distribution of forest carbon storage in Heilongjiang Province had significant positive spatial correlation: carbon storage values tended to be similar to those of their neighbors, i.e., the pattern was non-random. Forest carbon storage was affected by environmental factors, and strong spatial heterogeneity with large variation existed across the study area. The spatial distribution of forest carbon storage differed significantly between 2005 and 2010, with an increasing trend. Local statistics, visualized with ArcGIS, are useful tools for characterizing forest carbon storage change across time and space. PMID:25757297
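
    The local Moran's I used in this analysis has a compact form, I_i = z_i * sum_j w_ij z_j / (sum_k z_k^2 / n), where z are deviations from the mean and w is a row-standardized spatial weights matrix. The sketch below computes it directly with NumPy on synthetic plot values and a distance-band weights matrix; all inputs are assumptions for illustration, not the inventory data.

      import numpy as np

      def local_morans_i(values, weights):
          """Local Moran's I for each observation, given a row-standardized
          spatial weights matrix with a zero diagonal."""
          z = values - values.mean()
          m2 = np.sum(z ** 2) / len(z)                    # variance-like scaling term
          return z * (weights @ z) / m2

      # synthetic example: plot coordinates, carbon values, and a 30-km distance band
      rng = np.random.default_rng(0)
      coords = rng.uniform(0, 200, size=(300, 2))         # plot locations in km
      values = rng.normal(50.0, 10.0, size=300)           # e.g. carbon storage per plot
      dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
      w = ((dist > 0) & (dist <= 30.0)).astype(float)     # neighbors within 30 km
      w /= np.maximum(w.sum(axis=1, keepdims=True), 1.0)  # row-standardize (avoid divide by zero)

      print(local_morans_i(values, w)[:5])                # local I for the first five plots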

  6. Distribution and moments of radial error. [Rayleigh distribution - random variables

    NASA Technical Reports Server (NTRS)

    White, R. G.

    1975-01-01

    An investigation of the moments and probability distribution of the resultant of two normally distributed random variables is presented. This is the so-called generalized Rayleigh distribution which has many applications in the study of wind shear, random noise, and radar. The most general formula was derived, and two special cases were considered for which tables of the moments and probability distribution functions are included as an appendix. One of the special cases was generalized to n-dimensions.
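
    In the simplest special case, two independent zero-mean normal components with equal variance, the radial error R = sqrt(X^2 + Y^2) follows the Rayleigh distribution with pdf f(r) = (r / sigma^2) exp(-r^2 / (2 sigma^2)), mean sigma * sqrt(pi/2) and second moment 2 sigma^2. The short check below compares Monte Carlo moments with those closed forms; the value of sigma is an arbitrary assumption.

      import numpy as np

      SIGMA = 2.0
      rng = np.random.default_rng(0)

      # radial error as the resultant of two independent N(0, sigma^2) components
      x = rng.normal(0.0, SIGMA, size=1_000_000)
      y = rng.normal(0.0, SIGMA, size=1_000_000)
      r = np.hypot(x, y)

      mean_theory = SIGMA * np.sqrt(np.pi / 2.0)            # E[R] for the Rayleigh case
      second_moment_theory = 2.0 * SIGMA ** 2               # E[R^2] = 2 sigma^2

      print(round(r.mean(), 3), round(mean_theory, 3))
      print(round(np.mean(r ** 2), 3), round(second_moment_theory, 3))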

  7. Distributed Wind Energy in Idaho

    SciTech Connect

    Gardner, John; Johnson, Kathryn; Haynes, Todd; Seifert, Gary

    2009-01-31

    This project is a research and development program aimed at furthering distributed wind technology. In particular, this project addresses some of the barriers to distributed wind energy utilization in Idaho.

  8. Distributed charging of electrical assets

    DOEpatents

    Ghosh, Soumyadip; Phan, Dung; Sharma, Mayank; Wu, Chai Wah; Xiong, Jinjun

    2016-02-16

    The present disclosure relates generally to the field of distributed charging of electrical assets. In various examples, distributed charging of electrical assets may be implemented in the form of systems, methods and/or algorithms.

  9. DISTRIBUTED AMPLIFIER INCORPORATING FEEDBACK

    DOEpatents

    Bell, P.R. Jr.

    1958-10-21

    An improved distributed amplifier system employing feedback for stabilization is presented. In accordance with the disclosed invention, a signal to be amplified is applied to one end of a suitably terminated grid transmission line. At intervals along the transmission line, the signal is fed to stable, resistance-capacitance coupled amplifiers incorporating feedback loops therein. The output current from each amplifier is passed through an additional tube to minimize the electrostatic capacitance between the tube elements of the last stage of the amplifier, and fed to appropriate points on an output transmission line, similar to the grid line, but terminated at the opposite (input) end. The output taken from the unterminated end of the plate transmission line is proportional to the input voltage impressed upon the grid line.

  10. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  11. Unstructured quantum key distribution

    NASA Astrophysics Data System (ADS)

    Coles, Patrick; Metodiev, Eric; Lutkenhaus, Norbert

    Quantum key distribution (QKD) allows for communication with security guaranteed by quantum theory. The main theoretical problem in QKD is to calculate the secret key rate for a given protocol. Analytical formulas are known for protocols with a high degree of symmetry, since symmetry simplifies the analysis. However, experimental imperfections break symmetries, hence the effect of imperfections on key rates is difficult to estimate. Furthermore, it is an interesting question whether (intentionally) asymmetric protocols could outperform symmetric ones. In this work, we develop a robust numerical approach for calculating the key rate for arbitrary discrete-variable QKD protocols. Ultimately this will allow researchers to study ``unstructured'' protocols, i.e., those that lack symmetry. Our approach relies on transforming the key rate calculation to the dual optimization problem, which dramatically reduces the number of parameters and hence the calculation time. We illustrate our method by investigating some unstructured protocols for which the key rate was previously unknown.

  12. Protocols for distributive scheduling

    NASA Technical Reports Server (NTRS)

    Richards, Stephen F.; Fox, Barry

    1993-01-01

    The increasing complexity of space operations and the inclusion of interorganizational and international groups in the planning and control of space missions lead to requirements for greater communication, coordination, and cooperation among mission schedulers. These schedulers must jointly allocate scarce shared resources among the various operational and mission oriented activities while adhering to all constraints. This scheduling environment is complicated by such factors as the presence of varying perspectives and conflicting objectives among the schedulers, the need for different schedulers to work in parallel, and limited communication among schedulers. Smooth interaction among schedulers requires the use of protocols that govern such issues as resource sharing, authority to update the schedule, and communication of updates. This paper addresses the development and characteristics of such protocols and their use in a distributed scheduling environment that incorporates computer-aided scheduling tools. An example problem is drawn from the domain of space shuttle mission planning.

  13. Distributed road assessment system

    SciTech Connect

    Beer, N. Reginald; Paglieroni, David W

    2014-03-25

    A system that detects damage on or below the surface of a paved structure or pavement is provided. A distributed road assessment system includes road assessment pods and a road assessment server. Each road assessment pod includes a ground-penetrating radar antenna array and a detection system that detects road damage from the return signals as the vehicle on which the pod is mounted travels down a road. Each road assessment pod transmits to the road assessment server occurrence information describing each occurrence of road damage that is newly detected on a current scan of a road. The road assessment server maintains a road damage database of occurrence information describing the previously detected occurrences of road damage. After the road assessment server receives occurrence information for newly detected occurrences of road damage for a portion of a road, the road assessment server determines which newly detected occurrences correspond to which previously detected occurrences of road damage.
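
    The abstract does not disclose how newly detected occurrences are matched to previously recorded ones; the sketch below is a purely hypothetical illustration that pairs occurrences by position along the road with a greedy nearest-neighbour rule and a distance threshold (the function and the max_gap_m parameter are invented for the example).

```python
# Hypothetical sketch of matching newly detected road-damage occurrences to
# previously recorded ones by position along the road. The patent abstract does
# not specify the matching algorithm; here we greedily pair each new occurrence
# with the nearest unmatched prior occurrence within a distance threshold.
def match_occurrences(new_positions, prior_positions, max_gap_m=2.0):
    """Return (matches, unmatched_new); positions are metres along the road."""
    matches = []
    unmatched_new = []
    remaining = dict(enumerate(prior_positions))  # index -> position
    for new_pos in new_positions:
        if not remaining:
            unmatched_new.append(new_pos)
            continue
        idx, pos = min(remaining.items(), key=lambda kv: abs(kv[1] - new_pos))
        if abs(pos - new_pos) <= max_gap_m:
            matches.append((new_pos, pos))
            del remaining[idx]
        else:
            unmatched_new.append(new_pos)
    return matches, unmatched_new

matches, fresh = match_occurrences([10.3, 55.0, 120.7], [10.0, 121.5])
print(matches)  # [(10.3, 10.0), (120.7, 121.5)]
print(fresh)    # [55.0] -> a newly appearing damage occurrence
```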

  14. Carotenoid Distribution in Nature.

    PubMed

    Alcaíno, Jennifer; Baeza, Marcelo; Cifuentes, Víctor

    2016-01-01

    Carotenoids are naturally occurring red, orange and yellow pigments that are synthesized by plants and some microorganisms and fulfill many important physiological functions. This chapter describes the distribution of carotenoids in microorganisms, including bacteria, archaea, microalgae, filamentous fungi and yeasts. We will also focus on their functional aspects and applications, such as their nutritional value, their benefits for human and animal health and their potential protection against free radicals. The central metabolic pathway leading to the synthesis of carotenoids is described in the following three principal steps: (i) the synthesis of isopentenyl pyrophosphate and the formation of dimethylallyl pyrophosphate, (ii) the synthesis of geranylgeranyl pyrophosphate and (iii) the synthesis of carotenoids per se, highlighting the differences that have been found in several carotenogenic organisms and providing an evolutionary perspective. Finally, as an example, the synthesis of the xanthophyll astaxanthin is discussed. PMID:27485217

  15. Sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    Sparse distributed memory was proposed by Pentti Kanerva as a realizable architecture that could store large patterns and retrieve them based on partial matches with patterns representing current sensory inputs. This memory exhibits behaviors, both in theory and in experiment, that resemble those previously unapproached by machines - e.g., rapid recognition of faces or odors, discovery of new connections between seemingly unrelated ideas, continuation of a sequence of events when given a cue from the middle, knowing that one doesn't know, or getting stuck with an answer on the tip of one's tongue. These behaviors are now within reach of machines that can be incorporated into the computing systems of robots capable of seeing, talking, and manipulating. Kanerva's theory is a break with the Western rationalistic tradition, allowing a new interpretation of learning and cognition that respects biology and the mysteries of individual human beings.
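
    A minimal sketch of the kind of architecture Kanerva proposed is given below: random binary hard locations, a Hamming-radius activation rule, counter updates on write, and a majority vote on read. The dimensions, number of locations, and radius are illustrative choices, not values from Kanerva's design.

```python
# Minimal sketch of a Kanerva-style sparse distributed memory: random binary
# "hard locations", a Hamming-radius activation rule, counter updates on write,
# and a majority vote on read. Sizes and the radius are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
N_DIM, N_LOCATIONS, RADIUS = 256, 2000, 112   # radius ~ N/2 minus a few std devs

hard_locations = rng.integers(0, 2, size=(N_LOCATIONS, N_DIM), dtype=np.int8)
counters = np.zeros((N_LOCATIONS, N_DIM), dtype=np.int32)

def _active(address):
    """Indices of hard locations within Hamming RADIUS of the address."""
    distances = np.count_nonzero(hard_locations != address, axis=1)
    return np.flatnonzero(distances <= RADIUS)

def write(address, data):
    idx = _active(address)
    counters[idx] += np.where(data == 1, 1, -1)   # +1 for bit 1, -1 for bit 0

def read(address):
    idx = _active(address)
    return (counters[idx].sum(axis=0) > 0).astype(np.int8)

pattern = rng.integers(0, 2, size=N_DIM, dtype=np.int8)
write(pattern, pattern)                            # autoassociative storage
noisy = pattern.copy()
flip = rng.choice(N_DIM, size=20, replace=False)   # corrupt 20 of 256 bits
noisy[flip] ^= 1
recalled = read(noisy)
print("bits recovered:", np.count_nonzero(recalled == pattern), "/", N_DIM)
```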

  16. Nuclear Parton Distribution Functions

    SciTech Connect

    I. Schienbein, J.Y. Yu, C. Keppel, J.G. Morfin, F. Olness, J.F. Owens

    2009-06-01

    We study nuclear effects of charged current deep inelastic neutrino-iron scattering in the framework of a χ² analysis of parton distribution functions (PDFs). We extract a set of iron PDFs which are used to compute x_Bj-dependent and Q²-dependent nuclear correction factors for iron structure functions which are required in global analyses of free nucleon PDFs. We compare our results with nuclear correction factors from neutrino-nucleus scattering models and correction factors for charged-lepton–iron scattering. We find that, except for very high x_Bj, our correction factors differ in both shape and magnitude from the correction factors of the models and charged-lepton scattering.

  17. Distributed feedback lasers

    NASA Technical Reports Server (NTRS)

    Ladany, I.; Andrews, J. T.; Evans, G. A.

    1988-01-01

    A ridge waveguide distributed feedback laser was developed in InGaAsP. These devices have demonstrated CW output powers over 7 mW with threshold currents as low as 60 mA at 25 C. Measurements of the frequency response of these devices show a 3 dB bandwidth of about 2 GHz, which may be limited by the mount. The best devices have a single mode spectra over the entire temperature range tested with a side mode suppression of about 20 dB in both CW and pulsed modes. The design of this device, including detailed modeling of the ridge guide structure, effective index calculations, and a discussion of the grating configuration are presented. Also, the fabrication of the devices is presented in some detail, especially the fabrication of and subsequent growth over the grating. In addition, a high frequency fiber pigtailed package was designed and tested, which is a suitable prototype for a commercial package.

  18. Hail Size Distribution Mapping

    NASA Technical Reports Server (NTRS)

    2008-01-01

    A 3-D weather radar visualization software program was developed and implemented as part of an experimental Launch Pad 39 Hail Monitor System. 3DRadPlot, a radar plotting program, is one of several software modules that form building blocks of the hail data processing and analysis system (the complete software processing system under development). The spatial and temporal mapping algorithms were originally developed through research at the University of Central Florida, funded by NASA's Tropical Rainfall Measuring Mission (TRMM), where the goal was to merge National Weather Service (NWS) Next-Generation Weather Radar (NEXRAD) volume reflectivity data with drop size distribution data acquired from a cluster of raindrop disdrometers. In this current work, we adapted these algorithms to process data from a cluster of hail disdrometers positioned around Launch Pads 39A or 39B, along with the corresponding NWS radar data. Radar data from all NWS NEXRAD sites are archived at the National Climatic Data Center (NCDC) and can be readily accessed there. 3DRadPlot plots Level III reflectivity data at four scan elevations (this software is available from Open Channel Software). By using spatial and temporal interpolation/extrapolation based on hydrometeor fall dynamics, we can merge the hail disdrometer array data coupled with local Weather Surveillance Radar-1988 Doppler (WSR-88D) radial velocity and reflectivity data into a 4-D (3-D space and time) picture of hail size distributions. Hail flux maps can then be generated and used for damage prediction and assessment over specific surfaces corresponding to structures within the disdrometer array volume. Immediately following a hail storm, specific damage areas and degree of damage can be identified for inspection crews.

  19. CMCC Data Distribution Centre

    NASA Astrophysics Data System (ADS)

    Aloisio, Giovanni; Fiore, Sandro; Negro, A.

    2010-05-01

    The CMCC Data Distribution Centre (DDC) is the primary entry point (web gateway) to the CMCC. It is a Data Grid Portal providing a ubiquitous and pervasive way to ease data publishing, climate metadata search, dataset discovery, metadata annotation, data access, data aggregation, sub-setting, etc. The grid portal security model includes the use of the HTTPS protocol for secure communication with the client (based on X509v3 certificates that must be loaded into the browser) and secure cookies to establish and maintain user sessions. The CMCC DDC is now in a pre-production phase and is currently used only by internal users (CMCC researchers and climate scientists). The most important component already available in the CMCC DDC is the Search Engine, which allows users to perform, through web interfaces, distributed search and discovery activities by introducing one or more of the following search criteria: horizontal extent (which can be specified by interacting with a geographic map), vertical extent, temporal extent, keywords, topics, creation date, etc. By means of this page the user submits the first step of the query process on the metadata DB; the user can then choose one or more datasets, retrieving and displaying the complete XML metadata description in the browser. This way, the second step of the query process is carried out by accessing a specific XML document of the metadata DB. Finally, through the web interface, the user can access and download (partially or totally) the data stored on the storage device by accessing OPeNDAP servers and other available grid storage interfaces. Requests concerning datasets stored in deep storage will be served asynchronously.

  20. The Saguaro distributed operating system

    NASA Astrophysics Data System (ADS)

    Andrews, Gregory R.; Schlichting, Richard D.

    1989-05-01

    The progress achieved over the final year of the Saguaro distributed operating system project is presented. The primary achievements were in related research, including the SR distributed programming language, the MLP system for constructing distributed mixed-language programs, the Psync interprocess communication mechanism, a configurable operating system kernel called the x-kernel, and the development of language mechanisms for performing failure handling in distributed programming languages.

  1. Distributed transit compartments for arbitrary lifespan distributions in aging populations.

    PubMed

    Koch, Gilbert; Schropp, Johannes

    2015-09-01

    Transit compartment models (TCMs) are often used to describe aging populations where every individual has its own lifespan. However, in the TCM approach these lifespans are gamma-distributed, which is a serious limitation because the Weibull or more complex distributions are often more realistic. Therefore, we extend the TCM concept to approximately describe any lifespan distribution and call this generalized concept distributed transit compartment models (DTCMs). The validity of DTCMs is established by convergence investigations. From the mechanistic perspective, the transit rates are directly controlled by the lifespan distribution. Further, DTCMs could be used to approximate the convolution of a signal with a probability density function. As an example, a stimulatory effect of a drug in an aging population with a Weibull-distributed lifespan is presented, where distribution and model parameters are estimated based on simulated data. PMID:26100181
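
    The sketch below only illustrates the limitation the paper starts from, not the DTCM construction itself: in a classical TCM with n compartments and transit rate k, the implied lifespan is gamma (Erlang) distributed with shape n and rate k, so matching the mean of a Weibull lifespan still leaves a residual shape mismatch. The parameter values are illustrative.

```python
# Sketch, not the paper's DTCM algorithm: a classical transit compartment model
# with n compartments and transit rate k implies a gamma (Erlang) lifespan with
# shape n and rate k (mean n/k). Matching the mean of a Weibull lifespan leaves
# a density mismatch, which is what a DTCM is designed to remove.
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

weibull_shape, weibull_scale = 2.5, 10.0          # illustrative Weibull lifespan
weibull = stats.weibull_min(weibull_shape, scale=weibull_scale)
mean_lifespan = weibull_scale * gamma_fn(1.0 + 1.0 / weibull_shape)

n_compartments = 6                                 # illustrative TCM size
k = n_compartments / mean_lifespan                 # transit rate matching the mean
tcm_lifespan = stats.gamma(a=n_compartments, scale=1.0 / k)

t = np.linspace(0.01, 3.0 * mean_lifespan, 400)
max_density_gap = np.max(np.abs(weibull.pdf(t) - tcm_lifespan.pdf(t)))
print(f"mean lifespan      : {mean_lifespan:.3f}")
print(f"max |Weibull - TCM|: {max_density_gap:.4f}")
```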

  2. Correction of Distributed Optical Aberrations

    SciTech Connect

    Baker, K; Olivier, S; Carrano, C; Phillion, D

    2006-02-12

    The objective of this project was to demonstrate the use of multiple distributed deformable mirrors (DMs) to improve the performance of optical systems with distributed aberrations. This concept is expected to provide dramatic improvement in the optical performance of systems in applications where the aberrations are distributed along the optical path or within the instrument itself. Our approach used multiple actuated DMs distributed to match the aberration distribution. The project developed the algorithms necessary to determine the required corrections and simulate the performance of these multiple DM systems.

  3. Distributed visualization framework architecture

    NASA Astrophysics Data System (ADS)

    Mishchenko, Oleg; Raman, Sundaresan; Crawfis, Roger

    2010-01-01

    An architecture for distributed and collaborative visualization is presented. The design goals of the system are to create a lightweight, easy to use and extensible framework for research in scientific visualization. The system provides both a single-user and a collaborative distributed environment. The system architecture employs a client-server model. Visualization projects can be synchronously accessed and modified from different client machines. We present a set of visualization use cases that illustrate the flexibility of our system. The framework provides a rich set of reusable components for creating new applications. These components make heavy use of leading design patterns. All components are based on the functionality of a small set of interfaces. This allows new components to be integrated seamlessly with little to no effort. All user input and higher-level control functionality interface with proxy objects supporting a concrete implementation of these interfaces. These light-weight objects can be easily streamed across the web and even integrated with smart clients running on a user's cell phone. The back-end is supported by concrete implementations wherever needed (for instance for rendering). A middle tier manages any communication and synchronization with the proxy objects. In addition to the data components, we have developed several first-class GUI components for visualization. These include a layer compositor editor, a programmable shader editor, a material editor and various drawable editors. These GUI components interact strictly with the interfaces. Access to the various entities in the system is provided by an AssetManager. The asset manager keeps track of all of the registered proxies and responds to queries on the overall system. This allows all user components to be populated automatically. Hence, if a new component is added that supports the IMaterial interface, any instances of it can be used in the various GUI components that work with this interface.

  4. Vacillation Made Easy: Distribution, Re-distribution, and Un-distribution of DOPL-based Processing

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott

    1993-01-01

    The Distributed Objects Protocol Layer, or DOPL, provides a simple and general data communication abstraction that can support the distribution of C++ applications software functionality among an arbitrary collection of processors. The purpose of the abstraction is to minimize the cost of revising processing distribution decisions throughout the software development cycle, including after software has been delivered to users.

  5. Intraplacental retinol distribution.

    PubMed

    Saunders, Cláudia; Leal, Maria Do Carmo; Flores, Hernando; Soares, Alexandre Gonçalves; De Lima, Ana Paula Pereira Thiapó; Leite, Paula Costa; Gomes, Mirian Martins; De Souza Júnior, Paulo Roberto Borges; Ramalho, Rejane Andréa

    2005-12-01

    With the objective of evaluating intraplacental vitamin A distribution, 234 placental samples were collected, corresponding to six samples from each of the placentas analyzed: two from the lateral maternal portion, one from the central maternal portion, two from the lateral fetal portion, and one from the central fetal portion. Samples were obtained from 39 adult puerperal mothers with low-risk pregnancies, without vitamin A deficiency or night blindness. Retinol content determination was achieved through spectrophotometry. Retinol values obtained for each region were correlated with the most probable value for each placenta (P < 0.001). Despite differences in retinol content between samples, statistical data analysis showed that intra-tissue variation had no influence on the conversion of data into information. Consequently, any portion of the placenta may be used for retinol level determination purposes, due to the correlation between all portions and the most probable value. The findings of the present study represent an advance for surveys intending to incorporate the collection and dosage of placental vitamin A levels into their analyses, thus increasing the arsenal of pre-pathological or subclinical vitamin A deficiency markers, which can allow for earlier intervention on the maternal-infant group. PMID:16638665

  6. The distribution sphere model

    SciTech Connect

    Myers, B.F.; Montgomery, F.C.; Morris, R.N.

    1993-08-01

    The equivalent sphere model, which is widely used in calculating the release of fission gases from nuclear fuel, is idealized. The model is based on the diffusion of fission products in and their escape from a homogeneous sphere of fuel; the fission products are generated at a constant rate and undergo radiodecay. The fuel is assumed to be a set of spherical particles with a common radius. The value of the radius is such that the surface-to-volume ratio, S/V, of the set of spherical particles is the same as the S/V of the fuel mass of interest. The release rate depends on the dimensionless quantity λa²/D where λ is the radiodecay constant, a, the equivalent sphere radius and D, the diffusion coefficient. In the limit λt ≫ 1, the steady-state fractional release for isotopes with half-lives less than about 5 d is given by the familiar relation R/B = 3√(D/(λa²)) (1). For the spherical particles, S/V = 3/a. However, in important cases, the assumption of a single value of a is inappropriate. Examples of configurations for which multiple values of a are appropriate include powders, hydrolyzed fuel kernels, normally configured HTR fuel particles and perhaps, fuel kernels alone. In the latter case, one can imagine a distribution of values of a whose mean yields the value appropriate for agreement of Eq. (1) with measurement.
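
    A small sketch evaluating the two relations quoted above, with purely illustrative parameter values rather than measured fuel data:

```python
# Sketch evaluating the equivalent sphere relations quoted above: the radius is
# fixed by the surface-to-volume ratio (a = 3/(S/V)) and the steady-state
# fractional release for short-lived isotopes is R/B = 3*sqrt(D/(lambda*a^2)).
# The numerical values below are illustrative, not measured fuel data.
import math

surface_to_volume = 3.0e3      # S/V in 1/m (illustrative)
diffusion_coeff   = 1.0e-13    # D in m^2/s (illustrative)
half_life_s       = 8.0 * 3600.0                 # e.g. a ~8 h isotope
decay_const       = math.log(2.0) / half_life_s  # lambda in 1/s

a = 3.0 / surface_to_volume                       # equivalent sphere radius, m
release_to_birth = 3.0 * math.sqrt(diffusion_coeff / (decay_const * a * a))

print(f"equivalent radius a = {a:.3e} m")
print(f"R/B                 = {release_to_birth:.3e}")
```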

  7. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.

  8. Fuel distribution valve

    SciTech Connect

    Halvorsen, R.M.; Hurst, J.B.

    1986-09-30

    This patent describes a fuel flow distribution valve for dividing and metering fuel flow from a fuel source to nozzles for supplying fuel to an engine comprising valve body means having an inlet and outlets, intermediate liner means forming a longitudinal valve bore in the valve body means. The intermediate liner means has a reference surface thereon, a valve slidably supported in the valve bore slidable longitudinally therein and having a close tolerance diametral fit therewith. The valve has a positioning surface engageable with the reference surface and movable to a spaced apart position therefrom, spring means for biasing the valve in a first direction with respect to the liner means to engage the positioning surface and reference surface. The valve also has a means for directing inlet pressure against the valve in opposition to the spring means, and pairs of in-line machined flow metering ports in the liner means and valve, with the ports in each pair being congruent by virtue of being machined simultaneously with a common tool in the liner means and the valve when the valve is in a fixed position in the bore with the positioning surface spaced a preselected longitudinal distance from the reference surface to define spaced pairs of congruent flow metering ports.

  9. Distribution of contaminants

    SciTech Connect

    Dana, M.T.

    1980-01-01

    Current knowledge of the distribution of atmospheric contaminants is reviewed. Emphasis is placed on regional measurements (those made in areas largely unaffected by local sources). Three specific networks were discussed. The Electric Power Research Institute sponsored Sulfate Regional Experiment (SURE) and the Multi State Atmospheric Power Production Program Study (MAP3S) are networks with event sampling and focus on atmospheric research problems and model verification while the National Atmospheric Deposition Program (NADP) serves to monitor nationwide deposition and dustfall measurements. The MAP3S network was analyzed. No statistically significant trends in concentrations of acid precipitation related pollutants were obtained in the network wide data. Strong positive correlations between the concentrations of acid precipitation related pollutants were obtained from the inland northeast US sites. Midwestern and coastal sites had more complex chemistries which require further study. Several species exhibited seasonal variations: H and SO₄ had low winter and high summer concentrations; NH₄ exhibited less variation while NO₃ appeared constant throughout the year. As a result of differing seasonal trends, the NO₃/SO₄ ratio varied from 0.3 in the summer to greater than 1 in the winter. 58 references. (MDF)

  10. Data distribution satellite

    NASA Technical Reports Server (NTRS)

    Price, Kent M.; Jorasch, Ronald E.; Wiskerchen, Michael J.

    1991-01-01

    A description is given of a data distribution satellite (DDS) system. The DDS would operate in conjunction with the tracking and data relay satellite system to give ground-based users real time, two-way access to instruments in space and space-gathered data. The scope of work includes the following: (1) user requirements are derived; (2) communication scenarios are synthesized; (3) system design constraints and projected technology availability are identified; (4) DDS communications payload configuration is derived, and the satellite is designed; (5) requirements for earth terminals and network control are given; (6) system costs are estimated, both life cycle costs and user fees; and (7) technology developments are recommended, and a technology development plan is given. The most important results obtained are as follows: (1) a satellite designed for launch in 2007 is feasible and has 10 Gb/s capacity, 5.5 kW power, and 2000 kg mass; (2) DDS features include on-board baseband switching, use of Ku- and Ka-bands, multiple optical intersatellite links; and (3) system user costs are competitive with projected terrestrial communication costs.

  11. Distributed ultrafast fibre laser

    PubMed Central

    Liu, Xueming; Cui, Yudong; Han, Dongdong; Yao, Xiankun; Sun, Zhipei

    2015-01-01

    A traditional ultrafast fibre laser has a constant cavity length that is independent of the pulse wavelength. The investigation of distributed ultrafast (DUF) lasers is conceptually and technically challenging and of great interest because the laser cavity length and fundamental cavity frequency are changeable based on the wavelength. Here, we propose and demonstrate a DUF fibre laser based on a linearly chirped fibre Bragg grating, where the total cavity length is linearly changeable as a function of the pulse wavelength. The spectral sidebands in DUF lasers are enhanced greatly, including the continuous-wave (CW) and pulse components. We observe that all sidebands of the pulse experience the same round-trip time although they have different round-trip distances and refractive indices. The pulse-shaping of the DUF laser is dominated by the dissipative processes in addition to the phase modulations, which makes our ultrafast laser simple and stable. This laser provides a simple, stable, low-cost, ultrafast-pulsed source with controllable and changeable cavity frequency. PMID:25765454

  12. Distributed Merge Trees

    SciTech Connect

    Morozov, Dmitriy; Weber, Gunther

    2013-01-08

    Improved simulations and sensors are producing datasets whose increasing complexity exhausts our ability to visualize and comprehend them directly. To cope with this problem, we can detect and extract significant features in the data and use them as the basis for subsequent analysis. Topological methods are valuable in this context because they provide robust and general feature definitions. As the growth of serial computational power has stalled, data analysis is becoming increasingly dependent on massively parallel machines. To satisfy the computational demand created by complex datasets, algorithms need to effectively utilize these computer architectures. The main strength of topological methods, their emphasis on global information, turns into an obstacle during parallelization. We present two approaches to alleviate this problem. We develop a distributed representation of the merge tree that avoids computing the global tree on a single processor and lets us parallelize subsequent queries. To account for the increasing number of cores per processor, we develop a new data structure that lets us take advantage of multiple shared-memory cores to parallelize the work on a single node. Finally, we present experiments that illustrate the strengths of our approach as well as help identify future challenges.
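
    As a point of reference, the sketch below shows the basic serial construction of a merge (join) tree with a union-find sweep over vertices in decreasing value; it is not the distributed representation or the shared-memory data structure developed in this work, only the textbook building block that such approaches parallelize.

```python
# Minimal serial sketch of a merge (join) tree via union-find: sweep vertices in
# decreasing scalar value; when a vertex connects two existing components, their
# branches merge at that value. This is the textbook construction, not the
# distributed representation or the shared-memory data structure of the paper.
def merge_tree(values, edges):
    """values: list of scalars per vertex; edges: list of (u, v) pairs."""
    parent = list(range(len(values)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    neighbors = {v: [] for v in range(len(values))}
    for u, v in edges:
        neighbors[u].append(v)
        neighbors[v].append(u)

    seen, merges = set(), []
    for v in sorted(range(len(values)), key=lambda i: values[i], reverse=True):
        seen.add(v)
        for u in neighbors[v]:
            if u in seen:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
                    merges.append((values[v], ru, rv))  # branches join at values[v]
    return merges

# Two peaks (values 5 and 4) on a path graph; each branch joins at the saddle value 1.
print(merge_tree([5, 1, 4], [(0, 1), (1, 2)]))  # [(1, 0, 1), (1, 2, 1)]
```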

  13. Distributed Deliberative Recommender Systems

    NASA Astrophysics Data System (ADS)

    Recio-García, Juan A.; Díaz-Agudo, Belén; González-Sanz, Sergio; Sanchez, Lara Quijano

    Case-Based Reasoning (CBR) is one of the most successful applied AI technologies of recent years. Although many CBR systems reason locally on a previous experience base to solve new problems, in this paper we focus on distributed retrieval processes working on a network of collaborating CBR systems. In such systems, each node in a network of CBR agents collaborates, argues and counter-argues its local results with other nodes to improve the performance of the system's global response. We describe D2ISCO: a framework to design and implement deliberative and collaborative CBR systems that is integrated as a part of jCOLIBRI 2, an established framework in the CBR community. We apply D2ISCO to one particular simplified type of CBR system: recommender systems. We perform a first case study for a collaborative music recommender system and present the results of an experiment on the accuracy of the system's results using a fuzzy version of the argumentation system AMAL and a network topology based on a social network. Besides individual recommendation we also discuss how D2ISCO can be used to improve recommendations to groups, and we present a second case study based on the movie recommendation domain with heterogeneous groups according to the group personality composition and a group topology based on a social network.

  14. LHCb distributed conditions database

    NASA Astrophysics Data System (ADS)

    Clemencic, M.

    2008-07-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which are running on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored on LFC (LCG File Catalog) and managed with the interface provided by the LCG-developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications have been using the Conditions Database framework on a production basis since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions handling functionalities) and the distribution framework itself. Stress tests on the CNAF-hosted replica of the Conditions Database have been performed and the results will be summarized here.

  15. Voltage regulation in distribution networks with distributed generation

    NASA Astrophysics Data System (ADS)

    Blažič, B.; Uljanić, B.; Papič, I.

    2012-11-01

    The paper deals with the topic of voltage regulation in distribution networks with relatively high distributed energy resources (DER) penetration. The problem of voltage rise is described and different options for voltage regulation are given. The influence of DER on voltage profile and the effectiveness of the investigated solutions are evaluated by means of simulation in DIgSILENT. The simulated network is an actual distribution network in Slovenia with a relatively high penetration of distributed generation. Recommendations for voltage control in networks with DER penetration are given at the end.

  16. Constraining the double gluon distribution by the single gluon distribution

    NASA Astrophysics Data System (ADS)

    Golec-Biernat, Krzysztof; Lewandowska, Emilia; Serino, Mirko; Snyder, Zachary; Staśto, Anna M.

    2015-11-01

    We show how to consistently construct initial conditions for the QCD evolution equations for double parton distribution functions in the pure gluon case. We use the momentum sum rule for this purpose and a specific form of the known single gluon distribution function in the MSTW parameterization. The resulting double gluon distribution satisfies the momentum sum rule exactly and is parameter free. We also study numerically its evolution with a hard scale and show the approximate factorization into a product of two single gluon distributions at small values of x, whereas at large values of x the factorization is always violated, in agreement with the sum rule.
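
    For orientation, the pure-gluon momentum sum rule being imposed has the following standard form in the double-parton-distribution literature; the paper's normalization and scale conventions may differ.

```latex
% Momentum sum rule relating the double gluon distribution D_{gg} to the
% single gluon distribution g; a standard form from the DPDF literature,
% quoted here for orientation -- the paper's conventions may differ.
\begin{equation}
  \int_0^{1-x_1} \mathrm{d}x_2 \; x_2 \, D_{gg}(x_1, x_2; \mu)
  \;=\; \left(1 - x_1\right) g(x_1; \mu)
\end{equation}
```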

  17. Data distribution satellite

    NASA Technical Reports Server (NTRS)

    Stevens, Grady H.

    1992-01-01

    The Data Distribution Satellite (DDS), operating in conjunction with the planned space network, the National Research and Education Network and its commercial derivatives, would play a key role in networking the emerging supercomputing facilities, national archives, academic, industrial, and government institutions. Centrally located over the United States in geostationary orbit, DDS would carry sophisticated on-board switching and make use of advanced antennas to provide an array of special services. Institutions needing continuous high data rate service would be networked together by use of a microwave switching matrix and electronically steered hopping beams. Simultaneously, DDS would use other beams and on-board processing to interconnect other institutions with lesser, low rate, intermittent needs. Dedicated links to White Sands and other facilities would enable direct access to space payloads and sensor data. Intersatellite links to a second generation ATDRS, called the Advanced Space Data Acquisition and Communications System (ASDACS), would eliminate one satellite hop and enhance controllability of experimental payloads by reducing path delay. Similarly, direct access would be available to the supercomputing facilities and national data archives. Economies with DDS would be derived from its ability to switch high rate facilities amongst users as needed. At the same time, having a CONUS view, DDS would interconnect with any institution regardless of how remote. Whether one needed high rate service or low rate service would be immaterial. With the capability to assign resources on demand, DDS would need to carry only a portion of the resources that would be needed if dedicated facilities were used. Efficiently switching resources to users as needed, DDS would become a very feasible spacecraft, even though it would tie together the space network, the terrestrial network, remote sites, thousands of small users, and those few who need very large data links intermittently.

  18. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    Conroy, Michael; Mazzone, Rebecca; Little, William; Elfrey, Priscilla; Mann, David; Mabie, Kevin; Cuddy, Thomas; Loundermon, Mario; Spiker, Stephen; McArthur, Frank; Srey, Tate; Bonilla, Dennis

    2010-01-01

    The Distributed Observer network (DON) is a NASA-collaborative environment that leverages game technology to bring three-dimensional simulations to conventional desktop and laptop computers in order to allow teams of engineers working on design and operations, either individually or in groups, to view and collaborate on 3D representations of data generated by authoritative tools such as Delmia Envision, Pro/Engineer, or Maya. The DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3D visual environment. DON has been designed to enhance accessibility and user ability to observe and analyze visual simulations in real time. A variety of NASA mission segment simulations [Synergistic Engineering Environment (SEE) data, NASA Enterprise Visualization Analysis (NEVA) ground processing simulations, the DSS simulation for lunar operations, and the Johnson Space Center (JSC) TRICK tool for guidance, navigation, and control analysis] were experimented with. Desired functionalities, [i.e. Tivo-like functions, the capability to communicate textually or via Voice-over-Internet Protocol (VoIP) among team members, and the ability to write and save notes to be accessed later] were targeted. The resulting DON application was slated for early 2008 release to support simulation use for the Constellation Program and its teams. Those using the DON connect through a client that runs on their PC or Mac. This enables them to observe and analyze the simulation data as their schedule allows, and to review it as frequently as desired. DON team members can move freely within the virtual world. Preset camera points can be established, enabling team members to jump to specific views. This improves opportunities for shared analysis of options, design reviews, tests, operations, training, and evaluations, and improves prospects for verification of requirements, issues, and approaches among dispersed teams.

  19. A simple model for pollen-parent fecundity distributions in bee-pollinated forage legume polycrosses

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Random mating or panmixis is a fundamental assumption in quantitative genetic theory. Random mating is sometimes thought to occur in actual fact although a large body of empirical work shows that this is often not the case in nature. Models have been developed to model many non-random mating phenome...

  20. Distribution of tsunami interevent times

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2008-01-01

    The distribution of tsunami interevent times is analyzed using global and site-specific (Hilo, Hawaii) tsunami catalogs. An empirical probability density distribution is determined by binning the observed interevent times during a period in which the observation rate is approximately constant. The empirical distributions for both catalogs exhibit non-Poissonian behavior in which there is an abundance of short interevent times compared to an exponential distribution. Two types of statistical distributions are used to model this clustering behavior: (1) long-term clustering described by a universal scaling law, and (2) Omori law decay of aftershocks and triggered sources. The empirical and theoretical distributions all imply an increased hazard rate after a tsunami, followed by a gradual decrease with time approaching a constant hazard rate. Examination of tsunami sources suggests that many of the short interevent times are caused by triggered earthquakes, though the triggered events are not necessarily on the same fault.
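
    An illustrative sketch of the general comparison described above, on synthetic data rather than the tsunami catalogs: interevent times of a Poisson process are exponential, so an excess of short intervals relative to an exponential fit signals clustering.

```python
# Illustrative sketch, with synthetic data, of the comparison described above:
# if event times were Poissonian, interevent times would be exponential; an
# excess of short interevent times relative to that fit indicates clustering.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic "clustered" catalog: background exponential interevent times mixed
# with short aftershock-like gaps (purely illustrative, not catalog data).
background = rng.exponential(scale=365.0, size=300)          # days
aftershocks = rng.exponential(scale=10.0, size=100)
intervals = np.concatenate([background, aftershocks])

rate = 1.0 / intervals.mean()
ks_stat, p_value = stats.kstest(intervals, "expon", args=(0, 1.0 / rate))
short = np.mean(intervals < 30.0)
expected_short = stats.expon(scale=1.0 / rate).cdf(30.0)
print(f"KS test vs exponential: stat={ks_stat:.3f}, p={p_value:.3g}")
print(f"fraction of intervals < 30 d: observed {short:.2f}, exponential {expected_short:.2f}")
```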

  1. Distribution system harmonic filter planning

    SciTech Connect

    Ortmeyer, T.H.; Hiyama, Takashi

    1996-10-01

    A planning methodology for distribution system harmonic filtering is proposed. The method is intended for use on radial distribution systems with no large harmonic sources. It is proposed that 60 hertz var planning be done first to allocate the var resources. Following this process, the harmonic filter planning can be readily accomplished. Characteristics of the distribution systems and the harmonic sources are exploited to provide a practical filter planning technique which is effective and efficient.

  2. Audio Distribution and Monitoring Circuit

    NASA Technical Reports Server (NTRS)

    Kirkland, J. M.

    1983-01-01

    Versatile circuit accepts and distributes TV audio signals. The three-meter audio distribution and monitoring circuit provides flexibility in monitoring, mixing, and distributing audio inputs and outputs at various signal and impedance levels. Program material is simultaneously monitored on three channels, or a single-channel version can be built to monitor transmitted or received signal levels, drive speakers, interface to building communications, and drive long-line circuits.

  3. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  4. Distribution of Clokey's Eggvetch

    SciTech Connect

    David C. Anderson

    1998-12-01

    …monophylla), Utah juniper (Juniperus osteosperma), and big sagebrush (Artemisia tridentata ssp. tridentata). Overall, the populations of Clokey's eggvetch on the NTS appear to be vigorous and do not appear threatened. It is estimated that there are approximately 2300 plants on the NTS. It should be considered as a species of concern because of its localized distribution, but it does not appear to warrant protection under the ESA.

  5. Recoverable distributed shared virtual memory

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent

    1990-01-01

    The problem of rollback recovery in distributed shared virtual memory environments, in which the shared memory is implemented in software in a loosely coupled distributed multicomputer system, is examined. A user-transparent checkpointing recovery scheme and a new twin-page disk storage management technique are presented for implementing recoverable distributed shared virtual memory. The checkpointing scheme can be integrated with the memory coherence protocol for managing the shared virtual memory. The twin-page disk design allows checkpointing to proceed in an incremental fashion without an explicit undo at the time of recovery. The recoverable distributed shared virtual memory allows the system to restart computation from a checkpoint without a global restart.
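
    A minimal sketch of the general twin-page idea the abstract refers to is given below: each page is shadowed in two disk slots so recovery reads committed data and never needs an undo pass. This is an illustration of the concept only, not the authors' storage design; the class and names are invented for the example.

```python
# Illustrative sketch of the general twin-page (shadow-page) idea, not the
# paper's actual design: each virtual page has two "disk" slots, and a per-page
# flag records which slot holds the last committed checkpoint. A new checkpoint
# writes to the other slot and then flips the flag, so recovery simply reads
# the committed slots -- no undo pass is required.
class TwinPageStore:
    def __init__(self, num_pages, page_fill=b"\x00"):
        # slots[page][0] and slots[page][1] are the two on-disk copies.
        self.slots = [[page_fill, page_fill] for _ in range(num_pages)]
        self.committed = [0] * num_pages      # which slot is the checkpoint

    def checkpoint_page(self, page, data):
        """Incrementally checkpoint one page into its shadow slot."""
        shadow = 1 - self.committed[page]
        self.slots[page][shadow] = data
        self.committed[page] = shadow         # flag flip commits the page

    def recover_page(self, page):
        """After a failure, the committed slot is the consistent state."""
        return self.slots[page][self.committed[page]]

store = TwinPageStore(num_pages=2)
store.checkpoint_page(0, b"checkpoint-1 contents")
# A crash before any flag flip for page 1 leaves its old committed slot intact.
print(store.recover_page(0))  # b'checkpoint-1 contents'
print(store.recover_page(1))  # b'\x00' (last committed state)
```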

  6. Space platform utilities distribution study

    NASA Technical Reports Server (NTRS)

    Lefever, A. E.

    1980-01-01

    Generic concepts for the installation of power, data, and thermal fluid distribution lines on large space platforms were discussed. Connections with central utility subsystem modules and pallet interfaces were also considered. Three system concept study platforms were used as basepoints for the detailed development. The tradeoff between high-voltage and low-voltage power distribution and the impact of fiber optics as a data distribution mechanism were analyzed. Thermal expansion and temperature control of utility lines and ducts were considered. Technology developments required for implementation of the generic distribution concepts were identified.

  7. A four-way distribution amplifier for reference signal distribution

    NASA Technical Reports Server (NTRS)

    Lo, Y. V.

    1981-01-01

    A four-way distribution amplifier with up to 100 dB isolation and low phase noise of -140 dBc in a 1 Hz bandwidth at a 10 Hz offset from a 100 MHz signal was developed. It is to be used in the stabilized optical fiber distribution system to provide multiple outputs.

  8. Current Perspectives in Distributive Education.

    ERIC Educational Resources Information Center

    Klaurens, Mary K., Ed.; Trapnell, Gail, Ed.

    The volume on current perspectives in distributive education contains 29 individually authored articles organized into three sections. The first section on program conceptualization deals with the following subjects: the evolution of distributive education, program planning, advisory committees, placement services, postsecondary distributive…

  9. The Future of Distributed Leadership

    ERIC Educational Resources Information Center

    Gronn, Peter

    2008-01-01

    Purpose: This paper aims to assess the empirical utility and conceptual significance of distributed leadership. Design/methodology/approach: Three main sources of evidence are drawn on. The paper reviews some neglected commentary of an early generation of distributed leadership theorists. It also discusses a strand of social science writings on…

  10. Water Treatment Technology - Distribution Systems.

    ERIC Educational Resources Information Center

    Ross-Harrington, Melinda; Kincaid, G. David

    One of twelve water treatment technology units, this student manual on distribution systems provides instructional materials for six competencies. (The twelve units are designed for a continuing education training course for public water supply operators.) The competencies focus on the following areas: types of pipe for distribution systems, types…

  11. Reduplication and Distributivity in Kannada

    ERIC Educational Resources Information Center

    Anderson, Janet Katherine

    2012-01-01

    Reduplication of numerals and pronouns in Kannada is shown to be subject to locality conditions similar to those constraining binding. This dissertation explores an account of distributivity which exploits the similarity to binding, arguing that the source of the distributive reading in Numeral Reduplication is a bound element. [The dissertation…

  12. Leadership in Partially Distributed Teams

    ERIC Educational Resources Information Center

    Plotnick, Linda

    2009-01-01

    Inter-organizational collaboration is becoming more common. When organizations collaborate they often do so in partially distributed teams (PDTs). A PDT is a hybrid team that has at least one collocated subteam and at least two subteams that are geographically distributed and communicate primarily through electronic media. While PDTs share many…

  13. Workload Distribution among Agriculture Teachers

    ERIC Educational Resources Information Center

    Torres, Robert M.; Ulmer, Jonathan D.; Aschenbrener, Mollie S.

    2008-01-01

    Teachers distribute their time in many ways. The study sought to determine how agriculture teachers distribute their time among 11 selected teacher activities (i.e., preparation for instruction; classroom/laboratory teaching; laboratory preparation and/or maintenance; grading/scoring students' work; administrative duties-program management;…

  14. Distributed Leadership: Friend or Foe?

    ERIC Educational Resources Information Center

    Harris, Alma

    2013-01-01

    Distributed leadership is now widely known and variously enacted in schools and school systems. Distributed leadership implies a fundamental re-conceptualisation of leadership as practice and challenges conventional wisdom about the relationship between formal leadership and organisational performance. There has been much debate, speculation and…

  15. COS NUV MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2013-10-01

    The performance of the MAMA microchannel plate can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as Cycle 20 proposal 13128.

  16. Quality monitored distributed voting system

    DOEpatents

    Skogmo, D.

    1997-03-18

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system. 6 figs.

  17. Quality monitored distributed voting system

    DOEpatents

    Skogmo, David

    1997-01-01

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.

  18. Quasistationary distributions for autocatalytic reactions

    SciTech Connect

    Parsons, R.W.; Pollett, P.K.

    1987-01-01

    The authors provide simple conditions for the existence of quasistationary distributions that can be used to describe the long-term behavior of open autocatalytic reaction systems. They illustrate with reference to a particular example that the quasistationary distribution is close to the usual stationary diffusion approximation.

  19. 2014 Distributed Wind Market Report

    SciTech Connect

    Orell, A.; Foster, N.

    2015-08-01

    According to the 2014 Distributed Wind Market Report, distributed wind reached a cumulative capacity of almost 1 GW (906 MW) in the United States in 2014, reflecting nearly 74,000 wind turbines deployed across all 50 states, Puerto Rico, and the U.S. Virgin Islands. In total, 63.6 MW of new distributed wind capacity was added in 2014, representing nearly 1,700 units and $170 million in investment across 24 states. In 2014, America's distributed wind energy industry supported a growing domestic industrial base as exports from United States-based small wind turbine manufacturers accounted for nearly 80% of United States-based manufacturers' sales.

  20. Power Law Distribution in Education

    NASA Astrophysics Data System (ADS)

    Gupta, Hari M.; Campanha, José R.; Chavarette, Fábio R.

    We studied the statistical distribution of students' performance, as measured through their marks, in the university entrance examination (Vestibular) of UNESP (Universidade Estadual Paulista) with respect to (i) period of study - day versus night period, (ii) teaching conditions - private versus public school, and (iii) economic conditions - high versus low family income. We observed long ubiquitous power law tails in physical and biological sciences in all cases. The mean value increases with better study conditions, followed by better teaching and economic conditions. In humanities, the distribution is close to a normal distribution with a very small tail. This indicates that these power law tails in science subjects are due to the nature of the subjects themselves. Furthermore, better study, teaching and economic conditions are more important for physical and biological sciences than for humanities at this level of study. We explain these statistical distributions through Gradually Truncated Power Law distributions. We discuss the possible reason for this peculiar behavior.

  1. Size distribution of ring polymers

    NASA Astrophysics Data System (ADS)

    Medalion, Shlomi; Aghion, Erez; Meirovitch, Hagai; Barkai, Eli; Kessler, David A.

    2016-06-01

    We present an exact solution for the distribution of sample averaged monomer to monomer distance of ring polymers. For non-interacting and local-interaction models these distributions correspond to the distribution of the area under the reflected Bessel bridge and the Bessel excursion, respectively, and are shown to be identical in dimension d ≥ 2, albeit with pronounced finite size effects at the critical dimension, d = 2. A symmetry of the problem reveals that dimension d and 4 - d are equivalent, thus the celebrated Airy distribution describing the areal distribution of the d = 1 Brownian excursion describes also a polymer in three dimensions. For a self-avoiding polymer in dimension d we find numerically that the fluctuations of the scaled averaged distance are nearly identical in dimension d = 2, 3 and are well described to a first approximation by the non-interacting excursion model in dimension 5.

  2. Distribution System Voltage Regulation by Distributed Energy Resources

    SciTech Connect

    Ceylan, Oguzhan; Liu, Guodong; Xu, Yan; Tomsovic, Kevin

    2014-01-01

    This paper proposes a control method to regulate voltages in three-phase unbalanced electrical distribution systems. A constrained optimization problem to minimize voltage deviations and maximize distributed energy resource (DER) active power output is solved by the harmony search algorithm. The IEEE 13-bus distribution test system was modified to test three different cases: (a) a system with only voltage regulator control, (b) a system with only DER control, and (c) a system with both voltage regulator and DER control. The simulation results show that systems with both voltage regulators and DER control provide a better voltage profile.
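
    A minimal harmony search sketch is shown below on a toy quadratic objective standing in for the voltage-deviation cost; the decision variables, bounds, and parameter values are illustrative and do not reproduce the IEEE 13-bus study.

```python
# Minimal harmony search sketch on a toy objective standing in for the voltage
# regulation problem (the real study optimizes DER set-points on the IEEE 13-bus
# feeder; the quadratic objective and bounds here are illustrative only).
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    """Toy 'voltage deviation' cost: distance of controls from a target profile."""
    target = np.array([0.4, 0.6, 0.5])
    return float(np.sum((x - target) ** 2))

def harmony_search(obj, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    memory = rng.uniform(lo, hi, size=(hms, dim))          # harmony memory
    costs = np.array([obj(h) for h in memory])
    for _ in range(iters):
        new = np.empty(dim)
        for d in range(dim):
            if rng.random() < hmcr:                        # draw from memory
                new[d] = memory[rng.integers(hms), d]
                if rng.random() < par:                     # pitch adjustment
                    new[d] += bw * rng.uniform(-1.0, 1.0)
            else:                                          # random consideration
                new[d] = rng.uniform(lo[d], hi[d])
        new = np.clip(new, lo, hi)
        cost = obj(new)
        worst = int(np.argmax(costs))
        if cost < costs[worst]:                            # replace worst harmony
            memory[worst], costs[worst] = new, cost
    best = int(np.argmin(costs))
    return memory[best], costs[best]

best_x, best_cost = harmony_search(objective, bounds=[(0.0, 1.0)] * 3)
print("best controls:", np.round(best_x, 3), "cost:", round(best_cost, 6))
```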

  3. Genomic patterns associated with paternal/maternal distribution of transposable elements

    NASA Astrophysics Data System (ADS)

    Jurka, Jerzy

    2003-03-01

    Transposable elements (TEs) are specialized DNA or RNA fragments capable of surviving in intragenomic niches. They are commonly, perhaps unjustifiably, referred to as "selfish" or "parasitic" elements. TEs can be divided into two major classes: retroelements and DNA transposons. The former include non-LTR retrotransposons and retrovirus-like elements, using reverse transcriptase for their reproduction prior to integration into host DNA. The latter depend mostly on host DNA replication, with the possible exception of rolling-circle transposons recently discovered by our team. I will review basic information on TEs, with emphasis on human Alu and L1 retroelements discussed in the context of genomic organization. TEs are non-randomly distributed in chromosomal DNA. In particular, human Alu elements tend to prefer GC-rich regions, whereas L1 elements accumulate in AT-rich regions. Current explanations of this phenomenon focus on the so-called "target effects" and post-insertional selection. However, the proposed models appear to be unsatisfactory and alternative explanations invoking "channeling" to different chromosomal regions will be a major focus of my presentation. Transposable elements (TEs) can be expressed and integrated into host DNA in the male or female germlines, or both. Different models of expression and integration imply different proportions of TEs on sex chromosomes and autosomes. The density of recently retroposed human Alu elements is around three times higher on chromosome Y than on chromosome X, and over two times higher than the average density for all human autosomes. This implies Alu activity in paternal germlines. Analogous inter-chromosomal proportions for other repeat families should determine their compatibility with one of the three basic models describing the inheritance of TEs. Published evidence indicates that maternally and paternally imprinted genes roughly correspond to GC-rich and AT-rich DNA. This may explain the observed chromosomal distribution of

  4. Marketing and Distribution: Developing Career Interests in Distributive Education

    ERIC Educational Resources Information Center

    Searle, A. Gary

    1978-01-01

    The author discusses a variety of commercial interest inventories which may be used by the distributive education teacher-coordinator to guide students in exploring careers in the marketing cluster. (MF)

  5. Distribution and geological control of mud volcanoes and other fluid/free gas seepage features in the Mediterranean Sea and nearby Gulf of Cadiz

    NASA Astrophysics Data System (ADS)

    Mascle, Jean; Mary, Flore; Praeg, Daniel; Brosolo, Laetitia; Camera, Laurent; Ceramicola, Silvia; Dupré, Stéphanie

    2014-06-01

    Existing knowledge on the distribution of mud volcanoes (MVs) and other significant fluid/free gas-venting features (mud cones, mud pies, mud-brine pools, mud carbonate cones, gas chimneys and, in some cases, pockmark fields) discovered on the seafloor of the Mediterranean Sea and in the nearby Gulf of Cadiz has been compiled using regional geophysical information (including multibeam coverage of most deepwater areas). The resulting dataset comprises both features proven from geological sampling, or in situ observations, and many previously unrecognized MVs inferred from geophysical evidence. The synthesis reveals that MVs clearly have non-random distributions that correspond to two main geodynamic settings: (1) the vast majority occur along the various tectono-sedimentary accretionary wedges of the Africa-Eurasia subduction zone, particularly in the central and eastern Mediterranean basins (external Calabrian Arc, Mediterranean Ridge, Florence Rise) but also along its westernmost boundary in the Gulf of Cadiz; (2) other MVs characterize thick depocentres along parts of the Mesozoic passive continental margins that border Africa from eastern Tunisia to the Levantine coasts, particularly off Egypt and, locally, within some areas of the western Mediterranean back-arc basins. Meaningfully accounting for MV distribution necessitates evidence of overpressured fluids and mud-rich layers. In addition, cross-correlations between MVs and other GIS-based data, such as maps of the Messinian evaporite basins and/or active (or recently active) tectonic trends, stress the importance of assessing geological control in terms of the presence, or not, of thick seals and potential conduits. It is contended that new MV discoveries may be expected in the study region, particularly along the southern Ionian Sea continental margins.

  6. Lunar soil grain size distribution

    NASA Technical Reports Server (NTRS)

    Carrier, W. D., III

    1973-01-01

    A comprehensive review has been made of the currently available data for lunar grain size distributions. It has been concluded that there is little or no statistical difference among the large majority of the soil samples from the Apollo 11, 12, 14, and 15 missions. The grain size distribution for these soils has reached a steady state in which the comminution processes are balanced by the aggregation processes. The median particle size for the steady-state soil is 40 to 130 microns. The predictions of lunar grain size distributions based on the Surveyor television photographs have been found to be quantitatively in error and qualitatively misleading.

  7. Packing fraction of continuous distributions

    NASA Astrophysics Data System (ADS)

    Brouwers, Jos

    2014-03-01

    This study addresses the packing and void fraction of polydisperse particles with geometric and lognormal size distributions. It is demonstrated that a bimodal discrete particle distribution can be transformed into these continuous particle-size distributions. Furthermore, original and exact expressions are presented that predict the packing fraction of these particle assemblies. For a number of particle shapes and their packing modes (close, loose) the applicable parameters are given. The closed-form analytical expressions governing the packing fraction are thoroughly compared with empirical and computational data reported in the literature, and good agreement is found.

  8. Valence quark spin distribution functions

    SciTech Connect

    Nathan Isgur

    1998-09-01

    The hyperfine interactions of the constituent quark model provide a natural explanation for many nucleon properties, including the Δ-N splitting, the charge radius of the neutron, and the observation that the proton's quark distribution function ratio d(x)/u(x) → 0 as x → 1. The hyperfine-perturbed quark model also makes predictions for the nucleon spin-dependent distribution functions. Precision measurements of the resulting asymmetries A_1^p(x) and A_1^n(x) in the valence region can test this model and thereby the hypothesis that the valence quark spin distributions are "normal".

  9. Exploiting replication in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, T. A.

    1989-01-01

    Techniques are examined for replicating data and execution in directly distributed systems: systems in which multiple processes interact directly with one another while continuously respecting constraints on their joint behavior. Directly distributed systems are often required to solve difficult problems, ranging from management of replicated data to dynamic reconfiguration in response to failures. It is shown that these problems reduce to more primitive, order-based consistency problems, which can be solved using primitives such as the reliable broadcast protocols. Moreover, given a system that implements reliable broadcast primitives, a flexible set of high-level tools can be provided for building a wide variety of directly distributed application programs.

  10. A prototype Distributed Audit System

    SciTech Connect

    Banning, D.L.

    1993-08-01

    Security auditing systems are used to detect and assess unauthorized or abusive system usage. Historically, security audits were confined to a single computer system. Recent work examines ways of extending auditing to include heterogeneous groups of computers (distributed systems). This paper describes the design and prototype development of a Distributed Audit System (DAS) which was developed with funding received from Lawrence Livermore Laboratory and through the Master's thesis effort performed by the author at California State University, Long Beach. The DAS is intended to provide collection, transfer, and control of audit data on distributed, heterogeneous hosts.

  11. The alignment-distribution graph

    NASA Technical Reports Server (NTRS)

    Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert

    1993-01-01

    Implementing a data-parallel language such as Fortran 90 on a distributed-memory parallel computer requires distributing aggregate data objects (such as arrays) among the memory modules attached to the processors. The mapping of objects to the machine determines the amount of residual communication needed to bring operands of parallel operations into alignment with each other. We present a program representation called the alignment-distribution graph that makes these communication requirements explicit. We describe the details of the representation, show how to model communication cost in this framework, and outline several algorithms for determining object mappings that approximately minimize residual communication.

  13. Wealth distribution on complex networks

    NASA Astrophysics Data System (ADS)

    Ichinomiya, Takashi

    2012-12-01

    We study the wealth distribution of the Bouchaud-Mézard model on complex networks. It is known from numerical simulations that this distribution depends on the topology of the network; however, no one has succeeded in explaining it. Using “adiabatic” and “independent” assumptions along with the central-limit theorem, we derive equations that determine the probability distribution function. The results are compared to those of simulations for various networks. We find good agreement between our theory and the simulations, except for the case of Watts-Strogatz networks with a low rewiring rate, where the “independent” assumption breaks down.
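
    To make the model above concrete, the following is a minimal Euler-Maruyama sketch of Bouchaud-Mézard wealth dynamics on a network, built here on a Watts-Strogatz graph via networkx; the exchange rate J, noise amplitude sigma, time step and graph parameters are illustrative assumptions, not values taken from the paper.

      import numpy as np
      import networkx as nx

      def bouchaud_mezard(G, J=0.05, sigma=0.3, dt=0.01, steps=20000, seed=0):
          """Euler-Maruyama sketch of dW_i = sigma*W_i*dB_i + J*sum_j A_ij*(W_j - W_i)*dt."""
          rng = np.random.default_rng(seed)
          A = nx.to_numpy_array(G)
          degree = A.sum(axis=1)
          W = np.ones(A.shape[0])
          for _ in range(steps):
              noise = sigma * np.sqrt(dt) * rng.standard_normal(len(W))
              exchange = J * dt * (A @ W - degree * W)
              W = np.maximum(W + W * noise + exchange, 1e-12)  # keep wealth positive
          return W / W.mean()  # normalized wealth; its spread measures inequality

      # Illustrative network: vary the rewiring probability p to probe topology dependence.
      G = nx.watts_strogatz_graph(200, k=4, p=0.1, seed=1)
      w = bouchaud_mezard(G)
      print("normalized wealth: min %.3f, max %.3f" % (w.min(), w.max()))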

  14. Dose distributions in regions containing beta sources: Irregularly shaped source distributions in homogeneous media

    SciTech Connect

    Werner, B.L.

    1991-11-01

    Methods are introduced by which dose rate distributions due to nonuniform, irregularly shaped distributions of beta emitters can be calculated using dose rate distributions for uniform, spherical source distributions. The dose rate distributions can be written in the MIRD formalism.

  15. The Raid distributed database system

    NASA Technical Reports Server (NTRS)

    Bhargava, Bharat; Riedl, John

    1989-01-01

    Raid, a robust and adaptable distributed database system for transaction processing (TP), is described. Raid is a message-passing system, with server processes on each site to manage concurrent processing, consistent replicated copies during site failures, and atomic distributed commitment. A high-level layered communications package provides a clean location-independent interface between servers. The latest design of the package delivers messages via shared memory in a configuration with several servers linked into a single process. Raid provides the infrastructure to investigate various methods for supporting reliable distributed TP. Measurements on TP and server CPU time are presented, along with data from experiments on communications software, consistent replicated copy control during site failures, and concurrent distributed checkpointing. A software tool for evaluating the implementation of TP algorithms in an operating-system kernel is proposed.

  16. BESIII production with distributed computing

    NASA Astrophysics Data System (ADS)

    Zhang, X. M.; Yan, T.; Zhao, X. H.; Ma, Z. T.; Yan, X. F.; Lin, T.; Deng, Z. Y.; Li, W. D.; Belov, S.; Pelevanyuk, I.; Zhemchugov, A.; Cai, H.

    2015-12-01

    Distributed computing is necessary nowadays for high energy physics experiments to organize heterogeneous computing resources all over the world to process enormous amounts of data. The BESIII experiment in China has established its own distributed computing system, based on DIRAC, as a supplement to local clusters, collecting cluster, grid, desktop and cloud resources from collaborating member institutes around the world. The system consists of workload management and data management to deal with the BESIII Monte Carlo production workflow in a distributed environment. A dataset-based data transfer system has been developed to support data movements among sites. File and metadata management tools and a job submission frontend have been developed to provide a virtual layer for BESIII physicists to use distributed resources. Moreover, the paper describes the experience of coping with limited grid expertise and manpower within the BESIII community.

  17. Fact Program - distributed exhaust nozzle

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Futuristic Airframe Concepts & Technology (FACT): Distributed exhaust nozzle mounted in the Low Speed Aeroacoustic Wind Tunnel. Angle is zero degrees with respect to microphones. Photographed in the Low Speed Aeroacoustic Wind Tunnel, Jet Noise Lab, building 1221-A.

  18. Multiple complementary gas distribution assemblies

    DOEpatents

    Ng, Tuoh-Bin; Melnik, Yuriy; Pang, Lily L; Tuncel, Eda; Nguyen, Son T; Chen, Lu

    2016-04-05

    In one embodiment, an apparatus includes a first gas distribution assembly that includes a first gas passage for introducing a first process gas into a second gas passage that introduces the first process gas into a processing chamber and a second gas distribution assembly that includes a third gas passage for introducing a second process gas into a fourth gas passage that introduces the second process gas into the processing chamber. The first and second gas distribution assemblies are each adapted to be coupled to at least one chamber wall of the processing chamber. The first gas passage is shaped as a first ring positioned within the processing chamber above the second gas passage that is shaped as a second ring positioned within the processing chamber. The gas distribution assemblies may be designed to have complementary characteristic radial film growth rate profiles.

  19. Distribution and flux of micrometeoroids

    NASA Technical Reports Server (NTRS)

    Morrison, D. A.; Zinner, E.

    1977-01-01

    The mass distribution, flux, and distribution in space of the micrometeoroid complex at 1 AU are estimated on the basis of data from Apollo 17 rocks and recent calibrations of solar-flare track-production rates. It is found that the size frequency distribution of microcraters on lunar rocks suggests a bimodal mass distribution of micrometeoroids, but the precise form of the curve requires further definition, particularly insofar as the degree of depletion of particles producing craters 10 to 100 microns in diameter is concerned. Variations in slope with crater-diameter or particle-mass increments are shown to indicate that different processes affect one or more particle populations. Fluxes corresponding to varied lunar surface orientation and residence time are calculated, but no striking difference is observed between the flux of submicron-diameter particles with orbits in the plane of the ecliptic and fluxes of particles with orbits normal to the plane in the solar apex direction.

  20. 2013 Distributed Wind Market Report

    SciTech Connect

    Orrell, Alice C.; Rhoads-Weaver, H. E.; Flowers, Larry T.; Gagne, Matthew N.; Pro, Boyd H.; Foster, Nikolas AF

    2014-08-20

    The purpose of this report is to quantify and summarize the 2013 U.S. distributed wind market to help plan and guide future investments and decisions by industry stakeholders, utilities, state and federal agencies, and other interested parties.

  1. The Binomial Distribution in Shooting

    ERIC Educational Resources Information Center

    Chalikias, Miltiadis S.

    2009-01-01

    The binomial distribution is used to predict the winner of the 49th International Shooting Sport Federation World Championship in double trap shooting held in 2006 in Zagreb, Croatia. The outcome of the competition was definitely unexpected.
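
    For readers unfamiliar with this kind of calculation, the sketch below estimates the probability that one shooter outscores another when each shooter's hits are modeled as a binomial count; the hit probabilities and the number of targets are hypothetical and are not the championship data.

      from scipy.stats import binom

      def win_probability(p_a, p_b, n_targets):
          """P(shooter A hits strictly more of n_targets than shooter B),
          treating the two hit counts as independent binomial variables."""
          pmf_a = [binom.pmf(k, n_targets, p_a) for k in range(n_targets + 1)]
          cdf_b = [binom.cdf(k - 1, n_targets, p_b) for k in range(n_targets + 1)]
          return sum(pa * cb for pa, cb in zip(pmf_a, cdf_b))

      # Hypothetical hit probabilities over a 150-target double trap match.
      print(round(win_probability(0.92, 0.90, 150), 3))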

  2. Performance of distributed multiscale simulations

    PubMed Central

    Borgdorff, J.; Ben Belgacem, M.; Bona-Casas, C.; Fazendeiro, L.; Groen, D.; Hoenen, O.; Mizeranschi, A.; Suter, J. L.; Coster, D.; Coveney, P. V.; Dubitzky, W.; Hoekstra, A. G.; Strand, P.; Chopard, B.

    2014-01-01

    Multiscale simulations model phenomena across natural scales using monolithic or component-based code, running on local or distributed resources. In this work, we investigate the performance of distributed multiscale computing of component-based models, guided by six multiscale applications with different characteristics and from several disciplines. Three modes of distributed multiscale computing are identified: supplementing local dependencies with large-scale resources, load distribution over multiple resources, and load balancing of small- and large-scale resources. We find that the first mode has the apparent benefit of increasing simulation speed, and the second mode can increase simulation speed if local resources are limited. Depending on resource reservation and model coupling topology, the third mode may result in a reduction of resource consumption. PMID:24982258

  3. Visualizing Spatially Varying Distribution Data

    NASA Technical Reports Server (NTRS)

    Kao, David; Luo, Alison; Dungan, Jennifer L.; Pang, Alex; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    The box plot is a compact representation that encodes the minimum, maximum, mean, median, and quartile information of a distribution. In practice, a single box plot is drawn for each variable of interest. With the advent of more accessible computing power, we are now facing the problem of visualizing data where there is a distribution at each 2D spatial location. Simply extending the box plot technique to distributions over a 2D domain is not straightforward. One challenge is reducing the visual clutter if a box plot is drawn over each grid location in the 2D domain. This paper presents and discusses two general approaches, using parametric statistics and shape descriptors, to present 2D distribution data sets. Both approaches provide additional insights compared to the traditional box plot technique.
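
    A minimal sketch of the parametric-statistics route mentioned above: the per-cell distribution is summarized by its mean and standard deviation, and the two summary fields are rendered side by side instead of drawing a box plot at every grid location; the ensemble data here are synthetic.

      import numpy as np
      import matplotlib.pyplot as plt

      # Synthetic ensemble: 50 realizations on a 40x40 grid, i.e. a distribution
      # of values at every 2D spatial location.
      rng = np.random.default_rng(0)
      nens, ny, nx = 50, 40, 40
      base = np.fromfunction(lambda y, x: np.sin(x / 6.0) + np.cos(y / 8.0), (ny, nx))
      spread = 0.1 + 0.2 * np.abs(base)
      ensemble = base + rng.normal(scale=spread, size=(nens, ny, nx))

      # Parametric summary: per-cell mean and standard deviation.
      fields = {"mean": ensemble.mean(axis=0), "std dev": ensemble.std(axis=0)}

      fig, axes = plt.subplots(1, 2, figsize=(8, 4))
      for ax, (title, field) in zip(axes, fields.items()):
          im = ax.imshow(field, origin="lower")
          ax.set_title(title)
          fig.colorbar(im, ax=ax)
      plt.show()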

  4. Spatial Distribution of Dominant Arboreal Ants in a Malagasy Coastal Rainforest: Gaps and Presence of an Invasive Species

    PubMed Central

    Dejean, Alain; Fisher, Brian L.; Corbara, Bruno; Rarevohitra, Raymond; Randrianaivo, Richard; Rajemison, Balsama; Leponce, Maurice

    2010-01-01

    We conducted a survey along three belt transects located at increasing distances from the coast to determine whether a non-random arboreal ant assemblage, such as an ant mosaic, exists in the rainforest on the Masoala Peninsula, Madagascar. In most tropical rainforests, very populous colonies of territorially dominant arboreal ant species defend absolute territories distributed in a mosaic pattern. Among the 29 ant species recorded, only nine had colonies large enough to be considered potentially territorially dominant; the remaining species had smaller colonies and were considered non-dominant. Nevertheless, the null-model analyses used to examine the spatial structure of their assemblages did not reveal the existence of an ant mosaic. Inland, up to 44% of the trees were devoid of dominant arboreal ants, something not reported in other studies. While two Crematogaster species were not associated with one another, Brachymyrmex cordemoyi was positively associated with Technomyrmex albipes, which is considered an invasive species—a non-indigenous species that has an adverse ecological effect on the habitats it invades. The latter two species and Crematogaster ranavalonae were mutually exclusive. On the other hand, all of the trees in the coastal transect and at least 4 km of coast were occupied by T. albipes, and were interconnected by columns of workers. Technomyrmex albipes workers collected from different trees did not attack each other during confrontation tests, indicating that this species has formed a supercolony along the coast. Yet interspecific aggressiveness did occur between T. albipes and Crematogaster ranavalonae, a native species which is likely territorially dominant based on our intraspecific confrontation tests. These results suggest that the Masoala rainforest is threatened by a potential invasion by T. albipes, and that the penetration of this species further inland might be facilitated by the low density of native, territorially dominant

  5. Where to nest? Ecological determinants of chimpanzee nest abundance and distribution at the habitat and tree species scale.

    PubMed

    Carvalho, Joana S; Meyer, Christoph F J; Vicente, Luis; Marques, Tiago A

    2015-02-01

    Conversion of forests to anthropogenic land-uses increasingly subjects chimpanzee populations to habitat changes and concomitant alterations in the plant resources available to them for nesting and feeding. Based on nest count surveys conducted during the dry season, we investigated nest tree species selection and the effect of vegetation attributes on nest abundance of the western chimpanzee, Pan troglodytes verus, at Lagoas de Cufada Natural Park (LCNP), Guinea-Bissau, a forest-savannah mosaic widely disturbed by humans. Further, we assessed patterns of nest height distribution to determine support for the anti-predator hypothesis. A zero-altered generalized linear mixed model showed that nest abundance was negatively related to floristic diversity (exponential form of the Shannon index) and positively related to the availability of smaller-sized trees, reflecting characteristics of dense-canopy forest. A positive correlation between nest abundance and floristic richness (number of plant species) and composition indicated that species-rich open habitats are also important in nest site selection. Restricting this analysis to feeding trees, nest abundance was again positively associated with the availability of smaller-sized trees, further supporting the preference for nesting in food tree species from dense forest. Nest tree species selection was non-random, and oil palms were used at a much lower proportion (10%) than previously reported from other study sites in forest-savannah mosaics. While this study suggests that human disturbance may underlie the exclusive arboreal nesting at LCNP, better quantitative data are needed to determine to what extent the construction of elevated nests is in fact a response to predators able to climb trees. Given the importance of LCNP as a refuge for Pan t. verus, our findings can improve conservation decisions for the management of this important umbrella species as well as its remaining suitable habitats. PMID:25224379

  6. Spatial distribution of dominant arboreal ants in a Malagasy coastal rainforest: gaps and presence of an invasive species.

    PubMed

    Dejean, Alain; Fisher, Brian L; Corbara, Bruno; Rarevohitra, Raymond; Randrianaivo, Richard; Rajemison, Balsama; Leponce, Maurice

    2010-01-01

    We conducted a survey along three belt transects located at increasing distances from the coast to determine whether a non-random arboreal ant assemblage, such as an ant mosaic, exists in the rainforest on the Masoala Peninsula, Madagascar. In most tropical rainforests, very populous colonies of territorially dominant arboreal ant species defend absolute territories distributed in a mosaic pattern. Among the 29 ant species recorded, only nine had colonies large enough to be considered potentially territorially dominant; the remaining species had smaller colonies and were considered non-dominant. Nevertheless, the null-model analyses used to examine the spatial structure of their assemblages did not reveal the existence of an ant mosaic. Inland, up to 44% of the trees were devoid of dominant arboreal ants, something not reported in other studies. While two Crematogaster species were not associated with one another, Brachymyrmex cordemoyi was positively associated with Technomyrmex albipes, which is considered an invasive species-a non-indigenous species that has an adverse ecological effect on the habitats it invades. The latter two species and Crematogaster ranavalonae were mutually exclusive. On the other hand, all of the trees in the coastal transect and at least 4 km of coast were occupied by T. albipes, and were interconnected by columns of workers. Technomyrmex albipes workers collected from different trees did not attack each other during confrontation tests, indicating that this species has formed a supercolony along the coast. Yet interspecific aggressiveness did occur between T. albipes and Crematogaster ranavalonae, a native species which is likely territorially dominant based on our intraspecific confrontation tests. These results suggest that the Masoala rainforest is threatened by a potential invasion by T. albipes, and that the penetration of this species further inland might be facilitated by the low density of native, territorially dominant arboreal

  7. Distributed processing for speech understanding

    SciTech Connect

    Bronson, E.C.; Siegel, L.

    1983-01-01

    Continuous speech understanding is a highly complex artificial intelligence task requiring extensive computation. This complexity precludes real-time speech understanding on a conventional serial computer. Distributed processing techniques can be applied to the speech understanding task to improve processing speed. In the paper, the speech understanding task and several speech understanding systems are described. Parallel processing techniques are presented and a distributed processing architecture for speech understanding is outlined. 35 references.

  8. UNIX code management and distribution

    SciTech Connect

    Hung, T.; Kunz, P.F.

    1992-09-01

    We describe a code management and distribution system based on tools freely available for UNIX systems. At the master site, version control is managed with CVS, which is a layer on top of RCS, and distribution is done via NFS-mounted file systems. At remote sites, small modifications to CVS provide for interactive transactions with the CVS system at the master site such that remote developers are true peers in the code development process.

  9. Structure functions and parton distributions

    SciTech Connect

    Martin, A.D.; Stirling, W.J.; Roberts, R.G.

    1995-07-01

    The MRS parton distribution analysis is described. The latest sets are shown to give an excellent description of a wide range of deep-inelastic and other hard scattering data. Two important theoretical issues, the behavior of the distributions at small x and the flavor structure of the quark sea, are discussed in detail. A comparison with the new structure function data from HERA is made, and the outlook for the future is discussed.

  10. Generalized parton distributions in nuclei

    SciTech Connect

    Vadim Guzey

    2009-12-01

    Generalized parton distributions (GPDs) of nuclei describe the distribution of quarks and gluons in nuclei probed in hard exclusive reactions, such as deeply virtual Compton scattering (DVCS). Nuclear GPDs and nuclear DVCS allow us to study new aspects of many traditional nuclear effects (nuclear shadowing, EMC effect, medium modifications of the bound nucleons) as well as to access novel nuclear effects. In my talk, I review recent theoretical progress in the area of nuclear GPDs.

  11. COS NUV MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2010-09-01

    The performance of the MAMA microchannel plate can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the COS MAMA Fold Analysis {11891} during Cycle 17.

  12. COS NUV MAMA Fold Distribution

    NASA Astrophysics Data System (ADS)

    Wheeler, Thomas

    2012-10-01

    The performance of the MAMA microchannel plate can be monitored using a MAMA fold analysis procedure. The fold analysis provides a measurement of the distribution of charge cloud sizes incident upon the anode giving some measure of changes in the pulse-height distribution of the MCP and, therefore, MCP gain. This proposal executes the same steps as the COS MAMA Fold Analysis {12723} during Cycle 19.

  13. Modeling Nucleon Generalized Parton Distributions

    SciTech Connect

    Radyushkin, Anatoly V.

    2013-05-01

    We discuss building models for nucleon generalized parton distributions (GPDs) H and E that are based on the formalism of double distributions (DDs). We find that the usual "DD + D-term" construction should be amended by an extra term, generated by the GPD E(x, ξ). Unlike the D-term, this function has support in the whole -1 < x < 1 region, and in general does not vanish at the border points |x| = ξ.

  14. A fast, easy circumcision procedure combining a CO2 laser and cyanoacrylate adhesive: a non-randomized comparative trial

    PubMed Central

    Gorgulu, Tahsin; Olgun, Abdulkerim; Torun, Merve; Kargi, Eksal

    2016-01-01

    Background: Circumcision is performed as a routine operation in many countries, more commonly for religious and cultural reasons than for indicated conditions such as phimosis and balanitis. There are many techniques available, and recently electrocautery and both Nd:YAG and CO2 lasers, instead of blades, have been used for skin and mucosal incisions. However, the infection risk in circumcisions performed using a CO2 laser was 10% higher. There are also reports of sutureless procedures using cyanoacrylate, but these have higher risks of hematoma and hemorrhage. We combined a CO2 laser and cyanoacrylate to shorten the operation time and to decrease bleeding complications. Materials and Methods: Circumcisions were performed under general anesthesia with a combined CO2 laser and cyanoacrylate technique in 75 boys aged 6–9 years, between May 2013 and August 2014, solely for religious reasons. As a control, we compared them retrospectively with 75 age-matched patients who were circumcised using the conventional guillotine method in our clinic. Results: No hematomas, bleeding, or wound infections were observed. One wound dehiscence (1.33%) occurred during the early postoperative period and healed without any additional procedures. The median operating time was 7 (range 6–9) minutes. The conventional guillotine group had one hematoma (1.3%), two wound dehiscences (2.6%), and two hemorrhages (2.6%), and the median operating time was 22 (range 20–26) minutes. The difference in surgical time was significant (p<0.001), with no significant difference in the rate of complications between the two groups. Conclusion: The combined CO2 laser and cyanoacrylate procedure not only decreased the operating time markedly, but also eliminated the disadvantages associated with each individual procedure alone. PMID:27136476

  15. Single-port laparoscopic cholecystectomy vs standard laparoscopic cholecystectomy: A non-randomized, age-matched single center trial

    PubMed Central

    van der Linden, Yoen TK; Bosscha, Koop; Prins, Hubert A; Lips, Daniel J

    2015-01-01

    AIM: To compare the safety of single-port laparoscopic cholecystectomy with standard four-port cholecystectomy. METHODS: Between January 2011 and December 2012, data were gathered from 100 consecutive patients who received a single-port cholecystectomy. Baseline characteristics (body mass index, age, etc.) of all 100 single-port patients were collected in a database. This group was compared with 100 age-matched patients who underwent a conventional laparoscopic cholecystectomy in the same period. Retrospectively, per- and postoperative data were added. The two groups were compared using independent t-tests and χ2-tests; P values below 0.05 were considered significantly different. RESULTS: No differences were found between the groups regarding baseline characteristics. Operating time was significantly shorter in the total single-port group (42 min vs 62 min, P < 0.05); in procedures performed by surgeons the same trend was seen (45 min vs 59 min, P < 0.05). Peroperative complications were comparable between the groups (3 in the single-port group vs 5 in the multiport group; P = 0.42). Although not statistically significant, fewer postoperative complications were seen in the single-port group than in the multiport group (3 vs 9; P = 0.07). No statistically significant differences were found between the groups with regard to length of hospital stay, readmissions and mortality. CONCLUSION: Single-port laparoscopic cholecystectomy has the potential to be a safe technique with a low complication rate, short in-hospital stay and comparable operating time. Single-port cholecystectomy provides the patient an almost non-visible scar while preserving optimal quality of surgery. Further prospective studies are needed to prove the safety of the single-port technique. PMID:26328034
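
    As an illustration of the χ2 comparison reported above, the sketch below applies a chi-square test (without continuity correction) to the postoperative complication counts quoted in the abstract, 3/100 versus 9/100; the abstract does not state whether a correction was used, so the exact p-value may differ slightly from the published one.

      from scipy.stats import chi2_contingency

      # Postoperative complications: single-port 3 of 100, multiport 9 of 100.
      table = [[3, 97],
               [9, 91]]
      chi2, p, dof, expected = chi2_contingency(table, correction=False)
      print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p comes out near the reported 0.07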

  16. Non-random migration of CD4+, CD8+ and gamma delta+T19+ lymphocytes through peripheral lymph nodes.

    PubMed Central

    Witherden, D A; Kimpton, W G; Washington, E A; Cahill, R N

    1990-01-01

    The experiments described in this paper have examined the migration of three fluorochrome-labelled T-lymphocyte subsets (CD4+, CD8+ and gamma delta+T19+) on a single passage from blood to lymph, through prescapular lymph nodes. Lymphocytes obtained from prescapular efferent lymph were labelled in vitro with fluorochrome and returned to the blood of the same animal. Over the next 2 days, lymph was continuously monitored and the cells in all collections, including the one used for intravenous infusion, were phenotyped and analysed by flow cytometry. Significant differences in the subset ratios between the infused, starting population and the recirculated population indicated that CD4+ and gamma delta+T19+ lymphocytes are extracted by a resting lymph node at the same rate and that both are extracted at a faster rate than CD8+ lymphocytes. The results presented here also suggest that a unique subset of gamma delta+T19+ lymphocytes may be present in blood that does not recirculate through peripheral lymph nodes. PMID:2115500

  17. Non-random expression of ribosomal DNA units in a grasshopper showing high intragenomic variation for the ITS2 region.

    PubMed

    Ruiz-Estévez, M; Ruiz-Ruano, F J; Cabrero, J; Bakkali, M; Perfectti, F; López-León, M D; Camacho, J P M

    2015-06-01

    We analyse intragenomic variation of the ITS2 internal transcribed spacer of ribosomal DNA (rDNA) in the grasshopper Eyprepocnemis plorans, by means of tagged PCR 454 amplicon sequencing performed on both genomic DNA (gDNA) and RNA-derived complementary DNA (cDNA), using part of the ITS2 flanking coding regions (5.8S and 28S rDNA) as an internal control for sequencing errors. Six different ITS2 haplotypes (i.e. variants for at least one nucleotide in the complete ITS2 sequence) were found in a single population, one of them (Hap4) being specific to a supernumerary (B) chromosome. The analysis of both gDNA and cDNA from the same individuals provided an estimate of the expression efficiency of the different haplotypes. We found random expression (i.e. roughly equal recovery in gDNA and cDNA) for three haplotypes (Hap1, Hap2 and Hap5), but significant underexpression for three others (Hap3, Hap4 and Hap6). Hap4 was the most extremely underexpressed and, remarkably, it showed the lowest sequence conservation for the flanking 5.8S-28S coding regions in the gDNA reads but the highest conservation (100%) in the cDNA ones, suggesting the preferential expression of mutation-free rDNA units carrying this ITS2 haplotype. These results indicate that the ITS2 region of rDNA is far from complete homogenization in this species, and that the different rDNA units are not expressed at random, with some of them being severely downregulated. PMID:25565136
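
    The under- or over-expression comparison described above can be illustrated as a test of whether a haplotype's share of cDNA reads differs from its share of gDNA reads; the read counts below are hypothetical and are used only to show the mechanics, not the study's data.

      from scipy.stats import fisher_exact

      def expression_bias(gdna_hap, gdna_total, cdna_hap, cdna_total):
          """Compare a haplotype's fraction of cDNA reads with its fraction of gDNA reads."""
          table = [[cdna_hap, cdna_total - cdna_hap],
                   [gdna_hap, gdna_total - gdna_hap]]
          odds_ratio, p = fisher_exact(table)
          return odds_ratio, p

      # Hypothetical counts for a strongly underexpressed haplotype (cf. Hap4 above).
      odds_ratio, p = expression_bias(gdna_hap=400, gdna_total=5000,
                                      cdna_hap=60, cdna_total=5000)
      print(f"odds ratio = {odds_ratio:.2f}, p = {p:.1e}")  # ratio << 1: underexpressed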

  18. The ATLAS distributed analysis system

    NASA Astrophysics Data System (ADS)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  19. Double distributions and evolution equations

    SciTech Connect

    A.V. Radyushkin

    1998-05-01

    Applications of perturbative QCD to deeply virtual Compton scattering and hard exclusive meson electroproduction processes require a generalization of usual parton distributions for the case when long-distance information is accumulated in nonforward matrix elements <p'|O(0,z)|p> of quark and gluon light-cone operators. In their previous papers the authors used two types of nonperturbative functions parameterizing such matrix elements: double distributions F(x,y;t) and nonforward distribution functions F_ζ(X;t). Here they discuss in more detail the double distributions (DDs) and the evolution equations which they satisfy. They propose simple models for F(x,y;t=0) DDs with correct spectral and symmetry properties which also satisfy the reduction relations connecting them to the usual parton densities f(x). In this way, they obtain self-consistent models for the ζ-dependence of nonforward distributions. They show that, for small ζ, one can easily obtain nonforward distributions (in the X > ζ region) from the parton densities: F_ζ(X; t=0) ≈ f(X - ζ/2).
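
    The small-ζ reduction quoted at the end of the abstract, F_ζ(X; t=0) ≈ f(X - ζ/2) for X > ζ, can be illustrated numerically with a toy parton density; the valence-like shape used below is a generic placeholder, not one of the authors' models.

      import numpy as np

      def toy_parton_density(x, a=0.5, b=3.0):
          """Generic valence-like density f(x) ~ x**(-a) * (1 - x)**b (illustrative only)."""
          x = np.asarray(x, dtype=float)
          inside = (x > 0) & (x < 1)
          out = np.zeros_like(x)
          out[inside] = x[inside] ** (-a) * (1 - x[inside]) ** b
          return out

      def nonforward_small_zeta(X, zeta):
          """Small-zeta approximation F_zeta(X; t=0) ~ f(X - zeta/2), valid for X > zeta."""
          X = np.asarray(X, dtype=float)
          return np.where(X > zeta, toy_parton_density(X - zeta / 2.0), np.nan)

      X = np.linspace(0.05, 0.95, 10)
      print(nonforward_small_zeta(X, zeta=0.02))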

  20. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable and the distribution function of M is defined as usual: F(x) := Prob_M{w : X(w) < x}. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random composition of these two systems.
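
    A Monte Carlo sketch of the construction described above: words over {0, 1} are generated by a small probabilistic automaton, each word w is read as the binary (radix-2) expansion of a number X(w) in [0, 1], and the empirical distribution function is tabulated; the two-state automaton used here is an arbitrary example, not one taken from the paper.

      import numpy as np

      # In state s, letter 1 is emitted with probability p_one[s]; the next state
      # is determined by the emitted letter. (Arbitrary two-state example.)
      p_one = {0: 0.3, 1: 0.8}
      next_state = {(0, 0): 0, (0, 1): 1, (1, 0): 0, (1, 1): 1}

      def sample_x(rng, n_letters=32):
          """Generate one word of n_letters and return its value X(w) as a binary expansion."""
          s, x, weight = 0, 0.0, 0.5
          for _ in range(n_letters):
              b = 1 if rng.random() < p_one[s] else 0
              x += b * weight
              weight /= 2.0
              s = next_state[(s, b)]
          return x

      rng = np.random.default_rng(0)
      xs = np.sort([sample_x(rng) for _ in range(20000)])
      for x in (0.25, 0.5, 0.75):  # empirical F(x) = Prob_M{w : X(w) < x}
          print(x, np.searchsorted(xs, x) / len(xs))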

  1. The RHESSI Microflare Height Distribution

    NASA Technical Reports Server (NTRS)

    Christe, P.; Krucker, S.; Saint-Hilaire, P.

    2011-01-01

    We present the first in-depth statistical survey of flare source heights observed by RHESSI. Flares were found using a flare-finding algorithm designed to search the 6-10 keV count rate when RHESSI's full sensitivity was available, in order to find the smallest events (Christe et al., 2008). Between March 2002 and March 2007, a total of 25,006 events were found. Source locations were determined in the 4-10 keV, 10-15 keV, and 15-30 keV energy ranges for each event. In order to extract the height distribution from the observed projected source positions, a forward-fit model was developed with an assumed source height distribution, where height is measured from the photosphere. We find that the best flare height distribution is given by g(h) ∝ exp(-h/λ), where λ = 6.1 ± 0.3 Mm is the scale height. A power-law height distribution with a negative power-law index, γ = 3.1 ± 0.3, is also consistent with the data. Interpreted as thermal loop-top sources, these heights are compared to loops generated by a potential field model (PFSS). The measured flare height distribution is found to be much steeper than the potential field loop height distribution, which may be a signature of the flare energization process.
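
    As a small numerical illustration of the best-fit form quoted above, the sketch below draws source heights from g(h) ∝ exp(-h/λ) with λ = 6.1 Mm and recovers the scale height by maximum likelihood; the projection and forward-fitting steps of the actual analysis are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(0)
      lam = 6.1  # scale height in Mm, from the abstract

      # Sample heights above the photosphere from g(h) ~ exp(-h/lam).
      heights = rng.exponential(scale=lam, size=100_000)

      # For an exponential distribution the maximum-likelihood estimate of the
      # scale height is just the sample mean.
      print(f"recovered scale height: {heights.mean():.2f} Mm (input {lam} Mm)")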

  2. Distribution System of the Future

    SciTech Connect

    Kueck, JD

    2003-04-23

    The distribution system of the future is going to be as much of a revolution to the electric energy industry as the wireless telephone has been to consumer communications. An electricity market transformation must occur before the changes can take place, but this evolution is already starting to occur in many parts of the country. In this paper, we discuss a vision for a future distribution system, areas that will be key for technology development, and the advantages of the new electricity market. Present-day distribution systems are, in a sense, unintelligent. Distribution systems respond to faults, or short circuits, by sensing the very high fault current and then opening circuit breakers to isolate the fault. Some newer automated systems determine fault location and then close other circuit breakers to provide an alternate path for power after the fault so that the number of customers left without power is minimized, but the extent of the reconfiguration is limited. Distribution systems also have some methods to regulate voltage, but there is little real-time local response to contingencies such as loss of a transmission line or a generator. In present-day distribution systems, there is very little control of load, or demand response, and Distributed Energy Resources (DER, distributed generation, storage, and responsive load) located in the distribution system are prohibited from even regulating voltage. In fact, industry standards and utility interconnection agreements typically require that when a contingency occurs on a distribution or transmission system that results in a voltage or frequency excursion, the DER is to disconnect rather than help. There is a pressing need to evolve the distribution system model to one that can respond to contingencies sensed locally, and has the local intelligence and autonomy to deal with contingencies such as unusual loading, transmission congestion, and line outages. Markets must be simple for customers to participate in the

  3. Fluorescence lifetime distributions in proteins.

    PubMed Central

    Alcala, J. R.; Gratton, E.; Prendergast, F. G.

    1987-01-01

    The fluorescence lifetime value of tryptophan residues varies by more than a factor of 100 in different proteins and is determined by several factors, which include solvent exposure and interactions with other elements of the protein matrix. Because of the variety of different elements that can alter the lifetime value and the sensitivity to the particular environment of the tryptophan residue, it is likely that non-unique lifetime values result in protein systems. The emission decay of most proteins can be satisfactorily described only using several exponential components. Here it is proposed that continuous lifetime distributions can better represent the observed decay. An approach based on protein dynamics is presented, which provides fluorescence lifetime distribution functions for single tryptophan residue proteins. First, lifetime distributions for proteins interconverting between two conformations, each characterized by a different lifetime value, are derived. The evolution of the lifetime values as a function of the interconversion rate is studied. In this case lifetime distributions can be obtained from a distribution of rates of interconversion between the two conformations. Second, the existence of a continuum of energy substates within a given conformation was considered. The occupation of a particular energy substate at a given temperature is proportional to the Boltzmann factor. The density of energy states of the potential well depends upon the width of the well, which determines the freedom with which the residue can move in conformational space. Lifetime distributions can be obtained by association of each energy substate with a different lifetime value and assuming that the average conformation can change as the energy of the substate is increased. Finally, lifetime distributions for proteins interconverting between two conformations, each characterized by a quasi-continuum of energy substates, are presented. The origin of negative components

  4. Knowledge in a distributed environment

    SciTech Connect

    Moses, Y.O.

    1986-01-01

    The distributed nature of information in a distributed system is one of the major issues that protocols for cooperation and coordination between individual components in such a system must handle. Individual sites customarily have only partial knowledge about the general state of the system. Moreover, different information is available at the different sites of the system. Consequently, a central role of communication in such protocols is to inform particular sites about events that take place at other sites, and to transform the system's state of knowledge in a way that will guarantee the successful achievement of the goals of the protocol. This thesis is an initial attempt to study the role of knowledge in distributed systems. A general framework is presented for defining knowledge in a distributed system, and a variety of states of knowledge are identified that groups of processors may have. These states of knowledge seem to capture basic aspects of coordinated actions in a distributed environment. This machinery is applied to the analysis of a number of problems. Finally, this machinery is applied to the study of fault tolerance in systems of unreliable processors, providing considerable insight into the Byzantine agreement problem, and obtaining improved protocols for Byzantine agreement and many related problems.

  5. Distributed Wind Policy Comparison Tool

    SciTech Connect

    2011-12-01

    Power through Policy: 'Best Practices' for Cost-Effective Distributed Wind is a U.S. Department of Energy (DOE)-funded project to identify distributed wind technology policy best practices and to help policymakers, utilities, advocates, and consumers examine their effectiveness using a pro forma model. Incorporating a customized feed from the Database of State Incentives for Renewables and Efficiency (DSIRE), the Web-based Distributed Wind Policy Comparison Tool (Policy Tool) is designed to assist state, local, and utility officials in understanding the financial impacts of different policy options to help reduce the cost of distributed wind technologies. The Policy Tool can be used to evaluate the ways that a variety of federal and state policies and incentives impact the economics of distributed wind (and subsequently its expected market growth). It also allows policymakers to determine the impact of policy options, addressing market challenges identified in the U.S. DOE’s '20% Wind Energy by 2030' report and helping to meet COE targets.

  6. Size distribution of ring polymers

    PubMed Central

    Medalion, Shlomi; Aghion, Erez; Meirovitch, Hagai; Barkai, Eli; Kessler, David A.

    2016-01-01

    We present an exact solution for the distribution of sample averaged monomer to monomer distance of ring polymers. For non-interacting and local-interaction models these distributions correspond to the distribution of the area under the reflected Bessel bridge and the Bessel excursion respectively, and are shown to be identical in dimension d ≥ 2, albeit with pronounced finite size effects at the critical dimension, d = 2. A symmetry of the problem reveals that dimension d and 4 − d are equivalent, thus the celebrated Airy distribution describing the areal distribution of the d = 1 Brownian excursion describes also a polymer in three dimensions. For a self-avoiding polymer in dimension d we find numerically that the fluctuations of the scaled averaged distance are nearly identical in dimension d = 2, 3 and are well described to a first approximation by the non-interacting excursion model in dimension 5. PMID:27302596

  7. Distributed resource management: garbage collection

    SciTech Connect

    Bagherzadeh, N.

    1987-01-01

    In recent years, there has been great interest in designing high-performance distributed symbolic-processing computers. These architectures have special needs for resource management and dynamic reclamation of unused memory cells and objects. The memory management or garbage-collection aspects of these architectures are studied. Also introduced is a synchronous distributed algorithm for garbage collection. A special data structure is defined to handle the distributed nature of the problem. The author formally expresses the algorithm and shows the results of a synchronous garbage-collection simulation and its effect on the interconnection-network message traffic. He presents an asynchronous distributed garbage-collection algorithm to handle the resource management for a system that does not require a global synchronization mechanism. The distributed data structure is modified to include the asynchronous aspects of the algorithm. This method is extended to a multiple-mutator scheme, and the problem of having several processors share a portion of a cyclic graph is discussed. Two models for the analytical study of the garbage-collection algorithms discussed are provided.

  8. Size distribution of ring polymers.

    PubMed

    Medalion, Shlomi; Aghion, Erez; Meirovitch, Hagai; Barkai, Eli; Kessler, David A

    2016-01-01

    We present an exact solution for the distribution of sample averaged monomer to monomer distance of ring polymers. For non-interacting and local-interaction models these distributions correspond to the distribution of the area under the reflected Bessel bridge and the Bessel excursion respectively, and are shown to be identical in dimension d ≥ 2, albeit with pronounced finite size effects at the critical dimension, d = 2. A symmetry of the problem reveals that dimension d and 4 - d are equivalent, thus the celebrated Airy distribution describing the areal distribution of the d = 1 Brownian excursion describes also a polymer in three dimensions. For a self-avoiding polymer in dimension d we find numerically that the fluctuations of the scaled averaged distance are nearly identical in dimension d = 2, 3 and are well described to a first approximation by the non-interacting excursion model in dimension 5. PMID:27302596

  9. Integrated Transmission and Distribution Control

    SciTech Connect

    Kalsi, Karanjit; Fuller, Jason C.; Tuffner, Francis K.; Lian, Jianming; Zhang, Wei; Marinovici, Laurentiu D.; Fisher, Andrew R.; Chassin, Forrest S.; Hauer, Matthew L.

    2013-01-16

    Distributed generation, demand response, distributed storage, smart appliances, electric vehicles and renewable energy resources are expected to play a key part in the transformation of the American power system. Control, coordination and compensation of these smart grid assets are inherently interlinked. Advanced control strategies to warrant large-scale penetration of distributed smart grid assets do not currently exist. While many of the smart grid technologies proposed involve assets being deployed at the distribution level, most of the significant benefits accrue at the transmission level. The development of advanced smart grid simulation tools, such as GridLAB-D, has led to a dramatic improvement in the models of smart grid assets available for design and evaluation of smart grid technology. However, one of the main challenges to quantifying the benefits of smart grid assets at the transmission level is the lack of tools and framework for integrating transmission and distribution technologies into a single simulation environment. Furthermore, given the size and complexity of the distribution system, it is crucial to be able to represent the behavior of distributed smart grid assets using reduced-order controllable models and to analyze their impacts on the bulk power system in terms of stability and reliability. The objectives of the project were to: • Develop a simulation environment for integrating transmission and distribution control, • Construct reduced-order controllable models for smart grid assets at the distribution level, • Design and validate closed-loop control strategies for distributed smart grid assets, and • Demonstrate impact of integrating thousands of smart grid assets under closed-loop control demand response strategies on the transmission system. More specifically, GridLAB-D, a distribution system tool, and PowerWorld, a transmission planning tool, are integrated into a single simulation environment. The integrated environment

  10. Phenomenology of preequilibrium angular distributions

    SciTech Connect

    Kalbach, C.; Mann, F.M.

    1980-05-01

    The systematics of continuum angular distributions from a wide variety of light ion nuclear reactions have been studied. To first order, the shape of the angular distributions has been found to depend only on the energy of the outgoing particle and on the division of the cross section into multi-step direct and multi-step compound parts. The angular distributions can be described in terms of Legendre polynomials with the reduced polynomial coefficients exhibiting a simple dependence on the outgoing particle energy. Two integer and four continuous parameters with universal values are needed to describe the coefficients for outgoing energies of 2 to 60 MeV in all the reaction types studied. This parameterization combined with a modified Griffin model computer code permits the calculation of double differential cross sections for light ion continuum reactions where no data are available.
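
    A minimal sketch of the representation described above: a continuum angular distribution written as a Legendre-polynomial series and evaluated with numpy; the coefficients below are placeholders for illustration, not the systematics parameters of the paper.

      import numpy as np
      from numpy.polynomial import legendre

      def angular_distribution(theta_deg, coeffs):
          """Evaluate a Legendre series sum_l a_l P_l(cos theta) at angles in degrees."""
          return legendre.legval(np.cos(np.radians(theta_deg)), coeffs)

      # Placeholder reduced coefficients a_0..a_3 giving a forward-peaked shape.
      coeffs = [1.0, 0.6, 0.3, 0.1]
      for theta in (0, 30, 60, 90, 120, 150, 180):
          print(theta, round(float(angular_distribution(theta, coeffs)), 3))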

  11. Oceanic Satellite Data Distribution System

    NASA Technical Reports Server (NTRS)

    Montgomery, D. R.

    1980-01-01

    The Satellite Data Distribution System (SDDS) serves to process satellite-derived ocean observations, generate ocean analysis and forecast products, and distribute the products to a limited set of commercial users. The SDDS functions in series with the U.S. Navy Fleet Numerical Oceanography Center (FNOC) to provide products on a near-real-time basis to commercial marine industries. Conventional meteorological and oceanographic observations provided to FNOC serve as the input set to the numerical analysis and forecast models. Large main-frame computers are used to analyze and forecast products on a routine, operational basis (at 6-hour and 12-hour synoptic times). The products, reformatted to meet commercial users' needs, are transferred to a NASA-owned computer for storage and distribution. Access to the information is possible either by a commercial dial-up packet-switching network or by a direct computer-to-computer connection.

  12. Distributed semantic networks and CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Rodriguez, Tony

    1991-01-01

    Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes, interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge based, cooperative decision making model utilizing both rule based and procedural experts.

  13. What makes distributed practice effective?

    PubMed Central

    Benjamin, Aaron S.; Tullis, Jonathan

    2010-01-01

    The advantages provided to memory by the distribution of multiple practice or study opportunities are among the most powerful effects in memory research. In this paper, we critically review the class of theories that presume contextual or encoding variability as the sole basis for the advantages of distributed practice, and recommend an alternative approach based on the idea that some study events remind learners of other study events. Encoding variability theory encounters serious challenges in two important phenomena that we review here: superadditivity and nonmonotonicity. The bottleneck in such theories lies in the assumption that mnemonic benefits arise from the increasing independence, rather than interdependence, of study opportunities. The reminding model accounts for many basic results in the literature on distributed practice, readily handles data that are problematic for encoding variability theories, including superadditivity and nonmonotonicity, and provides a unified theoretical framework for understanding the effects of repetition and the effects of associative relationships on memory. PMID:20580350

  14. Maintaining consistency in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operations are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group-oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems, often within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.

  15. Modern Messaging for Distributed Sytems

    NASA Astrophysics Data System (ADS)

    Magnoni, L.

    2015-05-01

    Modern software applications rarely live in isolation and nowadays it is common practice to rely on services or consume information provided by remote entities. In such a distributed architecture, integration is key. Messaging has, for more than a decade, been the reference solution to tackle challenges of a distributed nature, such as network unreliability, strong coupling of producers and consumers and the heterogeneity of applications. Thanks to a strong community and a common effort towards standards and consolidation, message brokers are today the transport layer building blocks in many projects and services, both within the physics community and outside. Moreover, in recent years, a new generation of messaging services has appeared, with a focus on low-latency and high-performance use cases, pushing the boundaries of messaging applications. This paper presents messaging solutions for distributed applications, going through an overview of the main concepts, technologies and services.

  16. Distributed phased array architecture study

    NASA Technical Reports Server (NTRS)

    Bourgeois, Brian

    1987-01-01

    Variations in amplifiers and phase shifters can degrade antenna performance, depending on the environmental conditions and the antenna array architecture. The implementation of distributed phased array hardware was studied with the aid of the DISTAR computer program as a simulation tool; the simulation provides guidance for hardware design. Both hard and soft failures of the amplifiers in the T/R modules are modeled. Hard failures are catastrophic: no power is transmitted to the antenna elements. Noncatastrophic, or soft, failures are modeled as a modified Gaussian distribution. The resulting amplitude characteristics then determine the array excitation coefficients, while the phase characteristics take on a uniform distribution. Pattern characteristics such as antenna gain, half-power beamwidth, mainbeam phase errors, sidelobe levels, and beam pointing errors were studied as functions of amplifier and phase shifter variations. General specifications for amplifier and phase shifter tolerances in various architecture configurations for C band and S band were determined.
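
    A Monte Carlo sketch in the spirit of this kind of study is shown below; the array size, error spreads and failure rate are illustrative assumptions, not the parameters used by DISTAR.

    # Illustrative Monte Carlo: amplitude errors drawn from a (modified) Gaussian,
    # uniform phase-shifter errors, a few catastrophic failures, and the resulting
    # array factor of a uniform linear array.  All parameter values are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 64                                    # number of T/R modules
    d = 0.5                                   # element spacing in wavelengths
    amp = 1.0 + 0.1 * rng.standard_normal(N)  # soft failures: amplitude spread
    amp = np.clip(amp, 0.0, None)
    phase_err = rng.uniform(-np.pi/18, np.pi/18, N)   # phase-shifter errors, +/- 10 deg
    hard = rng.random(N) < 0.03               # ~3% hard (catastrophic) failures
    amp[hard] = 0.0                           # no power from failed modules

    theta = np.linspace(-np.pi/2, np.pi/2, 2001)
    k = 2 * np.pi
    n = np.arange(N)
    # Array factor with the perturbed excitation coefficients
    af = (amp * np.exp(1j * phase_err)) @ np.exp(1j * k * d * np.outer(n, np.sin(theta)))
    af_db = 20 * np.log10(np.abs(af) / np.abs(af).max())
    print("peak sidelobe level (dB):", round(float(af_db[np.abs(theta) > 0.1].max()), 1))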

  17. Distributional Cost-Effectiveness Analysis

    PubMed Central

    Asaria, Miqdad; Griffin, Susan; Cookson, Richard

    2015-01-01

    Distributional cost-effectiveness analysis (DCEA) is a framework for incorporating health inequality concerns into the economic evaluation of health sector interventions. In this tutorial, we describe the technical details of how to conduct DCEA, using an illustrative example comparing alternative ways of implementing the National Health Service (NHS) Bowel Cancer Screening Programme (BCSP). The 2 key stages in DCEA are 1) modeling social distributions of health associated with different interventions, and 2) evaluating social distributions of health with respect to the dual objectives of improving total population health and reducing unfair health inequality. As well as describing the technical methods used, we also identify the data requirements and the social value judgments that have to be made. Finally, we demonstrate the use of sensitivity analyses to explore the impacts of alternative modeling assumptions and social value judgments. PMID:25908564

  18. Distributed systems status and control

    NASA Technical Reports Server (NTRS)

    Kreidler, David; Vickers, David

    1990-01-01

    Concepts are investigated for an automated status and control system for a distributed processing environment. System characteristics, data requirements for health assessment, data acquisition methods, system diagnosis methods and control methods were investigated in an attempt to determine the high-level requirements for a system which can be used to assess the health of a distributed processing system and implement control procedures to maintain an accepted level of health for the system. A potential concept for automated status and control includes the use of expert system techniques to assess the health of the system, detect and diagnose faults, and initiate or recommend actions to correct the faults. Therefore, this research included the investigation of methods by which expert systems were developed for real-time environments and distributed systems. The focus is on the features required by real-time expert systems and the tools available to develop real-time expert systems.

  19. Distributed Scheduling Extension on Hadoop

    NASA Astrophysics Data System (ADS)

    Dadan, Zeng; Xieqin, Wang; Ningkang, Jiang

    Distributed computing splits a large-scale job into multiple tasks and processes them on clusters. Cluster resource allocation is the key factor limiting the efficiency of a distributed computing platform. Hadoop is currently the most popular open-source distributed platform. However, the existing scheduling strategies in Hadoop are relatively simple and cannot meet needs such as sharing the cluster among multiple users, ensuring a guaranteed capacity for each job, and providing good performance for interactive jobs. This paper reviews the existing scheduling strategies, analyses their inadequacies and adds three new features to Hadoop: temporarily raising a job's weight, allowing higher-priority jobs to preempt cluster resources, and supporting the sharing of computing resources among multiple users. Experiments show that these features provide better performance for interactive jobs as well as a fairer share of computing time among users.
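
    A toy scheduler sketch of the three features just listed is given below; the class and job names are hypothetical and the code is not taken from the Hadoop schedulers themselves.

    # Toy scheduler illustrating a temporary weight boost, priority preemption and
    # per-job bookkeeping for fair sharing; hypothetical names, not Hadoop code.
    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Job:
        priority: int                              # lower number = higher priority
        name: str = field(compare=False)
        weight: float = field(default=1.0, compare=False)

    class ToyScheduler:
        def __init__(self):
            self._queue = []

        def submit(self, job):
            heapq.heappush(self._queue, job)

        def boost(self, name, factor=2.0):
            # temporarily raise a job's weight so it receives a larger share of slots
            for job in self._queue:
                if job.name == name:
                    job.weight *= factor

        def next_job(self):
            # higher-priority (interactive) jobs run ahead of lower-priority batch jobs
            return heapq.heappop(self._queue) if self._queue else None

    s = ToyScheduler()
    s.submit(Job(priority=5, name="batch-etl"))
    s.submit(Job(priority=1, name="interactive-query"))
    s.boost("batch-etl")
    print(s.next_job().name)   # interactive-query runs first despite the boosted batch job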

  20. Shape of Pion Distribution Amplitude

    SciTech Connect

    Radyushkin, Anatoly

    2009-11-01

    A scenario is investigated in which the leading-twist pion distribution amplitude $\varphi_\pi(x)$ is approximated by the pion decay constant $f_\pi$ for all essential values of the light-cone fraction $x$. A model for the light-front wave function $\Psi(x, k_\perp)$ is proposed that produces such a distribution amplitude and has a rapidly decreasing (exponential for definiteness) dependence on the light-front energy combination $k_\perp^2/x(1-x)$. It is shown that this model easily reproduces the fit of recent large-$Q^2$ BaBar data on the photon-pion transition form factor. Some aspects of the scenario with a flat pion distribution amplitude are discussed.
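
    In display form, the model described above can be summarized as follows; the normalization convention, the prefactor and the scale parameter $\sigma$ are illustrative assumptions rather than quantities quoted in the record.

    % Flat distribution amplitude and an exponential light-front wave function of the
    % type described above; sigma is an assumed transverse-momentum scale.
    \varphi_\pi(x) \;\approx\; f_\pi \quad (0 < x < 1), \qquad
    \varphi_\pi(x) \;\propto\; \int d^2 k_\perp \, \Psi(x, k_\perp), \qquad
    \Psi(x, k_\perp) \;\propto\; \frac{1}{x(1-x)}\,
        \exp\!\left[-\frac{k_\perp^2}{2\sigma^2\, x(1-x)}\right].
    % Since \int d^2 k_\perp \exp[-k_\perp^2/(2\sigma^2 x(1-x))] = 2\pi\sigma^2 x(1-x),
    % the assumed 1/x(1-x) prefactor cancels the x dependence and yields a flat \varphi_\pi(x).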

  1. How robust are distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1989-01-01

    A distributed system is made up of large numbers of components operating asynchronously from one another and hence with incomplete and inaccurate views of one another's state. Load fluctuations are common as new tasks arrive and active tasks terminate. Jointly, these aspects make it nearly impossible to arrive at detailed predictions of a system's behavior. It is important to the successful use of distributed systems, in situations in which humans cannot provide the predictable real-time responsiveness of a computer, that the system be robust. The technology of today can too easily be affected by worm programs or by seemingly trivial mechanisms that, for example, can trigger stock market disasters. Inventors of a technology have an obligation to overcome flaws that can exact a human cost. A set of principles for guiding solutions to distributed computing problems is presented.

  2. Overdispersion: Notes on discrete distributions

    SciTech Connect

    Bowman, K.O.; Shenton, L.R.; Kastenbaum, M.A.; Broman, K.

    1992-09-01

    We introduce mixtures of binomial distributions derived by assuming that the probability parameter p varies according to some law. We use the transformation p = exp(-t) and consider various appropriate densities for the transformed variables. In the process, the Laplace transform becomes the fundamental entity. Large numbers of new binomial mixtures are generated in this way. Some transformations may involve several variates that lead to "multivariate" binomial mixtures. An extension of this to the logarithmic distribution, with parameter p, is possible. Frullani integrals and Laplace transforms are encountered. Graphical representations of some of the more significant distributions are given. These include probability functions, regions of validity, and three-dimensional representations of probability functions showing the response to variation of parameters when two parameters are involved.
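
    As a numerical sketch of how the Laplace transform enters, take t to be gamma-distributed (an assumed choice of mixing law, with illustrative parameters); the mixed probability mass function is then a finite alternating sum of values of the gamma Laplace transform.

    # Numerical sketch of a binomial mixture built as described above: p = exp(-t)
    # with t gamma-distributed, so the mixed pmf is an alternating sum of values of
    # the gamma Laplace transform L(s) = (b/(b+s))**a.  Parameters are illustrative.
    import numpy as np
    from math import comb

    def laplace_gamma(s, a, b):
        """Laplace transform E[exp(-s t)] of a Gamma(shape=a, rate=b) variable."""
        return (b / (b + s)) ** a

    def mixed_binomial_pmf(k, n, a, b):
        """P(X = k) for X ~ Binomial(n, p) with p = exp(-t), t ~ Gamma(a, b)."""
        # E[p^k (1-p)^(n-k)] expanded with the binomial theorem in exp(-t)
        terms = [comb(n - k, j) * (-1) ** j * laplace_gamma(k + j, a, b)
                 for j in range(n - k + 1)]
        return comb(n, k) * sum(terms)

    n, a, b = 10, 2.0, 5.0
    pmf = np.array([mixed_binomial_pmf(k, n, a, b) for k in range(n + 1)])
    print("sums to", round(pmf.sum(), 6))          # should be 1 up to rounding
    mean = (pmf * np.arange(n + 1)).sum()
    print("mean", round(mean, 4), "vs n*E[p] =", round(n * laplace_gamma(1, a, b), 4))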

  3. Overdispersion: Notes on discrete distributions

    SciTech Connect

    Bowman, K.O.; Shenton, L.R.; Kastenbaum, M.A.; Broman, K.

    1992-09-01

    We introduce mixtures of binomial distributions derived by assuming that the probability parameter p varies according to some law. We use the transformation p = exp(-t) and consider various appropriate densities for the transformed variables. In the process, the Laplace transform becomes the fundamental entity. Large numbers of new binomial mixtures are generated in this way. Some transformations may involve several variates that lead to "multivariate" binomial mixtures. An extension of this to the logarithmic distribution, with parameter p, is possible. Frullani integrals and Laplace transforms are encountered. Graphical representations of some of the more significant distributions are given. These include probability functions, regions of validity, and three-dimensional representations of probability functions showing the response to variation of parameters when two parameters are involved.

  4. Hydronic distribution system computer model

    SciTech Connect

    Andrews, J.W.; Strasser, J.J.

    1994-10-01

    A computer model of a hot-water boiler and its associated hydronic thermal distribution loop has been developed at Brookhaven National Laboratory (BNL). It is intended to be incorporated as a submodel in a comprehensive model of residential-scale thermal distribution systems developed at Lawrence Berkeley Laboratory (LBL). This will give the combined model the capability of modeling forced-air and hydronic distribution systems in the same house using the same supporting software. This report describes the development of the BNL hydronics model, initial results and internal consistency checks, and its intended relationship to the LBL model. A method of interacting with the LBL model that does not require physical integration of the two codes is described. This will provide capability now, with reduced up-front cost, as long as the number of runs required is not large.

  5. Distribution Metrics and Image Segmentation

    PubMed Central

    Georgiou, Tryphon; Michailovich, Oleg; Rathi, Yogesh; Malcolm, James; Tannenbaum, Allen

    2007-01-01

    The purpose of this paper is to describe certain alternative metrics for quantifying distances between distributions, and to explain their use and relevance in visual tracking. Besides the theoretical interest, such metrics may be used to design filters for image segmentation, that is for solving the key visual task of separating an object from the background in an image. The segmenting curve is represented as the zero level set of a signed distance function. Most existing methods in the geometric active contour framework perform segmentation by maximizing the separation of intensity moments between the interior and the exterior of an evolving contour. Here one can use the given distributional metric to determine a flow which minimizes changes in the distribution inside and outside the curve. PMID:18769529

  6. Distributive Marketing Education: Innovative Instructional Techniques in Distributive Marketing Education.

    ERIC Educational Resources Information Center

    Office of Education (DHEW), Washington, DC.

    The conference featured more than 40 presentations representing existing and planned innovative programs in all levels of distributive marketing education in six States. In addition to the presentations (not reproduced in their entirety in the report), there were sessions and workshops for secondary, post secondary, and adult levels and for city…

  7. CUMPOIS- CUMULATIVE POISSON DISTRIBUTION PROGRAM

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The Cumulative Poisson distribution program, CUMPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), can be used independently of one another. CUMPOIS determines the approximate cumulative binomial distribution, evaluates the cumulative distribution function (cdf) for gamma distributions with integer shape parameters, and evaluates the cdf for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. CUMPOIS calculates the probability that n or fewer events (i.e., cumulative) will occur within any unit when the expected number of events is given as lambda. Normally, this probability is calculated by a direct summation, from i=0 to n, of terms involving the exponential function, lambda, and inverse factorials. This approach, however, eventually fails due to underflow for sufficiently large values of n. Additionally, when the exponential term is moved outside of the summation for simplification purposes, there is a risk that the terms remaining within the summation, and the summation itself, will overflow for certain values of i and lambda. CUMPOIS eliminates these possibilities by multiplying an additional exponential factor into the summation terms and the partial sum whenever overflow/underflow situations threaten. The reciprocal of this term is then multiplied into the completed sum giving the cumulative probability. The CUMPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting lambda and n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMPOIS was
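
    The rescaling idea can be sketched as follows; this is a Python re-expression of the described technique, not the original C source, and it uses a single log-space rescaling factor in place of the program's on-the-fly exponential correction.

    # Overflow/underflow-safe cumulative Poisson summation: accumulate the terms in
    # log space and rescale, instead of multiplying exp(-lambda) term by term.
    import math

    def cumpois(n, lam):
        """P(X <= n) for X ~ Poisson(lam), summed stably in log space."""
        # log of each term: -lam + k*log(lam) - log(k!)
        log_terms = [-lam + k * math.log(lam) - math.lgamma(k + 1) for k in range(n + 1)]
        m = max(log_terms)                      # rescaling factor (the "extra exponential")
        return math.exp(m) * sum(math.exp(t - m) for t in log_terms)

    print(cumpois(5, 3.0))        # ~0.9161
    print(cumpois(900, 1000.0))   # naive use of exp(-1000) would underflow to zero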

  8. Cloud Distribution Statistics from LITE

    NASA Technical Reports Server (NTRS)

    Winker, David M.

    1998-01-01

    The Lidar In-Space Technology Experiment (LITE) mission has demonstrated the utility of spaceborne lidar in observing multilayer clouds and has provided a dataset showing the distribution of tropospheric clouds and aerosols. These unambiguous observations of the vertical distribution of clouds will allow improved verification of current cloud climatologies and GCM cloud parameterizations. Although there is now great interest in cloud profiling radar, operating in the mm-wave region, for the space-based observation of cloud heights, the results of the LITE mission have shown that satellite lidars can also make significant contributions in this area.

  9. SINGULARITIES OF GENERALIZED PARTON DISTRIBUTIONS

    SciTech Connect

    Anatoly Radyushkin

    2012-12-01

    We discuss recent developments in building models for generalized parton distributions (GPDs) that are based on the formalism of double distributions (DDs). Special attention is given to a careful analysis of the singularity structure of DDs. The DD formalism is applied to the construction of model GPDs with a singular Regge behavior. Within the developed DD-based approach, we discuss the structure of GPD sum rules. It is shown that separation of DDs into the so-called "plus" part and the D-term part may be treated as a renormalization procedure for the GPD sum rules. This approach is compared with an alternative prescription based on analytic regularization.

  10. Shared versus distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The question of whether multiprocessors should have shared or distributed memory has attracted a great deal of attention. Some researchers argue strongly for building distributed memory machines, while others argue just as strongly for programming shared memory multiprocessors. A great deal of research is underway on both types of parallel systems. Special emphasis is placed on systems with a very large number of processors for computation-intensive tasks, and research and implementation trends are considered. It appears that the two types of systems will likely converge to a common form for large scale multiprocessors.

  11. Medium Effects in Parton Distributions

    SciTech Connect

    William Detmold, Huey-Wen Lin

    2011-12-01

    A defining experiment of high-energy physics in the 1980s was that of the EMC collaboration, where it was first observed that parton distributions in nuclei are non-trivially related to those in the proton. This result implies that the presence of the nuclear medium plays an important role, and an understanding of this from QCD has been an important goal ever since. Here we investigate analogous, but technically simpler, effects in QCD and examine how the lowest moment of the pion parton distribution is modified by the presence of a Bose-condensed gas of pions or kaons.

  12. Distributional preferences and competitive behavior☆

    PubMed Central

    Balafoutas, Loukas; Kerschbamer, Rudolf; Sutter, Matthias

    2012-01-01

    We study experimentally the relationship between distributional preferences and competitive behavior. We find that spiteful subjects react most strongly to competitive pressure and win in a tournament significantly more often than efficiency-minded and inequality-averse subjects. However, when given the choice between a tournament and a piece rate scheme, efficiency-minded subjects choose the tournament most often, while spiteful and inequality-averse subjects avoid it. When controlling for distributional preferences, risk attitudes and past performance, the gender gap in the willingness to compete is no longer significant, indicating that gender-related variables explain why twice as many men as women self-select into competition. PMID:23576829

  13. Modeling Nucleon Generalized Parton Distributions

    SciTech Connect

    Radyushkin, Anatoly V.

    2013-05-01

    We discuss building models for nucleon generalized parton distributions (GPDs) H and E that are based on the formalism of double distributions (DDs). We found that the usual "DD + D-term" construction should be amended by an extra term, $\xi E^1_+(x,\xi)$, built from the $\alpha/\beta$ moment of the DD $e(\beta,\alpha)$ that generates the GPD $E(x,\xi)$. Unlike the D-term, this function has support in the whole $-1 < x < 1$ region, and in general does not vanish at the border points $|x| = \xi$.

  14. Standard Distributions: One Graph Fits All

    ERIC Educational Resources Information Center

    Wagner, Clifford H.

    2007-01-01

    Standard distributions are ubiquitous but not unique. With suitable scaling, the graph of a standard distribution serves as the graph for every distribution in the family. The standard exponential can easily be taught in elementary statistics courses.

  15. Are quasar redshifts randomly distributed

    NASA Technical Reports Server (NTRS)

    Weymann, R. J.; Boroson, T.; Scargle, J. D.

    1978-01-01

    A statistical analysis of possible clumping (not periodicity) of emission line redshifts of QSO's shows the available data to be compatible with random fluctuations of a smooth, non-clumped distribution. This result is demonstrated with Monte Carlo simulations as well as with the Kolmogorov-Smirnov test. It is in complete disagreement with the analysis by Varshni, which is shown to be incorrect.

  16. 76 FR 42768 - Capital Distribution

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-19

    ... Office of Thrift Supervision Capital Distribution AGENCY: Office of Thrift Supervision (OTS), Treasury... Reduction Act of 1995, 44 U.S.C. 3507. The Office of Thrift Supervision within the Department of the... Thrift Supervision, 1700 G Street, NW., Washington, DC 20552. FOR FURTHER INFORMATION CONTACT: You...

  17. Distributed user services for supercomputers

    NASA Technical Reports Server (NTRS)

    Sowizral, Henry A.

    1989-01-01

    User-service operations at supercomputer facilities are examined. The question is whether a single, possibly distributed, user-services organization could be shared by NASA's supercomputer sites in support of a diverse, geographically dispersed, user community. A possible structure for such an organization is identified as well as some of the technologies needed in operating such an organization.

  18. Distributed Leadership: Democracy or Delivery?

    ERIC Educational Resources Information Center

    Hargreaves, Andy; Fink, Dean

    2008-01-01

    Purpose: This article aims to discusses the nature and benefits of lateral approaches to educational change, especially in the form of distributed leadership, that treat schools, localities, states, or nations, as "living systems" interconnected by mutual influence. Design/methodology/approach: The paper presents a conceptual discussion of the…

  19. A distributed telerobotics construction set

    NASA Technical Reports Server (NTRS)

    Wise, James D.

    1994-01-01

    During the course of our research on distributed telerobotic systems, we have assembled a collection of generic, reusable software modules and an infrastructure for connecting them to form a variety of telerobotic configurations. This paper describes the structure of this 'Telerobotics Construction Set' and lists some of the components which comprise it.

  20. Career Information: Marketing and Distribution.

    ERIC Educational Resources Information Center

    American Vocational Association, Inc., Washington, DC.

    The publication is a bibliography prepared in an attempt to assist guidance and distributive education personnel in their task of securing relevant published career information. Depending on overall adequacy, three categories of the National Vocational Guidance Association (NVGA)--highly recommended, recommended, and useful--were used in rating…

  1. Tools for distributed application management

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Wood, Mark; Cooper, Robert; Birman, Kenneth P.

    1990-01-01

    Distributed application management consists of monitoring and controlling an application as it executes in a distributed environment. It encompasses such activities as configuration, initialization, performance monitoring, resource scheduling, and failure response. The Meta system is described: a collection of tools for constructing distributed application management software. Meta provides the mechanism, while the programmer specifies the policy for application management. The policy is manifested as a control program which is a soft real time reactive program. The underlying application is instrumented with a variety of built-in and user defined sensors and actuators. These define the interface between the control program and the application. The control program also has access to a database describing the structure of the application and the characteristics of its environment. Some of the more difficult problems for application management occur when pre-existing, nondistributed programs are integrated into a distributed application for which they may not have been intended. Meta allows management functions to be retrofitted to such programs with a minimum of effort.

  2. Tools for distributed application management

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Cooper, Robert; Wood, Mark; Birman, Kenneth P.

    1990-01-01

    Distributed application management consists of monitoring and controlling an application as it executes in a distributed environment. It encompasses such activities as configuration, initialization, performance monitoring, resource scheduling, and failure response. The Meta system (a collection of tools for constructing distributed application management software) is described. Meta provides the mechanism, while the programmer specifies the policy for application management. The policy is manifested as a control program which is a soft real-time reactive program. The underlying application is instrumented with a variety of built-in and user-defined sensors and actuators. These define the interface between the control program and the application. The control program also has access to a database describing the structure of the application and the characteristics of its environment. Some of the more difficult problems for application management occur when preexisting, nondistributed programs are integrated into a distributed application for which they may not have been intended. Meta allows management functions to be retrofitted to such programs with a minimum of effort.

  3. Distributed Leadership in Educational Institutions

    ERIC Educational Resources Information Center

    Göksoy, Süleyman

    2015-01-01

    In recent years, many studies have been conducted on the shared leadership process. The distributed leadership (DL) approach addresses leadership along with teams, groups and organizational characteristics. In practice, this approach objects to the supposition that an individual should take the lead in order to ensure change. Proponents of this idea claim that…

  4. Parallel, Distributed Scripting with Python

    SciTech Connect

    Miller, P J

    2002-05-24

    Parallel computers used to be, for the most part, one-of-a-kind systems which were extremely difficult to program portably. With SMP architectures, the advent of the POSIX thread API and OpenMP gave developers ways to portably exploit on-the-box shared memory parallelism. Since these architectures didn't scale cost-effectively, distributed memory clusters were developed. The associated MPI message passing libraries gave these systems a portable paradigm too. Having programmers effectively use this paradigm is a somewhat different question. Distributed data has to be explicitly transported via the messaging system in order for it to be useful. In high level languages, the MPI library gives access to data distribution routines in C, C++, and FORTRAN. But we need more than that. Many reasonable and common tasks are best done in (or as extensions to) scripting languages. Consider sysadm tools such as password crackers, file purgers, etc ... These are simple to write in a scripting language such as Python (an open source, portable, and freely available interpreter). But these tasks beg to be done in parallel. Consider a password checker that checks an encrypted password against a 25,000 word dictionary. This can take around 10 seconds in Python (6 seconds in C). It is trivial to parallelize if you can distribute the information and co-ordinate the work.
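
    A hypothetical sketch of that dictionary check, parallelized with the standard multiprocessing module rather than MPI, is shown below; the word list and the hash are invented for illustration.

    # Hypothetical parallel dictionary check using multiprocessing (not MPI, and not
    # code from the report); the wordlist and the target hash are made up.
    import hashlib
    from multiprocessing import Pool

    TARGET = hashlib.sha256(b"hunter2").hexdigest()   # stand-in for the encrypted password

    def check_chunk(words):
        """Return the matching word from this chunk of the dictionary, if any."""
        for w in words:
            if hashlib.sha256(w.encode()).hexdigest() == TARGET:
                return w
        return None

    if __name__ == "__main__":
        dictionary = ["password", "letmein", "hunter2", "qwerty"] * 6250   # ~25,000 words
        nproc = 4
        chunks = [dictionary[i::nproc] for i in range(nproc)]              # distribute the work
        with Pool(nproc) as pool:
            hits = [h for h in pool.map(check_chunk, chunks) if h]
        print("match:", hits[0] if hits else None)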

  5. Distributed Learning and Institutional Restructuring.

    ERIC Educational Resources Information Center

    Hawkins, Brian L.

    1999-01-01

    Discusses the following challenges institutions must consider as they enter the new marketplace of distributed learning: library access, faculty workload, faculty incentives, faculty-support structures, intellectual property, articulation agreements, financial aid, pricing, cross-subsidization of programs, institutional loyalty and philanthropy,…

  6. Requiring Collaboration or Distributing Leadership?

    ERIC Educational Resources Information Center

    Kennedy, Anne; Deuel, Angie; Nelson, Tamara Holmlund; Slavit, David

    2011-01-01

    Through the process of initiating, implementing, and sustaining a schoolwide professional learning community (PLC), teachers and administrators at the pseudonymous Silver Valley Middle School provide a powerful example of distributed leadership in action. New leadership roles, coordination, and interdependency among staff have led to an increased…

  7. Educational Micropolitics and Distributed Leadership

    ERIC Educational Resources Information Center

    Flessa, Joseph

    2009-01-01

    This article critically reviews two bodies of literature that potentially share common concerns, yet rarely overlap: distributed leadership and educational micropolitics. Alternative explanations for the split between these two analytical approaches to school organization are explored in sections on problem framing, methodology, and the…

  8. Size distribution of detached drops

    NASA Astrophysics Data System (ADS)

    Baluev, V. V.; Stepanov, V. M.

    1989-10-01

    The law governing the size distribution of detached gas-liquid streams of drops has been determined analytically, and a comparison is carried out against experimental data existing in the literature. The derived theoretical relationships offer an excellent description of existing experimental results.

  9. Bug Distribution and Pattern Classification.

    ERIC Educational Resources Information Center

    Tatsuoka, Kikumi K.; Tatsuoka, Maurice M.

    The study examines the rule space model, a probabilistic model capable of measuring cognitive skill acquisition and of diagnosing erroneous rules of operation in a procedural domain. The model involves two important components: (1) determination of a set of bug distributions (bug density functions representing clusters around the rules); and (2)…

  10. Modeling Natural Variation through Distribution

    ERIC Educational Resources Information Center

    Lehrer, Richard; Schauble, Leona

    2004-01-01

    This design study tracks the development of student thinking about natural variation as late elementary grade students learned about distribution in the context of modeling plant growth at the population level. The data-modeling approach assisted children in coordinating their understanding of particular cases with an evolving notion of data as an…

  11. Prior Distributions on Symmetric Groups

    ERIC Educational Resources Information Center

    Gupta, Jayanti; Damien, Paul

    2005-01-01

    Fully and partially ranked data arise in a variety of contexts. From a Bayesian perspective, attention has focused on distance-based models; in particular, the Mallows model and extensions thereof. In this paper, a class of prior distributions, the "Binary Tree," is developed on the symmetric group. The attractive features of the class are: it…

  12. Is Creative Thinking Normally Distributed?

    ERIC Educational Resources Information Center

    Wakefield, John F.

    The hypothesis of positive skew in distributions of response to creative thinking tasks was studied. Data were obtained from examinees' responses to problem-solving tasks in three published studies of creative thinking. Subjects included 23 fifth graders (12 females and 11 males), 29 high school students (10 females and 19 males), and 47 female…

  13. Cooperative distributed architecture for mashups

    NASA Astrophysics Data System (ADS)

    Al-Haj Hassan, Osama Mohammad; Ramaswamy, Lakshmish; Hamad, Fadi; Abu Taleb, Anas

    2014-05-01

    Since the advent of Web 2.0, personalised applications such as mashups have become widely popular. Mashups enable end-users to fetch data from distributed data sources, and refine it based on their personal needs. This high degree of personalisation that mashups offer comes at the expense of performance and scalability. These scalability challenges are exacerbated by the centralised architectures of current mashup platforms. In this paper, we address the performance and scalability issues by designing CoMaP, a distributed mashup platform. CoMaP's architecture comprises several cooperative mashup processing nodes distributed over the Internet upon which mashups can, fully or partially, be executed. CoMaP incorporates a dynamic and efficient scheme for deploying mashups on the processing nodes. Our scheme considers a number of parameters such as variations in link delays and bandwidths, and loads on mashup processing nodes. CoMaP includes effective and low-cost mechanisms for balancing loads on the processing nodes as well as for handling node failures. Furthermore, we propose novel techniques that leverage keyword synonyms, ontologies and caching to enhance end-user experience. This paper reports several experiments to comprehensively study CoMaP's performance. The results demonstrate CoMaP's benefits as a scalable distributed mashup platform.

  14. Random distributed feedback fibre lasers

    NASA Astrophysics Data System (ADS)

    Turitsyn, Sergei K.; Babin, Sergey A.; Churkin, Dmitry V.; Vatnik, Ilya D.; Nikulin, Maxim; Podivilov, Evgenii V.

    2014-09-01

    The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with “negative absorption” of the interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors (a random distributed feedback fibre laser) was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by the Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances up to 100 km. Although an effective reflection due to the Rayleigh scattering is extremely small (~0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the generation

  15. The neutron star mass distribution

    SciTech Connect

    Kiziltan, Bülent; Kottas, Athanasios; De Yoreo, Maria; Thorsett, Stephen E.

    2013-11-20

    In recent years, the number of pulsars with secure mass measurements has increased to a level that allows us to probe the underlying neutron star (NS) mass distribution in detail. We critically review the radio pulsar mass measurements. For the first time, we are able to analyze a sizable population of NSs with a flexible modeling approach that can effectively accommodate a skewed underlying distribution and asymmetric measurement errors. We find that NSs that have evolved through different evolutionary paths reflect distinctive signatures through dissimilar distribution peak and mass cutoff values. NSs in double NS and NS-white dwarf (WD) systems show consistent respective peaks at 1.33 M☉ and 1.55 M☉, suggesting significant mass accretion (Δm ≈ 0.22 M☉) has occurred during the spin-up phase. The width of the mass distribution implied by double NS systems is indicative of a tight initial mass function while the inferred mass range is significantly wider for NSs that have gone through recycling. We find a mass cutoff at ∼2.1 M☉ for NSs with WD companions, which establishes a firm lower bound for the maximum NS mass. This rules out the majority of strange quark and soft equation of state models as viable configurations for NS matter. The lack of truncation close to the maximum mass cutoff along with the skewed nature of the inferred mass distribution both enforce the suggestion that the 2.1 M☉ limit is set by evolutionary constraints rather than nuclear physics or general relativity, and the existence of rare supermassive NSs is possible.

  16. Centaur size distribution with DECam

    NASA Astrophysics Data System (ADS)

    Fuentes, Cesar; Trilling, David E.; Schlichting, Hilke

    2014-11-01

    We present the results of the 2014 centaur search campaign on the Dark Energy Camera (DECam) in Tololo, Chile. This is the largest debiased Centaur survey to date, measuring for the first time the size distribution of small Centaurs (1-10 km) and the first time the sizes of planetesimals from which the entire Solar System formed are directly detected. The theoretical model for the coagulation and collisional evolution of the outer solar system proposed in Schlichting et al. 2013 predicts a steep rise in the size distribution of TNOs smaller than 10 km. These objects are below the detection limit of current TNO surveys but feasible for the Centaur population. By constraining the number of Centaurs and this feature in their size distribution we can confirm the collisional evolution of the Solar System and estimate the rate at which material is being transferred from the outer to the inner Solar System. If the shallow power law behavior from the TNO size distribution at ~40 km can be extrapolated to 1 km, the size of the Jupiter Family of Comets (JFC), there would not be enough small TNOs to supply the JFC population (Volk & Malhotra, 2008), debunking the link between TNOs and JFCs. We also obtain the colors of small Centaurs and TNOs, providing a signature of collisional evolution by measuring if there is in fact a relationship between color and size. If objects smaller than the break in the TNO size distribution are being ground down by collisions then their surfaces should be fresh, and then appear bluer in the optical than larger TNOs that are not experiencing collisions.

  17. Distributed data analysis in ATLAS

    NASA Astrophysics Data System (ADS)

    Nilsson, Paul; Atlas Collaboration

    2012-12-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and Nordu Grid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the Ganga Robot functional testing system, and infrequent site stress tests are performed using the Hammer Cloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

  18. Gaussian Velocity Distributions in Avalanches

    NASA Astrophysics Data System (ADS)

    Shattuck, Mark

    2004-03-01

    Imagine a world where gravity is so strong that if an ice cube is tilted the shear forces melt the surface and water avalanches down. Further imagine that the ambient temperature is so low that the water re-freezes almost immediately. This is the world of granular flows. As a granular solid is tilted the surface undergoes a sublimation phase transition and a granular gas avalanches down the surface, but the inelastic collisions rapidly remove energy from the flow lowering the granular temperature (kinetic energy per particle) until the gas solidifies again. It is under these extreme conditions that we attempt to uncover continuum granular flow properties. Typical continuum theories like Navier-Stokes equation for fluids follow the space-time evolution of the first few moments of the velocity distribution. We study continuously avalanching flow in a rotating two-dimensional granular drum using high-speed video imaging and extract the position and velocities of the particles. We find a universal near Gaussian velocity distribution throughout the flowing regions, which are characterized by a liquid-like radial distribution function. In the remaining regions, in which the radial distribution function develops sharp crystalline peaks, the velocity distribution has a Gaussian peak but is much broader in the tails. In a companion experiment on a vibrated two-dimensional granular fluid under constant pressure, we find a clear gas-solid phase transition in which both the temperature and density change discontinuously. This suggests that a low temperature crystal and a high temperature gas can coexist in steady state. This coexistence could result in a narrower, cooler, Gaussian peak and a broader, warmer, Gaussian tail like the non-Gaussian behavior seen in the crystalline portions of the rotating drum.

  19. Distributed intelligence in an astronomical Distributed Sensor Network

    NASA Astrophysics Data System (ADS)

    White, R. R.; Davis, H.; Vestrand, W. T.; Wozniak, P. R.

    2008-03-01

    The Telescope Alert Operations Network System (TALONS) was designed and developed in the year 2000, around the architectural principles of a distributed sensor network. This network supported the original Rapid Telescopes for Optical Response (RAPTOR) project goals; however, only with further development could TALONS meet the goals of the larger Thinking Telescope Project. The complex objectives of the Thinking Telescope project required a paradigm shift in the software architecture - the centralised intelligence merged into the TALONS network operations could no longer meet all of the new requirements. The intelligence needed to be divorced from the network operations and developed as a series of peripheral intelligent agents, distributing the decision making and analytical processes based on the temporal volatility of the data. This paper is presented as only one part of the poster from the workshop and in it we will explore the details of this architecture and how that merges with the current Thinking Telescope system to meet our project goals.

  20. A heuristic for efficient data distribution management in distributed simulation

    NASA Astrophysics Data System (ADS)

    Gupta, Pankaj; Guha, Ratan K.

    2005-05-01

    In this paper, we propose an algorithm for reducing the complexity of region matching and efficient multicasting in the data distribution management component of the High Level Architecture (HLA) Run Time Infrastructure (RTI). The current data distribution management (DDM) techniques rely on computing the intersection between the subscription and update regions. When a subscription region and an update region of different federates overlap, RTI establishes communication between the publisher and the subscriber. It subsequently routes the updates from the publisher to the subscriber. The proposed algorithm computes the update/subscription region matching for dynamic allocation of multicast groups. It provides new multicast routines that exploit the connectivity of the federation by communicating updates regarding interactions and routes information only to those federates that require them. The region-matching problem in DDM reduces to the clique-covering problem using the connection-graph abstraction, where the federations represent the vertices and the update/subscribe relations represent the edges. We develop an abstract model based on the connection graph for data distribution management. Using this abstract model, we propose a heuristic for solving the region-matching problem of DDM. We also provide a complexity analysis of the proposed heuristic.
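
    The overlap test at the heart of region matching can be illustrated in a few lines; the federate names, rectangular routing spaces and brute-force pairwise matching below are hypothetical simplifications, not the paper's heuristic itself (which aims to avoid exactly this pairwise cost).

    # Minimal illustration of DDM region matching: an update region and a subscription
    # region that overlap place publisher and subscriber in a common multicast group.
    from itertools import product

    def overlaps(r1, r2):
        """Axis-aligned rectangles given as (xmin, ymin, xmax, ymax)."""
        return not (r1[2] <= r2[0] or r2[2] <= r1[0] or
                    r1[3] <= r2[1] or r2[3] <= r1[1])

    update_regions = {"fedA": (0, 0, 5, 5), "fedB": (10, 10, 12, 12)}
    subscription_regions = {"fedC": (4, 4, 8, 8), "fedD": (11, 0, 13, 11)}

    groups = {}   # per publisher: which subscribers should receive its updates
    for (pub, ur), (sub, sr) in product(update_regions.items(), subscription_regions.items()):
        if overlaps(ur, sr):
            groups.setdefault(pub, set()).add(sub)
    print(groups)   # {'fedA': {'fedC'}, 'fedB': {'fedD'}}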

  1. Mixture of Skewed α-Stable Distributions

    NASA Astrophysics Data System (ADS)

    Shojaei, S. R. Hosseini; Nassiri, V.; Mohammadian, Gh. R.; Mohammadpour, A.

    2011-03-01

    The expectation maximization (EM) algorithm and Bayesian techniques are two approaches for statistical inference of mixture models [3, 4]. Noting the advantages of the Bayesian methods, practitioners prefer them. However, implementing Markov chain Monte Carlo algorithms can be very complicated for stable distributions, due to the non-analytic density or distribution function formulas. In this paper, we introduce a new class of mixtures of heavy-tailed distributions, called mixtures of skewed stable distributions. Skewed stable distributions belong to the exponential family and have an analytic density representation. It is shown that skewed stable distributions dominate skew stable distribution functions and they can be used to model heavy-tailed data. The class of skewed stable distributions has an analytic representation for its density function, and Bayesian inference can be done similarly to the exponential family of distributions. Finally, mixtures of skewed stable distributions are compared to mixtures of stable distributions through a simulation study.
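
    Sampling from such a mixture can be sketched with scipy's generic stable distribution; the weights and parameter values below are made up, and the reparameterized skewed-stable subclass studied in the paper is only approximated here by this generic stand-in.

    # Illustrative sampling from a two-component mixture of (skewed) alpha-stable
    # distributions; parameters and weights are invented for the sketch.
    import numpy as np
    from scipy.stats import levy_stable

    rng = np.random.default_rng(1)
    weights = [0.6, 0.4]
    components = [levy_stable(alpha=1.5, beta=0.8, loc=0.0, scale=1.0),
                  levy_stable(alpha=1.2, beta=-0.5, loc=5.0, scale=2.0)]

    n = 2000
    which = rng.choice(len(weights), size=n, p=weights)          # latent component labels
    counts = np.bincount(which, minlength=len(weights))
    samples = np.concatenate([components[k].rvs(size=c, random_state=rng)
                              for k, c in enumerate(counts) if c > 0])
    print("sample median:", round(float(np.median(samples)), 3))  # heavy tails: use the median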

  2. 26 CFR 1.652(a)-2 - Distributions in excess of distributable net income.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 8 2011-04-01 2011-04-01 false Distributions in excess of distributable net... Only § 1.652(a)-2 Distributions in excess of distributable net income. If the amount of income required to be distributed currently to beneficiaries exceeds the distributable net income of the trust...

  3. Distribution Integrity Management Plan (DIMP)

    SciTech Connect

    Gonzales, Jerome F.

    2012-05-07

    This document is the distribution integrity management plan (Plan) for the Los Alamos National Laboratory (LANL) Natural Gas Distribution System. This Plan meets the requirements of 49 CFR Part 192, Subpart P Distribution Integrity Management Programs (DIMP) for the LANL Natural Gas Distribution System. This Plan was developed by reviewing records and interviewing LANL personnel. The records consist of the design, construction, operation and maintenance for the LANL Natural Gas Distribution System. The records system for the LANL Natural Gas Distribution System is limited, so the majority of information is based on the judgment of LANL employees; the maintenance crew, the Corrosion Specialist and the Utilities and Infrastructure (UI) Civil Team Leader. The records used in this report are: Pipeline and Hazardous Materials Safety Administration (PHMSA) 7100.1-1, Report of Main and Service Line Inspection, Natural Gas Leak Survey, Gas Leak Response Report, Gas Leak and Repair Report, and Pipe-to-Soil Recordings. The specific elements of knowledge of the infrastructure used to evaluate each threat and prioritize risks are listed in Sections 6 and 7, Threat Evaluation and Risk Prioritization respectively. This Plan addresses additional information needed and a method for gaining that data over time through normal activities. The processes used for the initial assessment of Threat Evaluation and Risk Prioritization are the methods found in the Simple, Handy Risk-based Integrity Management Plan (SHRIMP™) software package developed by the American Pipeline and Gas Agency (APGA) Security and Integrity Foundation (SIF). SHRIMP™ uses an index model developed by the consultants and advisors of the SIF. Threat assessment is performed using questions developed by the Gas Piping Technology Company (GPTC) as modified and added to by the SHRIMP™ advisors. This Plan is required to be reviewed every 5 years to be continually refined and improved. Records

  4. Nation Radiobiology Archives Distributed Access

    SciTech Connect

    Smith, S. K.; Prather, J. C.; Ligotke, E. K.; Watson, C. R.

    1992-06-01

    NRADA1.1 is a supplement to NRADA1.0. This version eliminates several bugs, and includes a few new features. The diskettes consist of a distributed subset of information representative of the extensive NRA databases and database access software maintained at the Pacific Northwest Laboratory which provide an introduction to the scope and style of the NRA Information Systems. Information in the NRA Summary, Inventory, and Bibliographic database is available upon request. Printed reports have been provided in the past. The completion of the NRADA1.1 is the realization of a long standing goal of the staff and advisory committee. Information may be easily distributed to the user in an electronic form which preserves the relationships between the various databases.

  5. Hazards Data Distribution System (HDDS)

    USGS Publications Warehouse

    Jones, Brenda; Lamb, Rynn

    2015-01-01

    When emergencies occur, first responders and disaster response teams often need rapid access to aerial photography and satellite imagery that is acquired before and after the event. The U.S. Geological Survey (USGS) Hazards Data Distribution System (HDDS) provides quick and easy access to pre- and post-event imagery and geospatial datasets that support emergency response and recovery operations. The HDDS provides a single, consolidated point-of-entry and distribution system for USGS-hosted remotely sensed imagery and other geospatial datasets related to an event response. The data delivery services are provided through an interactive map-based interface that allows emergency response personnel to rapidly select and download pre-event ("baseline") and post-event emergency response imagery.

  6. Models for the hotspot distribution

    SciTech Connect

    Jurdy, D.M. ); Stefanick, M. )

    1990-10-01

    Published hotspot catalogues all show a hemispheric concentration beyond what can be expected by chance. Cumulative distributions about the center of concentration are described by a power law with a fractal dimension closer to 1 than 2. Random sets of the corresponding sizes do not show this effect. A simple shift of the random sets away from a point would produce distributions similar to those of hotspot sets. The possible relation of the hotspots to the locations of ridges and subduction zones is tested using large sets of randomly-generated points to estimate areas within given distances of the plate boundaries. The probability of finding the observed number of hotspots within 10° of the ridges is about what is expected.
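
    The "random set" baseline used in tests of this kind can be sketched as follows; the catalogue size and reference direction are illustrative, and no real hotspot coordinates are reproduced here.

    # Monte Carlo sketch: uniformly random points on a sphere, counted within a given
    # angular distance of a reference point and compared to the spherical-cap area.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 50                                        # roughly the size of a hotspot catalogue
    lon = rng.uniform(0, 2 * np.pi, n)
    lat = np.arcsin(rng.uniform(-1, 1, n))        # uniform distribution on the sphere
    xyz = np.c_[np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)]

    center = np.array([1.0, 0.0, 0.0])            # arbitrary reference direction
    ang = np.degrees(np.arccos(np.clip(xyz @ center, -1, 1)))

    for r in (30, 60, 90):
        frac = np.mean(ang <= r)
        cap = 0.5 * (1 - np.cos(np.radians(r)))   # fractional area of the spherical cap
        print(f"within {r} deg: observed {frac:.2f}, expected for a random set {cap:.2f}")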

  7. Nation Radiobiology Archives Distributed Access

    Energy Science and Technology Software Center (ESTSC)

    1992-06-01

    NRADA1.1 is a supplement to NRADA1.0. This version eliminates several bugs, and includes a few new features. The diskettes consist of a distributed subset of information representative of the extensive NRA databases and database access software maintained at the Pacific Northwest Laboratory which provide an introduction to the scope and style of the NRA Information Systems. Information in the NRA Summary, Inventory, and Bibliographic database is available upon request. Printed reports have been provided in the past. The completion of the NRADA1.1 is the realization of a long standing goal of the staff and advisory committee. Information may be easily distributed to the user in an electronic form which preserves the relationships between the various databases.

  8. A global distributed storage architecture

    NASA Technical Reports Server (NTRS)

    Lionikis, Nemo M.; Shields, Michael F.

    1996-01-01

    NSA architects and planners have come to realize that to gain the maximum benefit from, and keep pace with, emerging technologies, we must move to a radically different computing architecture. The compute complex of the future will be a distributed heterogeneous environment, where, to a much greater extent than today, network-based services are invoked to obtain resources. Among the rewards of implementing the services-based view are that it insulates the user from much of the complexity of our multi-platform, networked, computer and storage environment and hides its diverse underlying implementation details. In this paper, we will describe one of the fundamental services being built in our envisioned infrastructure; a global, distributed archive with near-real-time access characteristics. Our approach for adapting mass storage services to this infrastructure will become clear as the service is discussed.

  9. Estimators for the Cauchy distribution

    SciTech Connect

    Hanson, K.M.; Wolf, D.R.

    1993-12-31

    We discuss the properties of various estimators of the central position of the Cauchy distribution. The performance of these estimators is evaluated for a set of simulated experiments. Estimators based on the maximum and mean of the posterior probability density function are empirically found to be well behaved when more than two measurements are available. On the contrary, because of the infinite variance of the Cauchy distribution, the average of the measured positions is an extremely poor estimator of the location of the source. However, the median of the measured positions is well behaved. The rms errors for the various estimators are compared to the Fisher-Cramer-Rao lower bound. We find that the square root of the variance of the posterior density function is predictive of the rms error in the mean posterior estimator.
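
    A small simulation in the spirit of this comparison is sketched below; the sample size, scale and trial count are illustrative assumptions, and the posterior-based estimate is taken here as the likelihood maximum under a flat prior.

    # Compare the sample mean, sample median, and posterior maximum (flat prior, i.e.
    # the maximum-likelihood estimate) as estimators of the Cauchy location.
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(7)
    true_loc, scale, n, trials = 0.0, 1.0, 10, 2000

    def posterior_max(x):
        """Maximize the Cauchy likelihood over the location parameter."""
        nll = lambda m: np.sum(np.log(1.0 + ((x - m) / scale) ** 2))
        res = minimize_scalar(nll, bounds=(np.median(x) - 20, np.median(x) + 20),
                              method="bounded")
        return res.x

    errors = {"sample mean": [], "sample median": [], "posterior max": []}
    for _ in range(trials):
        x = true_loc + scale * rng.standard_cauchy(n)
        errors["sample mean"].append(np.mean(x) - true_loc)
        errors["sample median"].append(np.median(x) - true_loc)
        errors["posterior max"].append(posterior_max(x) - true_loc)

    for name, e in errors.items():
        print(f"{name:>13s}: rms error {np.sqrt(np.mean(np.square(e))):.3f}")
    # The mean's rms error is erratic and large (infinite variance), while the median
    # and the posterior-based estimator remain well behaved, as the record describes.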

  10. Conformation Distributions in Adsorbed Proteins.

    NASA Astrophysics Data System (ADS)

    Meuse, Curtis W.; Hubbard, Joseph B.; Vrettos, John S.; Smith, Jackson R.; Cicerone, Marcus T.

    2007-03-01

    While the structural basis of protein function is well understood in the biopharmaceutical and biotechnology industries, few methods for the characterization and comparison of protein conformation distributions are available. New methods capable of measuring the stability of protein conformations and the integrity of protein-protein, protein-ligand and protein-surface interactions both in solution and on surfaces are needed to help the development of protein-based products. We are developing infrared spectroscopy methods for the characterization and comparison of molecular conformation distributions in monolayers and in solutions. We have extracted an order parameter describing the orientational and conformational variations of protein functional groups around the average molecular values from a single polarized spectrum. We will discuss the development of these methods and compare them to amide hydrogen/deuterium exchange methods for albumin in solution and on different polymer surfaces to show that our order parameter is related to protein stability.

  11. Distributed antenna system and method

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Dobbins, Justin A. (Inventor)

    2004-01-01

    System and methods are disclosed for employing one or more radiators having non-unique phase centers mounted to a body with respect to a plurality of transmitters to determine location characteristics of the body such as the position and/or attitude of the body. The one or more radiators may consist of a single, continuous element or of two or more discrete radiation elements whose received signals are combined. In a preferred embodiment, the location characteristics are determined using carrier phase measurements whereby phase center information may be determined or estimated. A distributed antenna having a wide angle view may be mounted to a moveable body in accord with the present invention. The distributed antenna may be utilized for maintaining signal contact with multiple spaced apart transmitters, such as a GPS constellation, as the body rotates without the need for RF switches to thereby provide continuous attitude and position determination of the body.

  12. Antenna structure with distributed strip

    SciTech Connect

    Rodenbeck, Christopher T.

    2008-10-21

    An antenna comprises electrical conductors arranged to form a radiating element including a folded line configuration and a distributed strip configuration, where the radiating element is in proximity to a ground conductor. The folded line and the distributed strip can be electrically interconnected and substantially coplanar. The ground conductor can be spaced from, and coplanar to, the radiating element, or can alternatively lie in a plane set at an angle to the radiating element. Embodiments of the antenna include conductor patterns formed on a printed wiring board, having a ground plane, spacedly adjacent to and coplanar with the radiating element. Other embodiments of the antenna comprise a ground plane and radiating element on opposed sides of a printed wiring board. Other embodiments of the antenna comprise conductors that can be arranged as free standing "foils". Other embodiments include antennas that are encapsulated into a package containing the antenna.

  13. Antenna structure with distributed strip

    SciTech Connect

    Rodenbeck, Christopher T.

    2008-03-18

    An antenna comprises electrical conductors arranged to form a radiating element including a folded line configuration and a distributed strip configuration, where the radiating element is in proximity to a ground conductor. The folded line and the distributed strip can be electrically interconnected and substantially coplanar. The ground conductor can be spaced from, and coplanar to, the radiating element, or can alternatively lie in a plane set at an angle to the radiating element. Embodiments of the antenna include conductor patterns formed on a printed wiring board, having a ground plane, spacedly adjacent to and coplanar with the radiating element. Other embodiments of the antenna comprise a ground plane and radiating element on opposed sides of a printed wiring board. Other embodiments of the antenna comprise conductors that can be arranged as free standing "foils". Other embodiments include antennas that are encapsulated into a package containing the antenna.

  14. Digitally controlled distributed phase shifter

    SciTech Connect

    Hietala, V.M.; Kravitz, S.H.; Vawter, G.A.

    1992-12-31

    A digitally controlled distributed phase shifter comprises N phase shifters. Digital control is achieved by using N binary length-weighted electrodes located on the top surface of a waveguide. A control terminal is attached to each electrode, thereby allowing the application of a control signal. The control signal is one of two discrete bias voltages. Applying a discrete bias voltage changes the modal index of the portion of the waveguide corresponding to the length of the electrode to which the bias voltage is applied, thereby changing the phase through the underlying portion of the waveguide. The total phase shift of the digitally controlled distributed phase-shift network is the sum of the phase shifts of the individual sections.
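
    A minimal numerical sketch of the binary length-weighting idea (in Python, with an invented least-significant phase step; the patent abstract gives no numbers): each control bit selects the bias state of one electrode, and the total phase shift is the sum of the binary-weighted contributions.

```python
def total_phase_shift(control_bits, phase_lsb_deg=11.25):
    """Sum the contributions of binary length-weighted electrode sections.

    control_bits[k] selects the bias state of electrode k, whose length (and
    hence phase contribution) is 2**k times that of the shortest section.
    phase_lsb_deg is an invented step size, not a value from the patent."""
    return sum(bit * (2 ** k) * phase_lsb_deg for k, bit in enumerate(control_bits))

# 5-bit control word with electrodes 1, 2 and 4 biased to the "shift" state
print(total_phase_shift([0, 1, 1, 0, 1]))   # (2 + 4 + 16) * 11.25 = 247.5 degrees
```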

  15. Secure key storage and distribution

    DOEpatents

    Agrawal, Punit

    2015-06-02

    This disclosure describes a distributed, fault-tolerant security system that enables the secure storage and distribution of private keys. In one implementation, the security system includes a plurality of computing resources that independently store private keys provided by publishers and encrypted using a single security system public key. To protect against malicious activity, the security system private key necessary to decrypt the publication private keys is not stored at any of the computing resources. Rather, portions, or shares, of the security system private key are stored at each of the computing resources within the security system, and multiple security systems must communicate and share partial decryptions in order to decrypt the stored private key.
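
    The disclosure describes threshold shares of the system private key; as a much simpler, hedged illustration of splitting a key so that no single resource holds it, the Python sketch below uses an n-of-n XOR split (every share is required), with an invented share count.

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(secret: bytes, n_shares: int) -> list:
    """Split a secret into n XOR shares; all shares are needed to reconstruct it."""
    shares = [os.urandom(len(secret)) for _ in range(n_shares - 1)]
    shares.append(reduce(xor_bytes, shares, secret))   # last share closes the XOR sum
    return shares

def combine_shares(shares: list) -> bytes:
    return reduce(xor_bytes, shares)

key = os.urandom(32)                      # stand-in for the security system private key
shares = split_key(key, 5)                # e.g. one share per computing resource
assert combine_shares(shares) == key      # no single resource ever holds the whole key
```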

  16. Enhanced distributed energy resource system

    DOEpatents

    Atcitty, Stanley; Clark, Nancy H.; Boyes, John D.; Ranade, Satishkumar J.

    2007-07-03

    A power transmission system including a direct current power source electrically connected to a conversion device for converting direct current into alternating current, the conversion device connected to a power distribution system through a junction, and an energy storage device capable of producing direct current connected to a converter, where the converter, such as an insulated gate bipolar transistor, converts direct current from the energy storage device into alternating current and supplies the current to the junction and subsequently to the power distribution system. A microprocessor controller, connected to a sampling and feedback module and to the converter, determines when the load current exceeds a set threshold value and triggers the converter to supply supplemental current to the power transmission system.
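
    A toy sketch of the supervisory logic (in Python; the threshold and load samples are invented, and the real controller works on sampled hardware feedback rather than a list): whenever the load current exceeds the set threshold, the converter is commanded to supply the excess from storage.

```python
def supervise_converter(load_samples_amps, threshold_amps=100.0):
    """Toy supervisory loop: when the sampled load exceeds the set threshold,
    command the converter to inject the excess as supplemental current from
    storage. The threshold and sample values are invented."""
    commands = []
    for load in load_samples_amps:
        if load > threshold_amps:
            commands.append(("converter_on", load - threshold_amps))
        else:
            commands.append(("converter_off", 0.0))
    return commands

print(supervise_converter([80.0, 95.0, 130.0, 110.0, 70.0]))
```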

  17. Distributed Supervisory Protection Interlock System

    SciTech Connect

    Walz, H.V.; Agostini, R.C.; Barker, L.; Cherkassky, R.; Constant, T.; Matheson, R.

    1989-03-01

    The Distributed Supervisory Protection Interlock System, DSPI, is under development at the Stanford Linear Accelerator Center for requirements in the areas of personnel protection, beam containment and equipment protection interlocks. The DSPI system, distributed over the application site, consists of segments with microprocessor-based controller and I/O modules, local area networks for communication, and a global supervisor computer. Segments are implemented with commercially available controller and I/O modules arranged in local interlock clusters, and associated software. Segments provide local interlock data acquisition, processing and control. Local area networks provide the communication backbone between segments and a global supervisor processor. The supervisor processor monitors the overall system, reports detail status and provides human interfaces. Details of an R and D test system, which will implement the requirements for personnel protection of 4 typical linear accelerator sectors, will be described. 4 refs., 2 figs.

  18. SAMICS marketing and distribution model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    SAMICS (Solar Array Manufacturing Industry Costing Standards) was formulated as a computer simulation model. Given a proper description of the manufacturing technology as input, this model computes the manufacturing price of solar arrays for a broad range of production levels. This report presents a model for computing the associated marketing and distribution costs, the end point of the model being the loading dock of the final manufacturer.

  19. Distribution-free discriminant analysis

    SciTech Connect

    Burr, T.; Doak, J.

    1997-05-01

    This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.
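
    The report's module is not reproduced here; as a generic, hedged illustration of the distribution-free idea, the Python sketch below classifies points by a k-nearest-neighbour majority vote, which assumes no parametric form for either class. The two Gaussian training clusters, the value of k and all names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(4)

def knn_classify(train_x, train_y, query_x, k=5):
    """Distribution-free discriminant: label each query point by a majority vote
    of its k nearest training points; no distributional form is assumed."""
    train_x, train_y, query_x = map(np.asarray, (train_x, train_y, query_x))
    predictions = []
    for q in query_x:
        distances = np.linalg.norm(train_x - q, axis=1)
        votes = train_y[np.argsort(distances)[:k]]
        predictions.append(np.bincount(votes).argmax())
    return np.array(predictions)

# Two invented Gaussian classes in the plane
x = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(3.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(knn_classify(x, y, [[0.2, -0.1], [2.8, 3.1]]))   # expected -> [0 1]
```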

  20. Power management and distribution technology

    NASA Technical Reports Server (NTRS)

    Dickman, John Ellis

    1993-01-01

    Power management and distribution (PMAD) technology is discussed in the context of developing working systems for a piloted Mars nuclear electric propulsion (NEP) vehicle. The discussion is presented in vugraph form. The following topics are covered: applications and systems definitions; high performance components; the Civilian Space Technology Initiative (CSTI) high capacity power program; fiber optic sensors for power diagnostics; high temperature power electronics; 200 C baseplate electronics; high temperature component characterization; a high temperature coaxial transformer; and a silicon carbide mosfet.

  1. Charge Distribution in Mesospheric Clouds

    SciTech Connect

    Misra, Shikha; Mishra, S. K.; Sodha, M. S.

    2011-11-29

    This work presents an analytical model for the physical understanding of the charge distribution on pure (with high work function) and dirty (with low work function) ice dust particles in polar mesospheric clouds PMCs (NLCs and PMSEs). The analysis is based on number and energy balance of constituents and allows the charge to be only an integral multiple (positive or negative) of the electronic charge.

  2. The data distribution satellite system

    NASA Technical Reports Server (NTRS)

    Bruno, Ronald C.; Weinberg, Aaron

    1991-01-01

    The Data Distribution Satellite (DDS) will be capable of providing the space research community with inexpensive and easy access to space payloads and space data. Furthermore, the DDS is shown to be a natural outgrowth of advances and evolution in both NASA's Space Network and commercial satellite communications. The roadmap and timescale for this evolution are described, along with key demonstrations, proof-of-concept models, and the required technology development that will support the projected system evolution toward the DDS.

  3. Small Aircraft Data Distribution System

    NASA Technical Reports Server (NTRS)

    Chazanoff, Seth L.; Dinardo, Steven J.

    2012-01-01

    The CARVE Small Aircraft Data Distribution System acquires the aircraft location and attitude data that is required by the various programs running on a distributed network. This system distributes the data it acquires to the data acquisition programs for inclusion in their data files. It uses UDP (User Datagram Protocol) to broadcast data over a LAN (Local Area Network) to any programs that might have a use for the data. The program is easily adaptable to acquire additional data and log that data to disk. The current version also drives displays using precision pitch and roll information to aid the pilot in maintaining a level-level attitude for radar/radiometer mapping beyond the degree available by flying visually or using a standard gyro-driven attitude indicator. The software is designed to acquire an array of data to help the mission manager make real-time decisions as to the effectiveness of the flight. This data is displayed for the mission manager and broadcast to the other experiments on the aircraft for inclusion in their data files. The program also drives real-time precision pitch and roll displays for the pilot and copilot to aid them in maintaining the desired attitude, when required, during data acquisition on mapping lines.
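
    A minimal Python sketch of the broadcast idea (not the CARVE software; the port, broadcast address and record fields are invented): each location/attitude record is serialized and sent as a UDP datagram that any program on the LAN may receive and log.

```python
import json
import socket
import time

# Invented LAN broadcast address and port; the actual CARVE values are not given
BROADCAST_ADDR, PORT = "255.255.255.255", 5005

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

def broadcast_attitude(lat, lon, alt_m, pitch_deg, roll_deg):
    """Send one location/attitude record as a UDP datagram to any LAN listener."""
    record = {"t": time.time(), "lat": lat, "lon": lon,
              "alt_m": alt_m, "pitch_deg": pitch_deg, "roll_deg": roll_deg}
    sock.sendto(json.dumps(record).encode(), (BROADCAST_ADDR, PORT))

broadcast_attitude(64.84, -147.72, 3200.0, 0.4, -0.2)   # invented example values
```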

  4. CEBAF Distributed Data Acquisition System

    SciTech Connect

    Trent Allison; Thomas Powers

    2005-05-01

    There are thousands of signals distributed throughout Jefferson Lab's Continuous Electron Beam Accelerator Facility (CEBAF) that are useful for troubleshooting and identifying instabilities. Many of these signals are only available locally or monitored by systems with small bandwidths that cannot identify fast transients. The Distributed Data Acquisition (Dist DAQ) system will sample and record these signals simultaneously at rates up to 40 Msps. Its primary function will be to provide waveform records from signals throughout CEBAF to the Experimental Physics and Industrial Control System (EPICS). The waveforms will be collected after the occurrence of an event trigger. These triggers will be derived from signals such as periodic timers or accelerator faults. The waveform data can then be processed to quickly identify beam transport issues, thus reducing down time and increasing CEBAF performance. The Dist DAQ system will be comprised of multiple standalone chassis distributed throughout CEBAF. They will be interconnected via a fiber optic network to facilitate the global triggering of events. All of the chassis will also be connected directly to the CEBAF Ethernet and run EPICS locally. This allows for more flexibility than the typical configuration of a single board computer and other custom printed circuit boards (PCB) installed in a card cage.

  5. DIstributed VIRtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, Clifford B.

    1995-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the Virtual System Model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  6. Distributed Relaxation for Conservative Discretizations

    NASA Technical Reports Server (NTRS)

    Diskin, Boris; Thomas, James L.

    2001-01-01

    A multigrid method is defined as having textbook multigrid efficiency (TME) if the solutions to the governing system of equations are attained in a computational work that is a small (less than 10) multiple of the operation count in one target-grid residual evaluation. The way to achieve this efficiency is the distributed relaxation approach. TME solvers employing distributed relaxation have already been demonstrated for nonconservative formulations of high-Reynolds-number viscous incompressible and subsonic compressible flow regimes. The purpose of this paper is to provide foundations for applications of distributed relaxation to conservative discretizations. A direct correspondence between the primitive variable interpolations for calculating fluxes in conservative finite-volume discretizations and stencils of the discretized derivatives in the nonconservative formulation has been established. Based on this correspondence, one can arrive at a conservative discretization which is very efficiently solved with a nonconservative relaxation scheme; this is demonstrated for a conservative discretization of the quasi-one-dimensional Euler equations. Formulations for both staggered and collocated grid arrangements are considered and extensions of the general procedure to multiple dimensions are discussed.

  7. Distributed Virtual System (DIVIRS) Project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1993-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on contract NCC 2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to program parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the virtual system model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  8. Redshifts distribution in A262

    NASA Astrophysics Data System (ADS)

    Hassan, M. S. R.; Abidin, Z. Z.; Ibrahim, U. F. S. U.; Hashim, N.; Lee, D. A. A.

    2016-05-01

    Galaxy clusters are the largest virialized systems in the Universe, containing a collection of galaxies of different redshifts. The redshift distribution of galaxies in galaxy clusters is concentrated in a certain redshift range, which tells us that only the galaxies in a certain radial range belong to the galaxy cluster. This leads to a boundary estimation of the cluster. Background and foreground systems are represented by a histogram that determines whether some of the galaxies are too distant or too high in redshift to be counted as members of the cluster. With the recent advances in multifibre spectroscopy, it has become possible to perform detailed analysis of the redshift distribution of several galaxy clusters in the Abell Catalogue. This has given rise to significantly improved estimates of cluster membership, extent and dynamical history. Here we present a spectroscopic analysis of the galaxy cluster A262. We find that 55 galaxies fall between z = 0.0143 and z = 0.0183, with velocities in the range 4450-5300 km s-1, and are therefore members of the cluster. We derive a new mean redshift of z = 0.016 173 ± 0.000 074 (4852 ± 22 km s-1) for the system, which we compare with our neutral hydrogen (H I) detection, which peaks at 4970 ± 0.5 km s-1. We find that the H I distribution tends to be located at the edge of the cluster, since most of the spiral-rich galaxies lie away from the cluster centre.

  9. Energy conservation in electric distribution

    SciTech Connect

    Lee, Chong-Jin

    1994-12-31

    This paper discusses the potential for energy and power savings that exist in electric power delivery systems. These savings translate into significant financial and environmental benefits for electricity producers and consumers as well as for society in general. AlliedSignal's knowledge and perspectives on this topic are the result of discussions with hundreds of utility executives, government officials and other industry experts over the past decade in conjunction with marketing our Amorphous Metal technology for electric distribution transformers. Amorphous metal is a technology developed by AlliedSignal that significantly reduces the energy lost in electric distribution transformers at an incremental cost of just a few cents per kilowatt-hour. The purpose of this paper is to discuss: Amorphous Metal Alloy Technology; Energy Savings Opportunity; The Industrial Barriers and Remedies; Worldwide Demand; and A Low Risk Strategy. I hope this presentation will help KEPCO achieve their stated aims of ensuring sound development of the national economy and enhancement of public life through the economic and stable supply of electric power. AlliedSignal Korea Ltd. in conjunction with AlliedSignal Amorphous Metals in the U.S. are here to work with KEPCO, transformer manufacturers, industry, and government agencies to achieve greater efficiency in power distribution.

  10. Jefferson Lab's Distributed Data Acquisition

    SciTech Connect

    Trent Allison; Thomas Powers

    2006-05-01

    Jefferson Lab's Continuous Electron Beam Accelerator Facility (CEBAF) occasionally experiences fast intermittent beam instabilities that are difficult to isolate and result in downtime. The Distributed Data Acquisition (Dist DAQ) system is being developed to detect and quickly locate such instabilities. It will consist of multiple Ethernet based data acquisition chassis distributed throughout the seven-eighths-of-a-mile CEBAF site. Each chassis will monitor various control system signals that are only available locally and/or monitored by systems with small bandwidths that cannot identify fast transients. The chassis will collect data at rates up to 40 Msps in circular buffers that can be frozen and unrolled after an event trigger. These triggers will be derived from signals such as periodic timers or accelerator faults and be distributed via a custom fiber optic event trigger network. This triggering scheme will allow all the data acquisition chassis to be triggered simultaneously and provide a snapshot of relevant CEBAF control signals. The data will then be automatically analyzed for frequency content and transients to determine if and where instabilities exist.
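
    A toy Python model of the freeze-and-unroll behaviour of one acquisition channel (not the Dist DAQ firmware; the buffer depth and samples are invented): samples fill a circular buffer until a trigger arrives, at which point the buffer is frozen and read out oldest-to-newest.

```python
from collections import deque

class TriggeredRecorder:
    """Toy model of one channel: a circular buffer frozen and unrolled on a trigger."""

    def __init__(self, depth=1024):
        self.buffer = deque(maxlen=depth)   # oldest samples fall off automatically
        self.frozen = None

    def acquire(self, sample):
        if self.frozen is None:             # keep filling until a trigger arrives
            self.buffer.append(sample)

    def trigger(self):
        self.frozen = list(self.buffer)     # unroll: snapshot, oldest to newest
        return self.frozen

rec = TriggeredRecorder(depth=8)
for sample in range(20):                    # pretend stream of digitized samples
    rec.acquire(sample)
print(rec.trigger())                        # the last 8 samples before the fault trigger
```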

  11. Probability distributions of turbulent energy.

    PubMed

    Momeni, Mahdi; Müller, Wolf-Christian

    2008-05-01

    Probability density functions (PDFs) of scale-dependent energy fluctuations, P[δE(l)], are studied in high-resolution direct numerical simulations of Navier-Stokes and incompressible magnetohydrodynamic (MHD) turbulence. MHD flows with and without a strong mean magnetic field are considered. For all three systems it is found that the PDFs of inertial-range energy fluctuations exhibit self-similarity and monoscaling in agreement with recent solar-wind measurements [Hnat, Geophys. Res. Lett. 29, 86 (2002)]. Furthermore, the energy PDFs exhibit similarity over all scales of the turbulent system, showing no substantial qualitative change of shape as the scale of the fluctuations varies. This is in contrast to the well-known behavior of PDFs of turbulent velocity fluctuations. In all three cases under consideration, P[δE(l)] resembles a Lévy-type gamma distribution, approximately Δ^(-1) exp(-|δE|/Δ) |δE|^(-γ). The observed gamma distributions exhibit a scale-dependent width Δ(l) and a system-dependent γ. The monoscaling property reflects the inertial-range scaling of the Elsässer-field fluctuations due to the lack of Galilean invariance of δE. The appearance of Lévy distributions is made plausible by a simple model of energy transfer. PMID:18643170

  12. DIstributed VIRtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1994-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the Virtual System Model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  13. Antiquark distributions in the proton

    SciTech Connect

    Brooks, M.; Carey, T.; Garvey, G.

    1997-07-01

    This is the final report of a three-year Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The study of quark and antiquark distributions in the nucleon has been a major endeavor in nuclear and particle physics. Results from a recent deep-inelastic scattering experiment suggest the surprising possibility that the up and down antiquark distributions in the proton are not symmetric. A sensitive and direct determination of the antiquark distributions in the proton can be made by comparing the Drell-Yan cross sections on hydrogen versus deuterium targets. The authors have proposed a new experiment (E866) at Fermilab to carry out such measurements. E866 has been taking data since September 1996. Preliminary results show that the apparatus is working very well. The authors anticipate having seven months of beam in 1997, which would allow them to achieve the sensitivities for a definitive measurement of flavor symmetry of sea quarks in the proton.

  14. Preliminary Iron Distribution on Vesta

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, David W.

    2013-01-01

    The distribution of iron on the surface of the asteroid Vesta was investigated using Dawn's Gamma Ray and Neutron Detector (GRaND) [1,2]. Iron varies predictably with rock type for the howardite, eucrite, and diogenite (HED) meteorites, thought to be representative of Vesta. The abundance of Fe in howardites ranges from about 12 to 15 wt.%. Basaltic eucrites have the highest abundance, whereas, lower crustal and upper mantle materials (cumulate eucrites and diogenites) have the lowest, and howardites are intermediate [3]. We have completed a mapping study of 7.6 MeV gamma rays produced by neutron capture by Fe as measured by the bismuth germanate (BGO) detector of GRaND [1]. The procedures to determine Fe counting rates are presented in detail here, along with a preliminary distribution map, constituting the necessary initial step to quantification of Fe abundances. We find that the global distribution of Fe counting rates is generally consistent with independent mineralogical and compositional inferences obtained by other instruments on Dawn such as measurements of pyroxene absorption bands by the Visual and Infrared Spectrometer (VIR) [4] and Framing Camera (FC) [5] and neutron absorption measurements by GRaND [6].

  15. Distributed Virtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1993-01-01

    As outlined in the continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC 2-539, the investigators are developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; developing communications routines that support the abstractions implemented; continuing the development of file and information systems based on the Virtual System Model; and incorporating appropriate security measures to allow the mechanisms developed to be used on an open network. The goal throughout the work is to provide a uniform model that can be applied to both parallel and distributed systems. The authors believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. The work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  16. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios and in particular interoperability are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-) automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  17. Infrastructure for distributed enterprise simulation

    SciTech Connect

    Johnson, M.M.; Yoshimura, A.S.; Goldsby, M.E.

    1998-01-01

    Traditional discrete-event simulations employ an inherently sequential algorithm and are run on a single computer. However, the demands of many real-world problems exceed the capabilities of sequential simulation systems. Often the capacity of a computer's primary memory limits the size of the models that can be handled, and in some cases parallel execution on multiple processors could significantly reduce the simulation time. This paper describes the development of an Infrastructure for Distributed Enterprise Simulation (IDES) - a large-scale portable parallel simulation framework developed to support Sandia National Laboratories' mission in stockpile stewardship. IDES is based on the Breathing-Time-Buckets synchronization protocol, and maps a message-based model of distributed computing onto an object-oriented programming model. IDES is portable across heterogeneous computing architectures, including single-processor systems, networks of workstations and multi-processor computers with shared or distributed memory. The system provides a simple and sufficient application programming interface that can be used by scientists to quickly model large-scale, complex enterprise systems. In the background and without involving the user, IDES is capable of making dynamic use of idle processing power available throughout the enterprise network. 16 refs., 14 figs.

  18. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    A number of topics related to building a generalized distributed system model are discussed. The effects of distributed database modeling on evaluation of transaction rollbacks, the measurement of effects of distributed database models on transaction availability measures, and a performance analysis of static locking in replicated distributed database systems are covered.

  19. 21 CFR 820.160 - Distribution.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Distribution. 820.160 Section 820.160 Food and... QUALITY SYSTEM REGULATION Handling, Storage, Distribution, and Installation § 820.160 Distribution. (a) Each manufacturer shall establish and maintain procedures for control and distribution of...

  20. 40 CFR 152.132 - Supplemental distribution.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Supplemental distribution. 152.132... Supplemental distribution. The registrant may distribute or sell his registered product under another person's name and address instead of (or in addition to) his own. Such distribution and sale is...

  1. 21 CFR 820.160 - Distribution.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Distribution. 820.160 Section 820.160 Food and... QUALITY SYSTEM REGULATION Handling, Storage, Distribution, and Installation § 820.160 Distribution. (a) Each manufacturer shall establish and maintain procedures for control and distribution of...

  2. 40 CFR 152.132 - Supplemental distribution.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Supplemental distribution. 152.132... Supplemental distribution. The registrant may distribute or sell his registered product under another person's name and address instead of (or in addition to) his own. Such distribution and sale is...

  3. 30 CFR 57.12006 - Distribution boxes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Distribution boxes. 57.12006 Section 57.12006... and Underground § 57.12006 Distribution boxes. Distribution boxes shall be provided with a... deenergized, and the distribution box shall be labeled to show which circuit each device controls....

  4. 30 CFR 57.12006 - Distribution boxes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Distribution boxes. 57.12006 Section 57.12006... and Underground § 57.12006 Distribution boxes. Distribution boxes shall be provided with a... deenergized, and the distribution box shall be labeled to show which circuit each device controls....

  5. Television Distribution System for Primary Schools.

    ERIC Educational Resources Information Center

    South Australia Education Dept., Adelaide.

    This report covers a 12-month study and actual trial of a video distribution system for a primary school. It consisted of a main aerial distribution into a distribution junction box which also took video cassette recorders. The whole system was designed to distribute both in RF and video frequencies. Some ways of using the system have also been…

  6. World-wide distribution automation systems

    SciTech Connect

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems.

  7. Statistical Physics for Adaptive Distributed Control

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A viewgraph presentation on statistical physics for distributed adaptive control is shown. The topics include: 1) The Golden Rule; 2) Advantages; 3) Roadmap; 4) What is Distributed Control? 5) Review of Information Theory; 6) Iterative Distributed Control; 7) Minimizing L(q) Via Gradient Descent; and 8) Adaptive Distributed Control.

  8. M-protein gene-type distribution and hyaluronic acid capsule in group A Streptococcus clinical isolates in Chile: association of emm gene markers with csrR alleles.

    PubMed

    Wozniak, A; Rojas, P; Rodríguez, C; Undabarrena, A; Garate, C; Riedel, I; Román, J C; Kalergis, A M; García, P

    2012-07-01

    Streptococcus pyogenes causes a variety of infections because of virulence factors such as capsular hyaluronic acid and M protein. The aim of this study was to determine emm types and capsule phenotype in 110 isolates of S. pyogenes from patients with invasive (sterile sites) and non-invasive (mainly pharyngitis) infections in Chile, and the relationship between both virulence factors. The most abundant types found were emm12, emm1, emm4 and emm28 and their distribution was similar to that seen in Latin America and developed countries, but very different from that in Asia and Pacific Island countries. Ten of 16 emm types identified in pharyngeal isolates were found in sterile-site isolates, and three of nine emm types of sterile-site isolates occurred in pharyngeal isolates; three emm subtypes were novel. The amount of hyaluronic acid was significantly higher in sterile-site isolates but did not differ substantially among emm types. Only three isolates were markedly capsulate and two of them had mutations in the csrR gene that codes for a repressor of capsule synthesis genes. We found a non-random association between emm types and csrR gene alleles suggesting that horizontal gene transfer is not freely occurring in the population. PMID:21906413

  9. Problem solving in a distributed environment

    NASA Technical Reports Server (NTRS)

    Rashid, R. F.

    1980-01-01

    Distributed problem solving is analyzed as a blend of two disciplines: (1) problem solving and AI; and (2) distributed systems (monitoring). It may be necessary to distribute because the application itself is one of managing distributed resources (e.g., distributed sensor net) and communication delays preclude centralized processing, or it may be desirable to distribute because a single computational engine may not satisfy the needs of a given task. In addition, considerations of reliability may dictate distribution. Examples of a multi-process language environment are given.

  10. Invariant distributions on compact homogeneous spaces

    SciTech Connect

    Gorbatsevich, V V

    2013-12-31

    In this paper, we study distributions on compact homogeneous spaces, including invariant distributions and also distributions admitting a sub-Riemannian structure. We first consider distributions of dimension 1 and 2 on compact homogeneous spaces. After this, we study the cases of compact homogeneous spaces of dimension 2, 3, and 4 in detail. Invariant distributions on simply connected compact homogeneous spaces are also treated. Bibliography: 18 titles.

  11. A distributed clients/distributed servers model for STARCAT

    NASA Technical Reports Server (NTRS)

    Pirenne, B.; Albrecht, M. A.; Durand, D.; Gaudet, S.

    1992-01-01

    STARCAT, the Space Telescope ARchive and CATalogue user interface, has been around for a number of years already. During this time it has been enhanced and augmented in a number of different fields. This time, we would like to dwell on a new capability allowing geographically distributed user interfaces to connect to geographically distributed data servers. This new concept permits users anywhere on the internet running STARCAT on their local hardware to access, e.g., whichever of the 3 existing HST archive sites is available, to get information on the CFHT archive through a transparent connection to the CADC in BC, or to get the La Silla weather by connecting to the ESO database in Munich during the same session. Similarly, PreView (or quick look) images and spectra will also flow directly to the user from wherever they are available. Moving towards an 'X'-based STARCAT is another goal being pursued: a graphic/image server and a help/doc server are currently being added to it. They should further enhance user independence and access transparency.

  12. The pulsar spectral index distribution

    NASA Astrophysics Data System (ADS)

    Bates, S. D.; Lorimer, D. R.; Verbiest, J. P. W.

    2013-05-01

    The flux-density spectra of radio pulsars are known to be steep and, to first order, described by a power-law relationship of the form S_ν ∝ ν^α, where S_ν is the flux density at some frequency ν and α is the spectral index. Although measurements of α have been made over the years for several hundred pulsars, a study of the intrinsic distribution of pulsar spectra has not been carried out. From the results of pulsar surveys carried out at three different radio frequencies, we use population synthesis techniques and a likelihood analysis to deduce what underlying spectral index distribution is required to replicate the results of these surveys. We find that in general the results of the surveys can be modelled by a Gaussian distribution of spectral indices with a mean of -1.4 and unit standard deviation. We also consider the impact of the so-called gigahertz-peaked spectrum pulsars proposed by Kijak et al. The fraction of peaked-spectrum sources in the population with any significant turnover at low frequencies appears to be at most 10 per cent. We demonstrate that high-frequency (>2 GHz) surveys preferentially select flatter spectrum pulsars and the converse is true for lower frequency (<1 GHz) surveys. This implies that any correlations between α and other pulsar parameters (for example age or magnetic field) need to carefully account for selection biases in pulsar surveys. We also expect that many known pulsars which have been detected at high frequencies will have shallow, or positive, spectral indices. The majority of pulsars do not have recorded flux density measurements over a wide frequency range, making it impossible to constrain their spectral shapes. We also suggest that such measurements would allow an improved description of any populations of pulsars with 'non-standard' spectra. Further refinements to this picture will soon be possible from the results of surveys with the Green Bank Telescope and LOFAR.
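
    A small Python sketch of the power-law spectrum and of drawing spectral indices from the fitted Gaussian (mean -1.4, unit standard deviation); the reference flux, frequencies and sample size are invented and this is not the paper's population-synthesis code.

```python
import numpy as np

rng = np.random.default_rng(1)

def flux_density(s_ref_mjy, nu_ghz, nu_ref_ghz=1.4, alpha=-1.4):
    """Power-law spectrum S_nu = S_ref * (nu / nu_ref)**alpha (values invented)."""
    return s_ref_mjy * (nu_ghz / nu_ref_ghz) ** alpha

# Draw spectral indices from the fitted Gaussian distribution (mean -1.4, sigma 1)
alphas = rng.normal(-1.4, 1.0, size=100_000)

# Fraction of the simulated population that is brighter at 3 GHz than at 1.4 GHz,
# i.e. the flat- or inverted-spectrum pulsars that high-frequency surveys favour
print(np.mean(flux_density(1.0, 3.0, alpha=alphas) > flux_density(1.0, 1.4, alpha=alphas)))
```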

  13. Atlas of quasar energy distributions

    NASA Technical Reports Server (NTRS)

    Elvis, Martin; Wilkes, Belinda J.; Mcdowell, Jonathan C.; Green, Richard F.; Bechtold, Jill; Willner, S. P.; Oey, M. S.; Polomski, Elisha; Cutri, Roc

    1994-01-01

    We present an atlas of the spectral energy distributions (SEDs) of normal, nonblazar, quasars over the whole available range (radio to 10 keV X-rays) of the electromagnetic spectrum. The primary (UVSX) sample includes 47 quasars for which the spectral energy distributions include X-ray spectral indices and UV data. Of these, 29 are radio quiet, and 18 are radio loud. The SEDs are presented both in figures and in tabular form, with additional tabular material published on CD-ROM. Previously unpublished observational data for a second set of quasars excluded from the primary sample are also tabulated. The effects of host galaxy starlight contamination and foreground extinction on the UVSX sample are considered and the sample is used to investigate the range of SED properties. Of course, the properties we derive are influenced strongly by the selection effects induced by quasar discovery techniques. We derive the mean energy distribution (MED) for radio-loud and radio-quiet objects and present the bolometric corrections derived from it. We note, however, that the dispersion about this mean is large (approximately one decade for both the infrared and ultraviolet components when the MED is normalized at the near-infrared inflection). At least part of the dispersion in the ultraviolet may be due to time variability, but this is unlikely to be important in the infrared. The existence of such a large dispersion indicates that the MED reflects only some of the properties of quasars and so should be used only with caution.

  14. Force distributions in granular materials

    NASA Astrophysics Data System (ADS)

    Jaeger, Heinrich M.

    2002-03-01

    A fundamental problem in the study of disordered materials concerns the propagation of forces. Static granular media, such as sand particles inside a rigid container, have emerged as an important model system as they embody the zero temperature limit of disordered materials comprised of hard-sphere repulsive particles. In this talk, I will review recent results on the distribution of forces along the boundaries of granular material subjected to an applied load. While the spatial distribution of mean forces sensitively reflects the (macroscopic) packing structure of the material, the ensemble-averaged probability distribution of force fluctuations around the mean value, P(f), exhibits universal behavior. The shape of P(f) is found to be independent not only of the macroscopic packing arrangement but also of the inter-particle friction and, over a wide range, of the applied external stress. This shape is characterized by an exponential decay in the probability density for fluctuations above the mean force and only a small reduction, by no more than a factor two, for fluctuations below the mean [1]. Surprisingly, the exponential, non-Gaussian behavior appears to hold up even in the case of highly compressible grains, and it also has been observed in simulations of supercooled liquids [2]. I will discuss the implications of these findings on our current understanding of stress transmission in disordered media in general, and on glassy behavior in particular. [1] D. L. Blair, N. W. Mueggenburg, A. H. Marshall, H. M. Jaeger, and S. R. Nagel, Phys. Rev. E 63, 041304 (2001). [2] S. O’Hern, S. A. Langer, A. J. Liu, and S. R. Nagel, Phys. Rev. Lett. 86, 111 (2001). * Work performed in collaboration with D. L. Blair, J. M. Erikson, A. H. Marshall, N. W. Mueggenburg, and S. R. Nagel.

  15. Universality of Charged Multiplicity Distributions

    SciTech Connect

    Goulianos, K.; /Rockefeller U.

    1981-12-01

    The charged multiplicity distributions of the diffractive and non-diffractive components of hadronic interactions, as well as those of hadronic states produced in other reactions, are described well by a universal Gaussian function that depends only on the available mass for pionization, has a maximum at n_0 ≈ 2M^(1/2), where M is the available mass in GeV, and a peak-to-width ratio n_0/D ≈ 2.

  16. The Distributed Processing Library (DPL)

    NASA Astrophysics Data System (ADS)

    Allan, D. J.

    The Distributed Processing Library (DPL) provides multiple processing services across heterogeneous collections of UNIX workstations for the ASTERIX data analysis package. The DPL programmer provides a worker task to perform the units of parallel computation, and writes the flow control logic in the client using DPL to manage queues of jobs on multiple workers. DPL conceals the interprocess communication from the client and worker processes allowing existing sequential algorithms to be adapted easily. The system has been tested on a mixture of machines running Solaris and OSF, and has shown that the library is useful for units of computation taking as little as 50 milliseconds.

  17. Flux distributions in jointed ? tapes

    NASA Astrophysics Data System (ADS)

    Koblischka, M. R.; Johansen, T. H.; Bratsberg, H.; Vase, P.

    1998-06-01

    Superconducting joints between monofilamentary, Ag-sheathed tapes were investigated by means of magneto-optic imaging. Two types of joint were studied; one joint with direct contact between the tape cores, and the other one with an Ag layer between them. The local flux distributions directly reveal the obstacles hindering the current flow through the joints. The direct contact of the tape cores provides joints which can carry about 80% of the current of the original tape, whereas the joints with the Ag layer are considerably worse. This difference becomes even more drastic in applied magnetic fields.

  18. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, with GPU-based technology maturing in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to a traditional microprocessor, the modern GPU, as a compelling alternative microprocessor, has outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as specific graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. Reference: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. Visualization and Computer Graphics, IEEE

  19. Distributed intelligence for supervisory control

    NASA Technical Reports Server (NTRS)

    Wolfe, W. J.; Raney, S. D.

    1987-01-01

    Supervisory control systems must deal with various types of intelligence distributed throughout the layers of control. Typical layers are real-time servo control, off-line planning and reasoning subsystems and finally, the human operator. Design methodologies must account for the fact that the majority of the intelligence will reside with the human operator. Hierarchical decompositions and feedback loops as conceptual building blocks that provide a common ground for man-machine interaction are discussed. Examples of types of parallelism and parallel implementation on several classes of computer architecture are also discussed.

  20. Synchronous Sampling for Distributed Experiments

    NASA Astrophysics Data System (ADS)

    Wittkamp, M.; Ettl, J.

    2015-09-01

    Sounding rocket payloads, especially for atmospheric research, often consist of several independent sensors or experiments with different objectives. The data from these sensors can be combined in post-processing to improve the scientific results of the flight. One major requirement for this data correlation is a common timeline for the measurements of the distributed experiments. Within this paper we present two ways to achieve absolute timing for asynchronously working experiments. The synchronization process uses the Global Positioning System (GPS) and a standard serial communication protocol for the transport of timestamps and flight states.

  1. Distributed Control with Collective Intelligence

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Wheeler, Kevin R.; Tumer, Kagan

    1998-01-01

    We consider systems of interacting reinforcement learning (RL) algorithms that do not work at cross purposes, in that their collective behavior maximizes a global utility function. We call such systems COllective INtelligences (COINs). We present the theory of designing COINs. Then we present experiments validating that theory in the context of two distributed control problems: We show that COINs perform near-optimally in a difficult variant of Arthur's bar problem [Arthur] (and in particular avoid the tragedy of the commons for that problem), and we also illustrate optimal performance in the master-slave problem.

  2. Video distribution system cost model

    NASA Technical Reports Server (NTRS)

    Gershkoff, I.; Haspert, J. K.; Morgenstern, B.

    1980-01-01

    A cost model that can be used to systematically identify the costs of procuring and operating satellite linked communications systems is described. The user defines a network configuration by specifying the location of each participating site, the interconnection requirements, and the transmission paths available for the uplink (studio to satellite), downlink (satellite to audience), and voice talkback (between audience and studio) segments of the network. The model uses this information to calculate the least expensive signal distribution path for each participating site. Cost estimates are broken down by capital, installation, lease, operations and maintenance. The design of the model permits flexibility in specifying network and cost structure.
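
    A minimal Python sketch of the path-selection step (not the actual cost model; the path names and cost figures are invented): for each site, the candidate distribution paths are costed by capital, installation, lease, and operations and maintenance, and the cheapest is chosen.

```python
def cheapest_path(site, candidate_paths):
    """Pick the least expensive signal distribution path for one site.

    candidate_paths maps a path name to its cost components; the names and
    numbers in the example below are invented, not taken from the report."""
    def total(cost):
        return (cost["capital"] + cost["installation"]
                + cost["lease"] + cost["operations_maintenance"])
    best = min(candidate_paths, key=lambda p: total(candidate_paths[p]))
    return site, best, total(candidate_paths[best])

paths = {
    "direct_downlink":   {"capital": 12000, "installation": 2500, "lease": 0,    "operations_maintenance": 800},
    "terrestrial_relay": {"capital": 4000,  "installation": 1500, "lease": 3600, "operations_maintenance": 1200},
}
print(cheapest_path("site_042", paths))
```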

  3. Microbial distribution of selenocysteine lyase.

    PubMed Central

    Chocat, P; Esaki, N; Nakamura, T; Tanaka, H; Soda, K

    1983-01-01

    We studied the distribution of selenocysteine lyase, a novel enzyme catalyzing the conversion of selenocysteine into alanine and H2Se, which we first demonstrated in various mammalian tissues (Esaki et al., J. Biol. Chem. 257:4386-4391, 1982). Enzyme activity was found in various bacteria such as Alcaligenes viscolactis and Pseudomonas alkanolytica. No significant activity was found in yeasts and fungi. Selenocysteine lyases from A. viscolactis and P. alkanolytica acted specifically on L-selenocysteine and required pyridoxal 5'-phosphate as a cofactor. PMID:6225771

  4. The CJ12 parton distributions

    SciTech Connect

    Accardi, Alberto; Owens, Jeff F.

    2013-07-01

    Three new sets of next-to-leading order parton distribution functions (PDFs) are presented, determined by global fits to a wide variety of data for hard scattering processes. The analysis includes target mass and higher twist corrections needed for the description of deep-inelastic scattering data at large x and low Q^2, and nuclear corrections for deuterium targets. The PDF sets correspond to three different models for the nuclear effects, and provide a more realistic uncertainty range for the d quark PDF compared with previous fits. Applications to weak boson production at colliders are also discussed.

  5. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.
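
    As a loose sketch of the kind of bookkeeping a distributed Grid accounting service performs (not the paper's methodology), the Python fragment below sums CPU-hour usage records reported by several hypothetical sites and debits them against per-project allocations; the record fields and numbers are invented.

```python
from collections import defaultdict

def settle_usage(usage_records, allocations):
    """Toy settlement step for distributed accounting: sum the CPU-hour usage
    reported by every site for each project and subtract it from the project's
    Grid allocation. Field names and numbers are invented for illustration."""
    used = defaultdict(float)
    for rec in usage_records:                 # records arrive from many independent sites
        used[rec["project"]] += rec["cpu_hours"]
    return {project: allocations[project] - used[project] for project in allocations}

records = [
    {"site": "hpc-a", "project": "climate",  "cpu_hours": 1200.0},
    {"site": "hpc-b", "project": "climate",  "cpu_hours": 800.0},
    {"site": "hpc-a", "project": "genomics", "cpu_hours": 300.0},
]
print(settle_usage(records, {"climate": 5000.0, "genomics": 1000.0}))
# -> {'climate': 3000.0, 'genomics': 700.0}
```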

  6. Heat distribution by natural convection

    SciTech Connect

    Balcomb, J.D.

    1985-01-01

    Natural convection can provide adequate heat distribution in many situations that arise in buildings. This is appropriate, for example, in passive solar buildings where some rooms tend to be more strongly solar heated than others, or to reduce the number of heating units required in a building. Natural airflow and heat transport through doorways and other internal building apertures is predictable and can be accounted for in the design. The nature of natural convection is described, and a design chart is presented appropriate to a simple, single-doorway situation. Natural convective loops that can occur in buildings are described and a few design guidelines are presented.

  7. Workshop on momentum distributions: Summary

    SciTech Connect

    Simmons, R.O.

    1988-01-01

    This has been an extraordinary Workshop touching many branches of physics. The Workshop has treated momentum distributions in fluid and solid condensed matter, in nuclei, and in electronic systems. Both theoretical and experimental concepts and methods have been considered in all these branches. A variety of specific illustrations and applications in physical systems have been presented. One finds that some common unifying themes emerge. One finds, also, that some examples are available to illustrate where one branch is more mature than others and to contrast where expectations for future progress may be most encouraged. 6 refs., 2 figs.

  8. Model-free distributed learning

    NASA Technical Reports Server (NTRS)

    Dembo, Amir; Kailath, Thomas

    1990-01-01

    Model-free learning for synchronous and asynchronous quasi-static networks is presented. The network weights are continuously perturbed, while the time-varying performance index is measured and correlated with the perturbation signals; the correlation output determines the changes in the weights. The perturbation may be either via noise sources or orthogonal signals. The invariance to detailed network structure mitigates large variability between supposedly identical networks as well as implementation defects. This local, regular, and completely distributed mechanism requires no central control and involves only a few global signals. Thus it allows for integrated on-chip learning in large analog and optical networks.
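
    The perturb-and-correlate update described above can be sketched in a few lines. The following Python toy (not the authors' implementation; the quadratic performance index, step sizes and iteration count are invented) perturbs the weights, measures the resulting change in the performance index, and changes the weights in proportion to the correlation of that change with the perturbation.

```python
import numpy as np

rng = np.random.default_rng(2)

def weight_perturbation_step(w, measure, sigma=1e-3, lr=0.05):
    """One model-free update: perturb every weight, measure the change in the
    performance index, and move the weights along the correlation of that
    change with the perturbation. sigma and lr are invented values."""
    dw = sigma * rng.standard_normal(w.shape)
    delta = measure(w + dw) - measure(w)        # measured, no gradients computed
    return w - lr * (delta / sigma ** 2) * dw   # correlation-based weight change

# Invented performance index with its minimum at w = [1, -2]
target = np.array([1.0, -2.0])
measure = lambda w: float(np.sum((w - target) ** 2))
w = np.zeros(2)
for _ in range(3000):
    w = weight_perturbation_step(w, measure)
print(np.round(w, 2))   # approaches [1, -2] without ever computing a gradient
```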

  9. Measurement Of Spectral Power Distribution

    NASA Astrophysics Data System (ADS)

    Moore, J. R.

    1980-11-01

    The majority of spectroradiometers make measurements at a number of discrete wavelength settings spaced evenly across the spectrum. Many modern light sources such as fluorescent or metal halide lamps have complex line spectra which may not be properly evaluated by this method. An automated spectroradiometer system involving a non-stop spectral scan with continuous integration of the output signal from the detector is described. The method is designed to make accurate measurements of all types of spectral power distribution whether made up of lines or continuum or any mixture of the two.

  10. Distributed optimization system and method

    DOEpatents

    Hurtado, John E.; Dohrmann, Clark R.; Robinett, III, Rush D.

    2003-06-10

    A search system and method for controlling multiple agents to optimize an objective using distributed sensing and cooperative control. The search agents can be physical agents, such as robots, or software agents for searching cyberspace. The objective can be: chemical sources, temperature sources, radiation sources, light sources, evaders, trespassers, explosive sources, time-dependent sources, time-independent sources, function surfaces, maximization points, minimization points, and optimal control of a system such as a communication system, an economy, a crane, or a multi-processor computer.
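
    The patent text above is abstract, so the toy sketch below shows one common flavour of distributed-sensing search: agents drift toward the best-reporting team member plus random exploration. It is not the patented control law; the field, agent count and gains are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      def field(p):
          """Hypothetical scalar source (e.g. a chemical plume) peaking at (3, -2)."""
          return np.exp(-0.1 * np.sum((p - np.array([3.0, -2.0])) ** 2, axis=-1))

      agents = rng.uniform(-10.0, 10.0, size=(8, 2))    # distributed sensing team
      for _ in range(300):
          readings = field(agents)                      # each agent senses locally
          best = agents[np.argmax(readings)]            # shared through cooperative control
          agents += 0.05 * (best - agents) + rng.normal(scale=0.1, size=agents.shape)

      print("team centroid:", agents.mean(axis=0))      # should settle near the source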

  11. Plankton distribution and ocean dispersal.

    PubMed

    McManus, Margaret Anne; Woodson, C Brock

    2012-03-15

    Plankton are small organisms that dwell in oceans, seas and bodies of fresh water. In this review, we discuss life in the plankton, which involves a balance between the behavioral capabilities of the organism and the characteristics and movement of the water that surrounds it. In order to consider this balance, we discuss how plankton interact with their environment across a range of scales - from the smallest viruses and bacteria to larger phytoplankton and zooplankton. We find that the larger scale distributions of plankton, observed in coastal waters, along continental shelves and in ocean basins, are highly dependent upon the smaller scale interactions between the individual organism and its environment. Further, we discuss how larger scale organism distributions may affect the transport and/or retention of plankton in the ocean environment. The research reviewed here provides a mechanistic understanding of how organism behavior in response to the physical environment produces planktonic aggregations, which has a direct impact on the way marine ecosystems function. PMID:22357594

  12. Analyzing ion distributions around DNA.

    PubMed

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring reference to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation. PMID:24906882
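
    A minimal sketch of the coordinate idea, assuming an ideal, straight B-DNA helix lying on the z-axis (the published method uses the actual curved helical axis; the ion positions and bin widths here are invented). Each ion is mapped to radial, angular and longitudinal coordinates and the radial density is expressed in molarity.

      import numpy as np

      rng = np.random.default_rng(2)
      ions = rng.uniform(-20.0, 20.0, size=(5000, 3))   # hypothetical ion snapshots, angstroms

      rise, twist = 3.38, np.deg2rad(36.0)              # ideal B-DNA rise per bp and twist
      x, y, z = ions.T
      r = np.hypot(x, y)                                # radial coordinate from the helical axis
      phi = np.mod(np.arctan2(y, x) - twist * (z / rise), 2 * np.pi)   # angle in the rotating helical frame

      # Radial ion density expressed in molarity: counts per cylindrical-shell volume.
      edges = np.linspace(0.0, 20.0, 41)
      counts, _ = np.histogram(r, bins=edges)
      shell_vol = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2) * (z.max() - z.min())  # A^3
      molarity = (counts / shell_vol) / 6.022e-4        # ions per A^3 -> mol/L
      print(molarity[:5])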

  13. Spatial Distributions of Young Stars

    NASA Astrophysics Data System (ADS)

    Kraus, Adam L.; Hillenbrand, Lynne A.

    2008-10-01

    We analyze the spatial distribution of young stars in Taurus-Auriga and Upper Sco, as determined from the two-point correlation function (i.e., the mean surface density of neighbors). The corresponding power-law fits allow us to determine the fractal dimensions of each association's spatial distribution, measure the stellar velocity dispersions, and distinguish between the bound binary population and chance alignments of members. We find that the fractal dimension of Taurus is D ~ 1.05, consistent with its filamentary structure. The fractal dimension of Upper Sco may be even shallower (D ~ 0.7), but this fit is uncertain due to the limited area and possible spatially variable incompleteness. We also find that random stellar motions have erased all primordial structure on scales of ≲0.07° in Taurus and ≲1.7° in Upper Sco; given ages of ~1 and ~5 Myr, the corresponding internal velocity dispersions are ~0.2 and ~1.0 km s^-1, respectively. Finally, we find that binaries can be distinguished from chance alignments at separations of ≲120'' (17,000 AU) in Taurus and ≲75'' (11,000 AU) in Upper Sco. The binary population in these associations that we previously studied, spanning separations of 3''-30'', is dominated by bound binary systems. However, the few lowest mass pairs (M_prim ≲ 0.3 M⊙) might be chance alignments.
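
    A minimal sketch of the surface-density-of-neighbours estimate and the power-law/fractal-dimension fit, using invented positions on a flat patch of sky (the paper uses the real member catalogues and proper angular coordinates; bin choices below are arbitrary).

      import numpy as np

      rng = np.random.default_rng(3)
      pos = rng.uniform(0.0, 10.0, size=(500, 2))       # hypothetical member positions, degrees

      # All pairwise separations (flat-sky approximation).
      d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
      seps = d[np.triu_indices_from(d, k=1)]

      # Mean surface density of neighbours, Sigma(theta), in logarithmic annuli.
      edges = np.logspace(-2, 1, 16)
      counts, _ = np.histogram(seps, bins=edges)
      ring_area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
      sigma = 2.0 * counts / (len(pos) * ring_area)     # neighbours per star per square degree

      # Power-law fit Sigma ~ theta^alpha; for a fractal distribution D ~ alpha + 2.
      centres = np.sqrt(edges[1:] * edges[:-1])
      good = sigma > 0
      alpha, _ = np.polyfit(np.log10(centres[good]), np.log10(sigma[good]), 1)
      print("fractal dimension estimate:", alpha + 2.0)  # close to 2 for this uniform random field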

  14. Distributed computing at the SSCL

    SciTech Connect

    Cormell, L.; White, R.

    1993-05-01

    The rapid increase in the availability of high-performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desktop is well known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory. In addition, a brief review of the future directions of commercial products for distributed computing and management will be given.

  15. ANNs pinpoint underground distribution faults

    SciTech Connect

    Glinkowski, M.T.; Wang, N.C.

    1995-10-01

    Many offline fault location techniques in power distribution circuits involve patrolling along the lines or cables. In overhead distribution lines, most failures can be located quickly by visual inspection without the aid of special equipment. However, locating a fault in underground cable systems is more difficult. It involves additional equipment (e.g., thumpers, radars) to convert the hidden fault in the buried cable into observable signals, such as acoustic sound and electromagnetic pulses. Trained operators must carry the equipment above ground, follow the path of the signal, and draw lines on their maps in order to locate the fault. Sometimes even the smell of a burnt cable is what reveals the problem. These techniques are time consuming, not always reliable, and, as in the case of high-voltage dc thumpers, can cause additional damage to the healthy parts of the cable circuit. Online fault location in power networks that involve interconnected lines (cables) and multiterminal sources continues to receive great attention, but techniques that provide simple and practical solutions have had limited success. This article features a new online fault location technique that uses the pattern-recognition capability of artificial neural networks (ANNs) and exploits new capabilities of modern protective relaying hardware. The output of the neural network can be graphically displayed as a simple three-dimensional (3-D) chart that can provide an operator with an instantaneous indication of the location of the fault.
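
    A minimal sketch of the pattern-recognition idea: a one-hidden-layer network trained to map relay-style features to fault distance. The synthetic features, network size and training loop below are invented for illustration and are not the article's relay measurements or architecture.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical training data: feature vectors for faults placed at known distances
      # along a feeder; a real system would derive these from recorded relay waveforms.
      dist = rng.uniform(0.0, 10.0, size=(2000, 1))                   # fault distance, km
      feats = np.hstack([1.0 / (1.0 + dist), np.sqrt(dist), dist**2 / 100.0])
      feats += rng.normal(scale=0.01, size=feats.shape)               # measurement noise

      # One-hidden-layer perceptron trained by plain full-batch gradient descent.
      W1 = rng.normal(scale=0.5, size=(3, 16)); b1 = np.zeros(16)
      W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
      lr = 0.05
      for _ in range(3000):
          h = np.tanh(feats @ W1 + b1)
          pred = h @ W2 + b2
          err = pred - dist
          # Backpropagate the squared-error loss.
          gW2 = h.T @ err / len(dist); gb2 = err.mean(0)
          dh = (err @ W2.T) * (1 - h**2)
          gW1 = feats.T @ dh / len(dist); gb1 = dh.mean(0)
          W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

      pred = np.tanh(feats @ W1 + b1) @ W2 + b2
      print("mean absolute error (km):", float(np.mean(np.abs(pred - dist))))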

  16. Countercurrent distribution of biological cells

    NASA Technical Reports Server (NTRS)

    1982-01-01

    It is known that the addition of phosphate buffer to two-polymer aqueous phase systems has a strong effect on the partition behavior of cells and other particles in such mixtures. The addition of sodium phosphate to aqueous poly(ethylene glycol)-dextran phase systems causes a concentration-dependent shift of the binodal on the phase diagram, progressively lowering the critical conditions for phase separation as the phosphate concentration is increased. Sodium chloride produces no significant shift in the critical point relative to the salt-free case. Accurate determinations of the phase diagram require measurements of the density of the phases; data are presented that allow this parameter to be calculated from polarimetric measurements of the dextran concentrations of both phases. Increasing polymer concentrations in the phase systems produce an increasing preference of the phosphate for the dextran-rich bottom phase. Equilibrium dialysis experiments showed that poly(ethylene glycol) effectively rejected phosphate, and to a lesser extent chloride, but that dextran had little effect on the distribution of either salt. Increasing the ionic strength via addition of 0.15 M NaCl to phase systems containing 0.01 M phosphate produces an increased concentration of phosphate ions in the bottom dextran-rich phase, the expected effect in this type of Donnan distribution.
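
    As background on the technique named in the title (this arithmetic is not in the abstract): in a Craig-type countercurrent distribution, a species with partition coefficient K spends a fraction p = K/(1+K) of each step in the transferred phase, so after n transfers it spreads over the tubes with binomial weights. The partition coefficients and transfer count below are invented.

      import numpy as np
      from math import comb

      def tube_fractions(K, n):
          """Fraction of material in tube r (r = 0..n) after n transfers."""
          p = K / (1.0 + K)
          return np.array([comb(n, r) * p**r * (1.0 - p)**(n - r) for r in range(n + 1)])

      # Two cell populations with modestly different partition coefficients
      # end up centred in different tubes after 60 transfers.
      print(np.argmax(tube_fractions(1.0, 60)), np.argmax(tube_fractions(1.5, 60)))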

  17. Distributed nestmate recognition in ants

    PubMed Central

    Esponda, Fernando; Gordon, Deborah M.

    2015-01-01

    We propose a distributed model of nestmate recognition, analogous to the one used by the vertebrate immune system, in which colony response results from the diverse reactions of many ants. The model describes how individual behaviour produces colony response to non-nestmates. No single ant knows the odour identity of the colony. Instead, colony identity is defined collectively by all the ants in the colony. Each ant responds to the odour of other ants by reference to its own unique decision boundary, which is a result of its experience of encounters with other ants. Each ant thus recognizes a particular set of chemical profiles as being those of non-nestmates. This model predicts, as experimental results have shown, that the outcome of behavioural assays is likely to be variable, that it depends on the number of ants tested, that response to non-nestmates changes over time and that it changes in response to the experience of individual ants. A distributed system allows a colony to identify non-nestmates without requiring that all individuals have the same complete information and helps to facilitate the tracking of changes in cuticular hydrocarbon profiles, because only a subset of ants must respond to provide an adequate response. PMID:25833853
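
    A minimal sketch of the distributed-recognition idea: each simulated ant carries its own decision boundary learned from a small sample of encounters, and the colony response is simply the fraction of ants that respond. The odour dimensionality, threshold rule and population sizes are invented, not the authors' model parameters.

      import numpy as np

      rng = np.random.default_rng(5)

      # Colony odour profiles live in a hypothetical "hydrocarbon space".
      colony = rng.normal(size=(200, 10))                          # 200 ants, 10-d odour cues
      experience = colony[rng.integers(0, 200, size=(200, 20))]    # each ant's past encounters

      # Each ant's decision boundary: respond if an odour lies farther from its own
      # experience than anything it has met before (no ant knows the full colony odour).
      thresholds = np.array([np.max(np.linalg.norm(exp - exp.mean(0), axis=1))
                             for exp in experience])

      def colony_response(odour):
          dist = np.linalg.norm(experience.mean(1) - odour, axis=1)
          return np.mean(dist > thresholds)   # fraction of ants treating it as a non-nestmate

      nestmate = colony[0]
      intruder = rng.normal(loc=3.0, size=10)
      # Typically a low fraction for the nestmate and close to 1 for the intruder.
      print(colony_response(nestmate), colony_response(intruder))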

  18. Ising model for distribution networks

    NASA Astrophysics Data System (ADS)

    Hooyberghs, H.; Van Lombeek, S.; Giuraniuc, C.; Van Schaeybroeck, B.; Indekeu, J. O.

    2012-01-01

    An elementary Ising spin model is proposed for demonstrating cascading failures (breakdowns, blackouts, collapses, avalanches, etc.) that can occur in realistic networks for distribution and delivery by suppliers to consumers. A ferromagnetic Hamiltonian with quenched random fields results from policies that maximize the gap between demand and delivery. Such policies can arise in a competitive market where firms artificially create new demand, or in a solidarity environment where too high a demand cannot reasonably be met. Network failure in the context of a policy of solidarity is possible when an initially active state becomes metastable and decays to a stable inactive state. We explore the characteristics of the demand and delivery, as well as the topological properties, that make the distribution network susceptible to failure. An effective temperature is defined, which governs the strength of the activity fluctuations that can induce a collapse. Numerical results, obtained by Monte Carlo simulations of the model on (mainly) scale-free networks, are supplemented with analytic mean-field approximations to the geometrical random-field fluctuations and the thermal spin fluctuations. The role of hubs versus poorly connected nodes in initiating the breakdown of network activity is illustrated and related to model parameters.
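
    A minimal sketch of this class of model: Metropolis Monte Carlo for a generic random-field Ising model on a small preferential-attachment (scale-free-like) network. The field distribution, coupling, temperature and network size are invented and not the paper's Hamiltonian or parameters.

      import numpy as np

      rng = np.random.default_rng(6)

      # Build a small preferential-attachment network (Barabasi-Albert flavour).
      n, m = 200, 2
      edges = [(0, 1), (1, 2), (2, 0)]
      degree_pool = [0, 1, 1, 2, 2, 0]
      for new in range(3, n):
          targets = set()
          while len(targets) < m:
              targets.add(degree_pool[rng.integers(len(degree_pool))])
          for t in targets:
              edges.append((new, t))
              degree_pool += [new, t]
      neighbours = [[] for _ in range(n)]
      for a, b in edges:
          neighbours[a].append(b); neighbours[b].append(a)

      # Random-field Ising model: s = +1 "active/delivering", s = -1 "failed";
      # the quenched fields h_i stand in for the local demand-delivery gap.
      J, T = 1.0, 1.5
      h = rng.normal(loc=-0.3, scale=0.5, size=n)     # net pull toward failure
      s = np.ones(n)                                  # start fully active
      for _ in range(20000):
          i = rng.integers(n)
          dE = 2 * s[i] * (J * sum(s[j] for j in neighbours[i]) + h[i])
          if dE <= 0 or rng.random() < np.exp(-dE / T):
              s[i] = -s[i]                            # Metropolis flip
      print("active fraction:", np.mean(s > 0))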

  19. Overlapping clusters for distributed computation.

    SciTech Connect

    Mirrokni, Vahab; Andersen, Reid; Gleich, David F.

    2010-11-01

    Scalable, distributed algorithms must address communication problems. We investigate overlapping clusters, or vertex partitions that intersect, for graph computations. This setup stores more of the graph than required but then affords the ease of implementation of vertex-partitioned algorithms. Our hope is that this technique allows us to reduce communication in a computation on a distributed graph. The motivation above draws on recent work in communication-avoiding algorithms. Mohiyuddin et al. (SC09) design a matrix-powers kernel that gives rise to an overlapping partition. Fritzsche et al. (CSC2009) develop an overlapping clustering for a Schwarz method. Both techniques extend an initial partitioning with overlap. Our procedure generates overlap directly. Indeed, Schwarz methods are commonly used to capitalize on overlap. Elsewhere, overlapping communities (Ahn et al., Nature 2009; Mishra et al., WAW2007) are now a popular model of structure in social networks. These have long been studied in statistics (Cole and Wishart, CompJ 1970). We present two types of results: (i) an estimated swapping probability ρ∞; and (ii) the communication volume of a parallel PageRank solution (link-following α = 0.85) using an additive Schwarz method. The volume ratio is the amount of extra storage for the overlap (2 means we store the graph twice). As the ratio increases, the swapping probability and the PageRank communication volume decrease.
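
    A toy illustration of why replicated overlap can trade storage for communication. This is not the paper's swapping-probability estimator or its Schwarz PageRank experiment; the graph, the overlap width and the counting rule are invented.

      import numpy as np

      rng = np.random.default_rng(7)

      # Toy graph: a ring of 100 vertices plus a few random chords.
      n = 100
      edges = [(i, (i + 1) % n) for i in range(n)] + \
              [tuple(rng.integers(0, n, 2)) for _ in range(40)]

      def comm_volume(parts):
          """Count edges whose endpoints share no owning part (hence need communication)."""
          vol = 0
          for u, v in edges:
              homes_u = {p for p, members in enumerate(parts) if u in members}
              homes_v = {p for p, members in enumerate(parts) if v in members}
              if not homes_u & homes_v:
                  vol += 1
          return vol

      halves = [set(range(0, n // 2)), set(range(n // 2, n))]
      overlap = 5   # replicate a band of vertices on both sides of each boundary
      overlapped = [halves[0] | set(range(n // 2, n // 2 + overlap)) | set(range(n - overlap, n)),
                    halves[1] | set(range(n // 2 - overlap, n // 2)) | set(range(0, overlap))]

      storage_ratio = sum(len(p) for p in overlapped) / n
      print("disjoint cut:", comm_volume(halves),
            "overlapped cut:", comm_volume(overlapped),
            "storage ratio:", storage_ratio)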

  20. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement, from LCC's perspective, was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which itself is a port of ISODE to the Cray. In addition, we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray and communicating with each other, and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their feedback on the use of ELROS in implementing ISO protocols: whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system once we had completed our port of ELROS to the Cray.