Sample records for space-time permutation scan

  1. A bootstrap based space-time surveillance model with an application to crime occurrences

    NASA Astrophysics Data System (ADS)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods use population-at-risk data to generate expected values, so their resulting hotspots are bounded by administrative area units, and the population data they require limit their use in near-real-time applications. In contrast, this study generates expected values for local hotspots from past occurrences rather than from the population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, which does not require population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to criminological and epidemiological surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
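
    As a concrete illustration of the kind of test sketched in this abstract, the following minimal example draws bootstrap replicates from a cell's past counts to judge whether the current count marks an emerging hotspot. The grid-cell framing, window length and threshold are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

def bootstrap_hotspot_pvalue(past_counts, current_count, n_boot=999, rng=None):
    """Bootstrap test for an emerging hotspot in one spatial cell.

    past_counts  : 1-D array of event counts observed in this cell over
                   previous time periods (the baseline).
    current_count: count observed in the most recent period.
    Returns a one-sided p-value for 'current_count is unusually high'.
    """
    rng = np.random.default_rng(rng)
    past_counts = np.asarray(past_counts)
    # Resample past periods with replacement to build a null distribution
    # of per-period counts for this cell.
    boot = rng.choice(past_counts, size=n_boot, replace=True)
    # p-value: fraction of bootstrap draws at least as extreme as observed
    # (the +1 keeps the estimate away from zero).
    return (np.sum(boot >= current_count) + 1) / (n_boot + 1)

# Toy usage: 52 weeks of history averaging ~3 events, then 9 events this week.
history = np.random.default_rng(0).poisson(3.0, size=52)
print(bootstrap_hotspot_pvalue(history, current_count=9))
```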

  2. Daily Reportable Disease Spatiotemporal Cluster Detection, New York City, New York, USA, 2014-2015.

    PubMed

    Greene, Sharon K; Peterson, Eric R; Kapell, Deborah; Fine, Annie D; Kulldorff, Martin

    2016-10-01

    Each day, the New York City Department of Health and Mental Hygiene uses the free SaTScan software to apply prospective space-time permutation scan statistics to strengthen early outbreak detection for 35 reportable diseases. This method prompted early detection of outbreaks of community-acquired legionellosis and shigellosis.

  3. Fermion systems in discrete space-time

    NASA Astrophysics Data System (ADS)

    Finster, Felix

    2007-05-01

    Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure.

  4. A Space–Time Permutation Scan Statistic for Disease Outbreak Detection

    PubMed Central

    Kulldorff, Martin; Heffernan, Richard; Hartman, Jessica; Assunção, Renato; Mostashari, Farzad

    2005-01-01

    Background The ability to detect disease outbreaks early is important in order to minimize morbidity and mortality through timely implementation of disease prevention and control measures. Many national, state, and local health departments are launching disease surveillance systems with daily analyses of hospital emergency department visits, ambulance dispatch calls, or pharmacy sales for which population-at-risk information is unavailable or irrelevant. Methods and Findings We propose a prospective space–time permutation scan statistic for the early detection of disease outbreaks that uses only case numbers, with no need for population-at-risk data. It makes minimal assumptions about the time, geographical location, or size of the outbreak, and it adjusts for natural purely spatial and purely temporal variation. The new method was evaluated using daily analyses of hospital emergency department visits in New York City. Four of the five strongest signals were likely local precursors to citywide outbreaks due to rotavirus, norovirus, and influenza. The number of false signals was at most modest. Conclusion If such results hold up over longer study times and in other locations, the space–time permutation scan statistic will be an important tool for local and national health departments that are setting up early disease detection surveillance systems. PMID:15719066
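
    The mechanics of the space-time permutation scan can be sketched compactly: the expected count for each space-time cylinder comes from the product of the spatial and temporal marginals of the observed cases, a Poisson generalized likelihood ratio compares observed and expected counts, and significance comes from Monte Carlo replications in which event times are shuffled over event locations. The cylinder grid below is deliberately crude and only illustrates the idea; SaTScan's actual cylinder enumeration and inference are more elaborate.

```python
import numpy as np

def stp_scan(xy, day, radii, max_duration, n_sim=99, rng=None):
    """Minimal space-time permutation scan over circular cylinders.

    xy   : (N, 2) event coordinates
    day  : (N,) integer day of each event
    Cylinders are centred on event locations, with the given radii and with
    durations 1..max_duration ending on any observed day.
    Returns (best_llr, p_value).
    """
    rng = np.random.default_rng(rng)
    xy, day = np.asarray(xy, float), np.asarray(day)
    C = len(day)
    dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)  # pairwise distances

    def max_llr(day_vec):
        best = 0.0
        days = np.unique(day_vec)
        for r in radii:                      # spatial part of the cylinder
            inside = dist <= r               # inside[i, j]: event j within r of centre i
            n_space = inside.sum(axis=1)     # spatial marginal per centre
            for end in days:
                for dur in range(1, max_duration + 1):
                    in_time = (day_vec > end - dur) & (day_vec <= end)
                    n_time = in_time.sum()   # temporal marginal
                    n_A = (inside & in_time[None, :]).sum(axis=1)  # observed per centre
                    mu = n_space * n_time / C                      # expected per centre
                    with np.errstate(divide="ignore", invalid="ignore"):
                        llr = np.where(
                            n_A > mu,
                            n_A * np.log(n_A / mu)
                            + (C - n_A) * np.log((C - n_A) / (C - mu)),
                            0.0,
                        )
                    best = max(best, np.nanmax(llr))
        return best

    observed = max_llr(day)
    # Monte Carlo: shuffle event days over locations (the permutation step).
    sims = [max_llr(rng.permutation(day)) for _ in range(n_sim)]
    p = (1 + sum(s >= observed for s in sims)) / (n_sim + 1)
    return observed, p

# Toy usage with a handful of synthetic events.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 10, size=(60, 2))
days = rng.integers(0, 30, size=60)
print(stp_scan(pts, days, radii=[1.0, 2.0], max_duration=7, n_sim=19))
```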

  5. Vision-Based Navigation and Parallel Computing

    DTIC Science & Technology

    1990-08-01

    33 5.8. Behzad Kamgar-Parsi and Behrooz Kamgar-Parsi, "On Problem Solving with Hopfield Neural Networks", CAR-TR-462, CS-TR... Second, the hypercube connections support logarithmic implementations of fundamental parallel algorithms, such as grid permutations and scan... the pose space. It also uses a set of virtual processors to represent an orthogonal projection grid, and projections of the six-dimensional pose space

  6. Simultaneous and Sequential MS/MS Scan Combinations and Permutations in a Linear Quadrupole Ion Trap.

    PubMed

    Snyder, Dalton T; Szalwinski, Lucas J; Cooks, R Graham

    2017-10-17

    Methods of performing precursor ion scans as well as neutral loss scans in a single linear quadrupole ion trap have recently been described. In this paper we report methodology for performing permutations of MS/MS scan modes, that is, ordered combinations of precursor, product, and neutral loss scans following a single ion injection event. Only particular permutations are allowed; the sequences demonstrated here are (1) multiple precursor ion scans, (2) precursor ion scans followed by a single neutral loss scan, (3) precursor ion scans followed by product ion scans, and (4) segmented neutral loss scans; (5) the common product ion scan can also be performed earlier in these sequences, under certain conditions. Simultaneous scans can also be performed. These include multiple precursor ion scans, precursor ion scans with an accompanying neutral loss scan, and multiple neutral loss scans. We argue that the new capability to perform complex simultaneous and sequential MS^n operations on single ion populations represents a significant step in increasing the selectivity of mass spectrometry.

  7. Spatio-temporal scan statistics for the detection of outbreaks involving common molecular subtypes: using human cases of Escherichia coli O157:H7 provincial PFGE pattern 8 (National Designation ECXAI.0001) in Alberta as an example.

    PubMed

    So, H C; Pearl, D L; von Königslöw, T; Louie, M; Chui, L; Svenson, L W

    2013-08-01

    Molecular typing methods have become a common part of the surveillance of foodborne pathogens. In particular, pulsed-field gel electrophoresis (PFGE) has been used successfully to identify outbreaks of Escherichia coli O157:H7 in humans from a variety of food and environmental sources. However, some PFGE patterns appear commonly in surveillance systems, making it more difficult to distinguish between outbreak and sporadic cases based on molecular data alone. In addition, it is unknown whether these common patterns might have unique epidemiological characteristics reflected in their spatial and temporal distributions. Using E. coli O157:H7 surveillance data from Alberta, collected from 2000 to 2002, we investigated whether E. coli O157:H7 with provincial PFGE pattern 8 (national designation ECXAI.0001) clustered in space, time and space-time relative to other PFGE patterns using the spatial scan statistic. Based on our purely spatial and temporal scans using a Bernoulli model, there did not appear to be strong evidence that isolates of E. coli O157:H7 with provincial PFGE pattern 8 are distributed differently from other PFGE patterns. However, we did identify space-time clusters of isolates with PFGE pattern 8, using a Bernoulli model and a space-time permutation model, which included known outbreaks and potentially unrecognized outbreaks or additional outbreak cases. There were differences between the two models in the space-time clusters identified, which suggests that the use of both models could increase the sensitivity of a quantitative surveillance system for identifying outbreaks involving isolates sharing a common PFGE pattern. © 2012 Blackwell Verlag GmbH.

  8. Wildfire cluster detection using space-time scan statistics

    NASA Astrophysics Data System (ADS)

    Tonini, M.; Tuia, D.; Ratle, F.; Kanevski, M.

    2009-04-01

    The aim of the present study is to identify spatio-temporal clusters of fire sequences using space-time scan statistics. These statistical methods are specifically designed to detect clusters and assess their significance. Basically, scan statistics work by comparing the set of events occurring inside a scanning window (or a space-time cylinder for spatio-temporal data) with those that lie outside. Windows of increasing size scan the zone across space and time; the likelihood ratio is calculated for each window (comparing the ratio of observed to expected cases inside and outside), and the window with the maximum value is taken as the most likely cluster, and so on. Under the null hypothesis of spatial and temporal randomness, these events are distributed according to a known discrete-state random process (Poisson or Bernoulli) whose parameters can be estimated. Given this assumption, it is possible to test whether or not the null hypothesis holds in a specific area. To deal with fire data, the space-time permutation scan statistic has been applied, since it does not require the explicit specification of the population at risk in each cylinder. The case study is daily fire detection in Florida using the Moderate Resolution Imaging Spectroradiometer (MODIS) active fire product for the period 2003-2006. As a result, statistically significant clusters have been identified. Performing the analyses over the entire study period, three of the five most likely clusters were identified in forest areas in the north of the state; the other two clusters cover a large zone in the south, corresponding to agricultural land and the prairies in the Everglades. Furthermore, the analyses were performed separately for the four years to assess whether the wildfires recur each year during the same period. It emerges that clusters of forest fires are more frequent in hot seasons (spring and summer), while in the southern areas they are present throughout the whole year. Analysing the distribution of fires to evaluate whether they are statistically more frequent in some areas and/or in some periods of the year can be useful to support fire management and to focus prevention measures.
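
    For reference, the observed-versus-expected comparison described above takes the following form for the space-time permutation scan (notation as in the Kulldorff et al. 2005 paper listed as record 4 above):

```latex
% Expected count in cylinder A, from the spatial and temporal marginals of the C cases
% (c_{z.} = cases in zone z over the whole study period, c_{.d} = cases on day d over all zones):
\mu_A \;=\; \frac{1}{C} \sum_{(z,d) \in A} c_{z\cdot}\, c_{\cdot d}

% Poisson generalized likelihood ratio for a cylinder containing n_A observed cases,
% evaluated only when n_A > \mu_A:
\mathrm{GLR}(A) \;=\; \left(\frac{n_A}{\mu_A}\right)^{n_A}
                     \left(\frac{C - n_A}{C - \mu_A}\right)^{C - n_A}

% Significance: Monte Carlo replications permute the observed event times over the
% event locations and recompute the maximum GLR for each replicate.
```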

  9. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA

    NASA Astrophysics Data System (ADS)

    Coughlan, Michael R.

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  10. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA.

    PubMed

    Coughlan, Michael R

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  11. Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation

    PubMed Central

    Recchia, Gabriel; Sahlgren, Magnus; Kanerva, Pentti; Jones, Michael N.

    2015-01-01

    Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, “noisy” permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics. PMID:25954306
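
    The two binding operators compared here are short enough to state directly: circular convolution binds two vectors through the FFT, while random-permutation binding applies a fixed permutation to one operand before superposition. The dimensionality and the retrieval checks below are illustrative choices, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024                                   # vector dimensionality (illustrative)
a, b = rng.normal(0, 1 / np.sqrt(n), (2, n))

# Holographic reduced representation: bind by circular convolution (via FFT).
bound_conv = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))
# Unbind with the approximate inverse (involution of a) and check similarity to b.
a_inv = np.concatenate(([a[0]], a[1:][::-1]))
b_hat = np.real(np.fft.ifft(np.fft.fft(a_inv) * np.fft.fft(bound_conv)))
print("convolution retrieval r =", np.corrcoef(b, b_hat)[0, 1])

# Random-permutation binding: encode order by permuting one operand, then superpose.
perm = rng.permutation(n)
inv_perm = np.argsort(perm)
bound_perm = a + b[perm]                   # e.g. "b follows a"
b_hat2 = (bound_perm - a)[inv_perm]        # recover b given a and the permutation
print("permutation retrieval r =", np.corrcoef(b, b_hat2)[0, 1])
```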

  12. Revisiting the European sovereign bonds with a permutation-information-theory approach

    NASA Astrophysics Data System (ADS)

    Fernández Bariviera, Aurelio; Zunino, Luciano; Guercio, María Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-12-01

    In this paper we study the evolution of the informational efficiency in its weak form for seventeen European sovereign bonds time series. We aim to assess the impact of two specific economic situations in the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate the informational efficiency we use permutation quantifiers derived from information theory. Specifically, time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of the time series in this representation space reveals the degree of informational efficiency. According to our results, the currency union contributed to homogenize the stochastic characteristics of the time series and produced synchronization in the random behavior of them. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union heyday.

  13. Circular Permutation of a Chaperonin Protein: Biophysics and Application to Nanotechnology

    NASA Technical Reports Server (NTRS)

    Paavola, Chad; Chan, Suzanne; Li, Yi-Fen; McMillan, R. Andrew; Trent, Jonathan

    2004-01-01

    We have designed five circular permutants of a chaperonin protein derived from the hyperthermophilic organism Sulfolobus shibatae. These permuted proteins were expressed in E. coli and are well-folded. Furthermore, all the permutants assemble into 18-mer double rings of the same form as the wild-type protein. We characterized the thermodynamics of folding for each permutant by both guanidine denaturation and differential scanning calorimetry. We also examined the assembly of chaperonin rings into higher order structures that may be used as nanoscale templates. The results show that circular permutation can be used to tune the thermodynamic properties of a protein template as well as facilitating the fusion of peptides, binding proteins or enzymes onto nanostructured templates.

  14. Permutation-invariant distance between atomic configurations

    NASA Astrophysics Data System (ADS)

    Ferré, Grégoire; Maillet, Jean-Bernard; Stoltz, Gabriel

    2015-09-01

    We present a permutation-invariant distance between atomic configurations, defined through a functional representation of atomic positions. This distance enables us to directly compare different atomic environments with an arbitrary number of particles, without going through a space of reduced dimensionality (i.e., fingerprints) as an intermediate step. Moreover, this distance is naturally invariant under permutations of atoms, avoiding the time-consuming minimization required by other common criteria (such as the root-mean-square distance). Finally, the invariance under global rotations is accounted for by a minimization procedure in the space of rotations solved by Monte Carlo simulated annealing. A formal framework is also introduced, showing that the distance we propose satisfies the properties of a metric on the space of atomic configurations. Two examples of applications are proposed. The first one consists in evaluating the faithfulness of some fingerprints (or descriptors), i.e., their capacity to represent the structural information of a configuration. The second application concerns structural analysis, where our distance proves to be efficient in discriminating different local structures and even classifying their degree of similarity.

  15. Adinkra (in)equivalence from Coxeter group representations: A case study

    NASA Astrophysics Data System (ADS)

    Chappell, Isaac; Gates, S. James; Hübsch, T.

    2014-02-01

    Using a Mathematica™ code, we present a straightforward numerical analysis of the 384-dimensional solution space of signed permutation 4×4 matrices, which, in sets of four, provide representations of the 𝒢ℛ(4, 4) algebra, closely related to the 𝒩 = 1 (simple) supersymmetry algebra in four-dimensional space-time. Following after ideas discussed in previous papers about automorphisms and classification of adinkras and corresponding supermultiplets, we make a new and alternative proposal to use equivalence classes of the (unsigned) permutation group S4 to define distinct representations of higher-dimensional spin bundles within the context of adinkras. For this purpose, the definition of a dual operator akin to the well-known Hodge star is found to partition the space of these 𝒢ℛ(4, 4) representations into three suggestive classes.
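
    The 384-dimensional solution space referred to above is simply the set of signed 4×4 permutation matrices (4! column orderings times 2^4 sign patterns); a quick enumeration makes the count concrete:

```python
import itertools
import numpy as np

signed_perms = []
for perm in itertools.permutations(range(4)):            # 4! column orderings
    for signs in itertools.product((1, -1), repeat=4):   # 2^4 sign patterns
        m = np.zeros((4, 4), dtype=int)
        for row, (col, s) in enumerate(zip(perm, signs)):
            m[row, col] = s
        signed_perms.append(m)

print(len(signed_perms))          # 384 = 4! * 2^4
# Each such matrix is orthogonal: M M^T = I.
print(all(np.array_equal(m @ m.T, np.eye(4, dtype=int)) for m in signed_perms))
```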

  16. Higher order explicit symmetric integrators for inseparable forms of coordinates and momenta

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Wu, Xin; Huang, Guoqing; Liu, Fuyao

    2016-06-01

    Pihajoki proposed the extended phase-space second-order explicit symmetric leapfrog methods for inseparable Hamiltonian systems. On the basis of this work, we survey a critical problem on how to mix the variables in the extended phase space. Numerical tests show that sequent permutations of coordinates and momenta can make the leapfrog-like methods yield the most accurate results and the optimal long-term stabilized error behaviour. We also present a novel method to construct many fourth-order extended phase-space explicit symmetric integration schemes. Each scheme represents the symmetric production of six usual second-order leapfrogs without any permutations. This construction consists of four segments: the permuted coordinates, triple product of the usual second-order leapfrog without permutations, the permuted momenta and the triple product of the usual second-order leapfrog without permutations. Similarly, extended phase-space sixth, eighth and other higher order explicit symmetric algorithms are available. We used several inseparable Hamiltonian examples, such as the post-Newtonian approach of non-spinning compact binaries, to show that one of the proposed fourth-order methods is more efficient than the existing methods; examples include the fourth-order explicit symplectic integrators of Chin and the fourth-order explicit and implicit mixed symplectic integrators of Zhong et al. Given a moderate choice for the related mixing and projection maps, the extended phase-space explicit symplectic-like methods are well suited for various inseparable Hamiltonian problems. Samples of these problems involve the algorithmic regularization of gravitational systems with velocity-dependent perturbations in the Solar system and post-Newtonian Hamiltonian formulations of spinning compact objects.

  17. Fast algorithms for transforming back and forth between a signed permutation and its equivalent simple permutation.

    PubMed

    Gog, Simon; Bader, Martin

    2008-10-01

    The problem of sorting signed permutations by reversals is a well-studied problem in computational biology. The first polynomial time algorithm was presented by Hannenhalli and Pevzner in 1995. The algorithm was improved several times, and nowadays the most efficient algorithm has a subquadratic running time. Simple permutations played an important role in the development of these algorithms. Although the latest result of Tannier et al. does not require simple permutations, the preliminary version of their algorithm as well as the first polynomial time algorithm of Hannenhalli and Pevzner use the structure of simple permutations. More precisely, the latter algorithms require a precomputation that transforms a permutation into an equivalent simple permutation. To the best of our knowledge, all published algorithms for this transformation have at least a quadratic running time. For further investigations on genome rearrangement problems, the existence of a fast algorithm for the transformation could be crucial. Another important task is the back transformation, i.e. if we have a sorting on the simple permutation, transform it into a sorting on the original permutation. Again, the naive approach results in an algorithm with quadratic running time. In this paper, we present a linear time algorithm for transforming a permutation into an equivalent simple permutation, and an O(n log n) algorithm for the back transformation of the sorting sequence.

  18. Effective Iterated Greedy Algorithm for Flow-Shop Scheduling Problems with Time lags

    NASA Astrophysics Data System (ADS)

    ZHAO, Ning; YE, Song; LI, Kaidian; CHEN, Siyu

    2017-05-01

    The flow-shop scheduling problem with time lags is a practical scheduling problem that has attracted many studies. The permutation problem (PFSP with time lags) has received much attention, but the non-permutation problem (non-PFSP with time lags) appears to have been neglected. With the aim of minimizing the makespan while satisfying the time-lag constraints, efficient algorithms for the PFSP and non-PFSP problems are proposed: an iterated greedy algorithm for the permutation case (IGTLP) and an iterated greedy algorithm for the non-permutation case (IGTLNP). The proposed algorithms are verified on well-known simple and complex instances of permutation and non-permutation problems with various time-lag ranges. The permutation results indicate that the proposed IGTLP can reach a near-optimal solution within roughly 11% of the computational time of a traditional GA approach. The non-permutation results indicate that the proposed IG can reach nearly the same solution within less than 1% of the computational time of a traditional GA approach. The proposed research treats the PFSP and non-PFSP together with minimal and maximal time-lag considerations, which provides an interesting viewpoint for industrial implementation.
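
    The iterated greedy scheme itself is compact: repeatedly remove a few jobs from the current sequence and greedily re-insert each at its best position. The sketch below pairs that loop with a permutation flow-shop makespan routine that respects minimal time lags between consecutive machines; the lag model, destruction size and stopping rule are illustrative assumptions rather than the IGTLP parameters used in the paper.

```python
import random

def makespan(seq, proc, lag):
    """Permutation flow-shop makespan with minimal time lags.

    proc[j][m]: processing time of job j on machine m.
    lag[j][m] : minimal delay between job j finishing on machine m and
                starting on machine m+1 (assumed lag model).
    """
    n_mach = len(proc[0])
    finish = [0.0] * n_mach                 # finish time of the previous job per machine
    for j in seq:
        prev = 0.0
        for m in range(n_mach):
            earliest = prev + (lag[j][m - 1] if m > 0 else 0.0)
            start = max(earliest, finish[m])
            finish[m] = start + proc[j][m]
            prev = finish[m]
    return finish[-1]

def iterated_greedy(proc, lag, d=2, iters=200, seed=0):
    random.seed(seed)
    n = len(proc)
    best = list(range(n))
    best_val = makespan(best, proc, lag)
    cur, cur_val = best[:], best_val
    for _ in range(iters):
        partial = cur[:]
        removed = [partial.pop(random.randrange(len(partial))) for _ in range(d)]
        for j in removed:                   # greedy re-insertion at the best slot
            cands = [partial[:i] + [j] + partial[i:] for i in range(len(partial) + 1)]
            partial = min(cands, key=lambda s: makespan(s, proc, lag))
        val = makespan(partial, proc, lag)
        if val <= cur_val:                  # simple acceptance rule
            cur, cur_val = partial, val
            if val < best_val:
                best, best_val = partial, val
    return best, best_val

# Toy instance: 6 jobs, 3 machines, random processing times and small lags.
random.seed(1)
proc = [[random.randint(1, 9) for _ in range(3)] for _ in range(6)]
lag = [[random.randint(0, 2) for _ in range(3)] for _ in range(6)]
print(iterated_greedy(proc, lag))
```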

  19. Quantum one-way permutation over the finite field of two elements

    NASA Astrophysics Data System (ADS)

    de Castro, Alexandre

    2017-06-01

    In quantum cryptography, a one-way permutation is a bounded unitary operator U: H → H on a Hilbert space H that is easy to compute on every input, but hard to invert given the image of a random input. Levin (Probl Inf Transm 39(1):92-103, 2003) has conjectured that the unitary transformation g(a, x) = (a, f(x) + ax), where f is any length-preserving function and a, x ∈ GF(2^{||x||}), is an information-theoretically secure operator within a polynomial factor. Here, we show that Levin's one-way permutation is provably secure because its output values are four maximally entangled two-qubit states, and the probability of factoring them approaches zero faster than the multiplicative inverse of any positive polynomial poly(x) over the Boolean ring of all subsets of x. Our results demonstrate through well-known theorems that the existence of classical one-way functions implies the existence of a universal quantum one-way permutation that cannot be inverted in subexponential time in the worst case.
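
    Levin's transformation can be made concrete over a small binary field. The sketch below works in GF(2^8) with the AES reduction polynomial and uses a byte rotation as the length-preserving f; both choices are illustrative, since the construction only requires some length-preserving f and field arithmetic in GF(2^{||x||}). Whether the x-slice is a bijection depends on f and a; the record's contribution concerns the hardness of inverting g.

```python
IRRED = 0x11B  # x^8 + x^4 + x^3 + x + 1: reduction polynomial for GF(2^8)

def gf_mul(a, x, irred=IRRED, nbits=8):
    """Carry-less multiplication of a and x, reduced modulo irred (GF(2^8))."""
    result = 0
    while x:
        if x & 1:
            result ^= a
        x >>= 1
        a <<= 1
        if a & (1 << nbits):       # reduce once the degree reaches nbits
            a ^= irred
    return result

def f(x):
    """A length-preserving function on 8-bit strings (here: rotate left by 1)."""
    return ((x << 1) | (x >> 7)) & 0xFF

def g(a, x):
    """Levin's map g(a, x) = (a, f(x) + a*x); '+' is XOR (addition in characteristic 2)."""
    return a, f(x) ^ gf_mul(a, x)

print(g(0x57, 0x83))               # one evaluation of the map
# Probe injectivity in x for one fixed a (256 distinct outputs would mean the
# x-slice is a bijection for this particular a and f).
print(len({g(0x57, x)[1] for x in range(256)}))
```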

  20. Identifying sighting clusters of endangered taxa with historical records.

    PubMed

    Duffy, Karl J

    2011-04-01

    The probability and time of extinction of taxa is often inferred from statistical analyses of historical records. Many of these analyses require the exclusion of multiple records within a unit of time (i.e., a month or a year). Nevertheless, spatially explicit, temporally aggregated data may be useful for identifying clusters of sightings (i.e., sighting clusters) in space and time. Identification of sighting clusters highlights changes in the historical recording of endangered taxa. I used two methods to identify sighting clusters in historical records: the Ederer-Myers-Mantel (EMM) test and the space-time permutation scan (STPS). I applied these methods to the spatially explicit sighting records of three species of orchids that are listed as endangered in the Republic of Ireland under the Wildlife Act (1976): Cephalanthera longifolia, Hammarbya paludosa, and Pseudorchis albida. Results with the EMM test were strongly affected by the choice of the time interval, and thus the number of temporal samples, used to examine the records. For example, sightings of P. albida clustered when the records were partitioned into 20-year temporal samples, but not when they were partitioned into 22-year temporal samples. Because the statistical power of EMM was low, it will not be useful when data are sparse. Nevertheless, the STPS identified regions that contained sighting clusters because it uses a flexible scanning window (defined by cylinders of varying size that move over the study area and evaluate the likelihood of clustering) to detect them, and it identified regions with high and regions with low rates of orchid sightings. The STPS analyses can be used to detect sighting clusters of endangered species that may be related to regions of extirpation and may assist in the categorization of threat status. ©2010 Society for Conservation Biology.

  1. Spatial-temporal clustering of companion animal enteric syndrome: detection and investigation through the use of electronic medical records from participating private practices.

    PubMed

    Anholt, R M; Berezowski, J; Robertson, C; Stephen, C

    2015-09-01

    There is interest in the potential of companion animal surveillance to provide data to improve pet health and to provide early warning of environmental hazards to people. We implemented a companion animal surveillance system in Calgary, Alberta and the surrounding communities. Informatics technologies automatically extracted electronic medical records from participating veterinary practices and identified cases of enteric syndrome in the warehoused records. The data were analysed using time-series analyses and a retrospective space-time permutation scan statistic. We identified a seasonal pattern of reports of occurrences of enteric syndromes in companion animals and four statistically significant clusters of enteric syndrome cases. The cases within each cluster were examined and information about the animals involved (species, age, sex), their vaccination history, possible exposure or risk behaviour history, information about disease severity, and the aetiological diagnosis was collected. We then assessed whether the cases within the cluster were unusual and if they represented an animal or public health threat. There was often insufficient information recorded in the medical record to characterize the clusters by aetiology or exposures. Space-time analysis of companion animal enteric syndrome cases found evidence of clustering. Collection of more epidemiologically relevant data would enhance the utility of practice-based companion animal surveillance.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferré, Grégoire; Maillet, Jean-Bernard; Stoltz, Gabriel

    We present a permutation-invariant distance between atomic configurations, defined through a functional representation of atomic positions. This distance enables us to directly compare different atomic environments with an arbitrary number of particles, without going through a space of reduced dimensionality (i.e., fingerprints) as an intermediate step. Moreover, this distance is naturally invariant under permutations of atoms, avoiding the time-consuming minimization required by other common criteria (such as the root-mean-square distance). Finally, the invariance under global rotations is accounted for by a minimization procedure in the space of rotations solved by Monte Carlo simulated annealing. A formal framework is also introduced, showing that the distance we propose satisfies the properties of a metric on the space of atomic configurations. Two examples of applications are proposed. The first one consists in evaluating the faithfulness of some fingerprints (or descriptors), i.e., their capacity to represent the structural information of a configuration. The second application concerns structural analysis, where our distance proves to be efficient in discriminating different local structures and even classifying their degree of similarity.

  3. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system

    PubMed Central

    Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J.; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method’s implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System’s C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis. PMID:28886112
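
    Several of the temporal detectors compared in this record are simple to state in code. Below is a bare-bones version of the EARS C2 idea (a 7-day baseline with a 2-day guard band, alerting when the current count exceeds the baseline mean by three standard deviations); the study used a modified C2, so the parameters here are generic defaults rather than the evaluated configuration.

```python
import numpy as np

def c2_alerts(counts, baseline=7, guard=2, threshold=3.0):
    """Flag days whose count exceeds mean + threshold * std of a lagged baseline.

    counts: 1-D array of daily syndrome counts.
    Returns a boolean array (True = alert), aligned with `counts`.
    """
    counts = np.asarray(counts, dtype=float)
    alerts = np.zeros(len(counts), dtype=bool)
    for t in range(baseline + guard, len(counts)):
        window = counts[t - guard - baseline : t - guard]   # 7-day baseline, 2-day gap
        mu, sigma = window.mean(), window.std(ddof=1)
        sigma = max(sigma, 0.5)            # floor to avoid zero-variance alarms
        alerts[t] = counts[t] > mu + threshold * sigma
    return alerts

# Toy usage: flat Poisson background with an injected spike on day 40.
rng = np.random.default_rng(2)
series = rng.poisson(10, size=60)
series[40] += 25
print(np.flatnonzero(c2_alerts(series)))
```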

  4. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system.

    PubMed

    Mathes, Robert W; Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method's implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System's C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis.

  5. Biosurveillance in a Highly Mobile Population - Year 3

    DTIC Science & Technology

    2012-07-01

    provides an opportunity-rich testbed for the impact upon infectious disease modeling, biosurveillance, and public health. Kulldorff et al (2005) assessed... Secular Cycles and Millennial Trends. URSS, Moscow 2006 Kulldorff M, Heffernan R, Hartman J, Assunção RM, Mostashari F. (2005). "Space-Time Permutation

  6. Efficient computation of significance levels for multiple associations in large studies of correlated data, including genomewide association studies.

    PubMed

    Dudbridge, Frank; Koeleman, Bobby P C

    2004-09-01

    Large exploratory studies, including candidate-gene-association testing, genomewide linkage-disequilibrium scans, and array-expression experiments, are becoming increasingly common. A serious problem for such studies is that statistical power is compromised by the need to control the false-positive rate for a large family of tests. Because multiple true associations are anticipated, methods have been proposed that combine evidence from the most significant tests, as a more powerful alternative to individually adjusted tests. The practical application of these methods is currently limited by a reliance on permutation testing to account for the correlated nature of single-nucleotide polymorphism (SNP)-association data. On a genomewide scale, this is both very time-consuming and impractical for repeated explorations with standard marker panels. Here, we alleviate these problems by fitting analytic distributions to the empirical distribution of combined evidence. We fit extreme-value distributions for fixed lengths of combined evidence and a beta distribution for the most significant length. An initial phase of permutation sampling is required to fit these distributions, but it can be completed more quickly than a simple permutation test and need be done only once for each panel of tests, after which the fitted parameters give a reusable calibration of the panel. Our approach is also a more efficient alternative to a standard permutation test. We demonstrate the accuracy of our approach and compare its efficiency with that of permutation tests on genomewide SNP data released by the International HapMap Consortium. The estimation of analytic distributions for combined evidence will allow these powerful methods to be applied more widely in large exploratory studies.
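
    The central computational trick in this record, calibrating a panel once by fitting an analytic distribution to permutation output and then reusing the fit, can be sketched with a beta fit to the minimum p-value across a panel of correlated tests. The data generation and SciPy fitter below are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_tests, n_perm = 200, 50, 500

# Correlated "genotype-like" scores and a null phenotype.
shared = rng.normal(size=(n_subjects, 1))
geno = 0.6 * shared + 0.8 * rng.normal(size=(n_subjects, n_tests))
pheno = rng.normal(size=n_subjects)

def min_pvalue(y):
    """Smallest two-sided correlation p-value across the panel of tests."""
    p = np.array([stats.pearsonr(geno[:, j], y)[1] for j in range(n_tests)])
    return p.min()

# Initial (one-off) permutation phase: null distribution of the minimum p-value.
null_min_p = np.array([min_pvalue(rng.permutation(pheno)) for _ in range(n_perm)])

# Fit a beta distribution on [0, 1] to the permutation sample; the fitted
# parameters become a reusable calibration for this panel of tests.
a_hat, b_hat, _, _ = stats.beta.fit(null_min_p, floc=0, fscale=1)
observed = min_pvalue(pheno)
print("calibrated significance:", stats.beta.cdf(observed, a_hat, b_hat))
```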

  7. Permutation entropy with vector embedding delays

    NASA Astrophysics Data System (ADS)

    Little, Douglas J.; Kane, Deb M.

    2017-12-01

    Permutation entropy (PE) is a statistic used widely for the detection of structure within a time series. Embedding delay times at which the PE is reduced are characteristic timescales for which such structure exists. Here, a generalized scheme is investigated where embedding delays are represented by vectors rather than scalars, permitting PE to be calculated over a (D-1)-dimensional space, where D is the embedding dimension. This scheme is applied to numerically generated noise, sine wave and logistic map series, and experimental data sets taken from a vertical-cavity surface-emitting laser exhibiting temporally localized pulse structures within the round-trip time of the laser cavity. Results are visualized as PE maps as a function of embedding delay, with low PE values indicating combinations of embedding delays where correlation structure is present. It is demonstrated that vector embedding delays enable identification of structure that is ambiguous or masked, when the embedding delay is constrained to scalar form.
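
    Permutation entropy and its vector-delay generalisation reduce to counting ordinal patterns of sampled tuples. The sketch below computes normalised PE for embedding dimension D = 3 over a grid of delay pairs (tau1, tau2), treating the delays as cumulative gaps between the sampled points; the test signal and delay grid are illustrative.

```python
import itertools
import math
import numpy as np

def permutation_entropy(x, delays):
    """Normalised permutation entropy for one delay vector.

    delays: cumulative gaps (tau_1, ..., tau_{D-1}) between the D sampled points,
    so each tuple is (x[i], x[i + tau_1], x[i + tau_1 + tau_2], ...).
    """
    x = np.asarray(x)
    offsets = np.concatenate(([0], np.cumsum(delays)))
    n = len(x) - offsets[-1]
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i + offsets]))   # ordinal pattern of the tuple
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    D = len(offsets)
    return -np.sum(p * np.log(p)) / math.log(math.factorial(D))

# PE map over a (D-1)-dimensional delay space, D = 3, for a noisy sine wave.
rng = np.random.default_rng(0)
t = np.arange(4000)
signal = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
pe_map = np.array([[permutation_entropy(signal, (t1, t2))
                    for t2 in range(1, 11)]
                   for t1 in range(1, 11)])
print(pe_map.round(2))   # low values mark delay pairs where ordinal structure exists
```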

  8. Decline causes of Koalas in South East Queensland, Australia: a 17-year retrospective study of mortality and morbidity.

    PubMed

    Gonzalez-Astudillo, Viviana; Allavena, Rachel; McKinnon, Allan; Larkin, Rebecca; Henning, Joerg

    2017-02-20

    Koala populations are in catastrophic decline in certain eastern Australian regions. Spanning from 1997-2013, a database derived from wildlife hospitals in southeast Queensland with N = 20,250 entries was classified by causes of morbidity and mortality. A total of 11 aetiologies were identified, with chlamydiosis, trauma, and wasting being most common. The clinical diagnosis at submission varied significantly over the observation period. Combinations of aetiologies were observed in 39% of koalas submitted, with chlamydiosis frequently co-occurring. Urogenital (cystitis 26.8%, bursitis 13.5%) and ocular (conjunctivitis 17.2%) chlamydiosis were the most frequently diagnosed representations of the infection. Approximately 26% of submissions comprised koalas involved in vehicle accidents that were otherwise healthy. Age and sex of the koala as well as season and submission period were compared for the case outcomes of 'dead on arrival', 'euthanized', or 'released' for the four most common clinical diagnoses using multinomial logistic regression models. Exploratory space-time permutation scans were performed and overlapping space-time clusters for chlamydiosis, motor vehicle traumas and wasting unveiled high risk areas for koala disease and injury. Our results suggest that these aetiologies are acting jointly as multifactorial determinants for the continuing decline of koalas.

  9. Decline causes of Koalas in South East Queensland, Australia: a 17-year retrospective study of mortality and morbidity

    PubMed Central

    Gonzalez-Astudillo, Viviana; Allavena, Rachel; McKinnon, Allan; Larkin, Rebecca; Henning, Joerg

    2017-01-01

    Koala populations are in catastrophic decline in certain eastern Australian regions. Spanning from 1997–2013, a database derived from wildlife hospitals in southeast Queensland with N = 20,250 entries was classified by causes of morbidity and mortality. A total of 11 aetiologies were identified, with chlamydiosis, trauma, and wasting being most common. The clinical diagnosis at submission varied significantly over the observation period. Combinations of aetiologies were observed in 39% of koalas submitted, with chlamydiosis frequently co-occurring. Urogenital (cystitis 26.8%, bursitis 13.5%) and ocular (conjunctivitis 17.2%) chlamydiosis were the most frequently diagnosed representations of the infection. Approximately 26% of submissions comprised koalas involved in vehicle accidents that were otherwise healthy. Age and sex of the koala as well as season and submission period were compared for the case outcomes of ‘dead on arrival’, ‘euthanized’, or ‘released’ for the four most common clinical diagnoses using multinomial logistic regression models. Exploratory space-time permutation scans were performed and overlapping space-time clusters for chlamydiosis, motor vehicle traumas and wasting unveiled high risk areas for koala disease and injury. Our results suggest that these aetiologies are acting jointly as multifactorial determinants for the continuing decline of koalas. PMID:28218272

  10. Decline causes of Koalas in South East Queensland, Australia: a 17-year retrospective study of mortality and morbidity

    NASA Astrophysics Data System (ADS)

    Gonzalez-Astudillo, Viviana; Allavena, Rachel; McKinnon, Allan; Larkin, Rebecca; Henning, Joerg

    2017-02-01

    Koala populations are in catastrophic decline in certain eastern Australian regions. Spanning from 1997-2013, a database derived from wildlife hospitals in southeast Queensland with N = 20,250 entries was classified by causes of morbidity and mortality. A total of 11 aetiologies were identified, with chlamydiosis, trauma, and wasting being most common. The clinical diagnosis at submission varied significantly over the observation period. Combinations of aetiologies were observed in 39% of koalas submitted, with chlamydiosis frequently co-occurring. Urogenital (cystitis 26.8%, bursitis 13.5%) and ocular (conjunctivitis 17.2%) chlamydiosis were the most frequently diagnosed representations of the infection. Approximately 26% of submissions comprised koalas involved in vehicle accidents that were otherwise healthy. Age and sex of the koala as well as season and submission period were compared for the case outcomes of ‘dead on arrival’, ‘euthanized’, or ‘released’ for the four most common clinical diagnoses using multinomial logistic regression models. Exploratory space-time permutation scans were performed and overlapping space-time clusters for chlamydiosis, motor vehicle traumas and wasting unveiled high risk areas for koala disease and injury. Our results suggest that these aetiologies are acting jointly as multifactorial determinants for the continuing decline of koalas.

  11. Space-time clustering analysis of wildfires: The influence of dataset characteristics, fire prevention policy decisions, weather and climate.

    PubMed

    Parente, Joana; Pereira, Mário G; Tonini, Marj

    2016-07-15

    The present study focuses on the dependence of the space-time permutation scan statistics (STPSS) (1) on the characteristics of the input database and (2) on the use of this methodology to assess changes in the fire regime due to different types of climate and fire management activities. Based on the very strong relationship between weather and fire incidence in Portugal, the detected clusters are interpreted in terms of the atmospheric conditions. Apart from being the country most affected by fires in the European context, Portugal meets all the conditions required to carry out this study, namely: (i) two long and comprehensive official datasets, the Portuguese Rural Fire Database (PRFD) and the National Mapping of Burnt Areas (NMBA), respectively based on ground and satellite measurements; (ii) the two types of climate (Csb in the north and Csa in the south) that characterize the Mediterranean basin regions most affected by fires also divide mainland Portugal; and (iii) the national plan for the defence of forests against fires was approved a decade ago, so it is now reasonable to assess its impacts. Results confirmed (1) the influence of the dataset's characteristics on the detected clusters, (2) the existence of two different fire regimes in the country promoted by the different types of climate, (3) the positive impacts of fire prevention policy decisions and (4) the ability of the STPSS to correctly identify clusters, regarding their number, location, and space-time size, in spite of eventual space and/or time splits of the datasets. Finally, the role of weather on days when clustered fires were active was confirmed for the classes of small, medium and large fires. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. The Detection of Clusters with Spatial Heterogeneity

    ERIC Educational Resources Information Center

    Zhang, Zuoyi

    2011-01-01

    This thesis consists of two parts. In Chapter 2, we focus on the spatial scan statistics with overdispersion and Chapter 3 is devoted to the randomized permutation test for identifying local patterns of spatial association. The spatial scan statistic has been widely used in spatial disease surveillance and spatial cluster detection. To apply it, a…

  13. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    PubMed

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense: 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple-QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a software tool widely used among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, and performs 2×10^5 permutations for a 2D QTL problem in 15 hours, using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.

  14. A one-time pad color image cryptosystem based on SHA-3 and multiple chaotic systems

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Wang, Siwei; Zhang, Yingqian; Luo, Chao

    2018-04-01

    A novel image encryption algorithm is proposed that combines the SHA-3 hash function and two chaotic systems: the hyper-chaotic Lorenz and Chen systems. First, 384 bit keystream hash values are obtained by applying SHA-3 to plaintext. The sensitivity of the SHA-3 algorithm and chaotic systems ensures the effect of a one-time pad. Second, the color image is expanded into three-dimensional space. During permutation, it undergoes plane-plane displacements in the x, y and z dimensions. During diffusion, we use the adjacent pixel dataset and corresponding chaotic value to encrypt each pixel. Finally, the structure of alternating between permutation and diffusion is applied to enhance the level of security. Furthermore, we design techniques to improve the algorithm's encryption speed. Our experimental simulations show that the proposed cryptosystem achieves excellent encryption performance and can resist brute-force, statistical, and chosen-plaintext attacks.
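
    The first step described above, deriving a 384-bit keystream from the plaintext with SHA-3, can be reproduced with the Python standard library; how the digest is split into chaotic initial conditions below is an illustrative assumption, not the paper's exact key schedule.

```python
import hashlib
import numpy as np

def sha3_keystream(image_bytes):
    """Hash the plain image with SHA3-384 and split the 384-bit digest into
    six 64-bit words mapped to [0, 1) as initial conditions (an illustrative
    key schedule, not the paper's exact construction)."""
    digest = hashlib.sha3_384(image_bytes).digest()           # 48 bytes = 384 bits
    words = np.frombuffer(digest, dtype=np.uint64)            # six 64-bit words
    return words.astype(np.float64) / np.float64(2**64)       # six values in [0, 1)

# One-time-pad behaviour: changing a single pixel changes every derived value.
img = np.arange(256 * 256, dtype=np.uint8).reshape(256, 256)
img2 = img.copy()
img2[0, 0] ^= 1
print(sha3_keystream(img.tobytes()))
print(sha3_keystream(img2.tobytes()))
```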

  15. Image encryption using a synchronous permutation-diffusion technique

    NASA Astrophysics Data System (ADS)

    Enayatifar, Rasul; Abdullah, Abdul Hanan; Isnin, Ismail Fauzi; Altameem, Ayman; Lee, Malrey

    2017-03-01

    In the past decade, interest in digital image security has increased among scientists. A synchronous permutation and diffusion technique is designed to protect gray-level image content while sending it over the internet. To implement the proposed method, the two-dimensional plain image is converted to one dimension. Afterward, in order to reduce the processing time, the permutation and diffusion steps for each pixel are performed at the same time. The permutation step uses a chaotic map and deoxyribonucleic acid (DNA) to permute a pixel, while diffusion employs a DNA sequence and a DNA operator to encrypt the pixel. Experimental results and extensive security analyses have been conducted to demonstrate the feasibility and validity of the proposed image encryption method.
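
    The "synchronous" idea, permuting and diffusing each pixel in the same pass rather than in separate rounds, can be shown with a simple keystream. The logistic-map keystream and XOR diffusion below stand in for the paper's DNA-based operations, so this is a structural sketch only.

```python
import numpy as np

def logistic_stream(x0, n, r=3.99):
    """Chaotic keystream from the logistic map (illustrative key source)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return xs

def encrypt(img, x0=0.3719):
    flat = img.flatten()
    n = flat.size
    ks = logistic_stream(x0, n)
    order = np.argsort(ks)                   # permutation positions from the keystream
    keys = np.floor(ks * 256).astype(np.uint8)
    cipher = np.zeros(n, dtype=np.uint8)
    prev = np.uint8(0)
    for i in range(n):                       # permute and diffuse in the same pass
        p = flat[order[i]]                   # pixel pulled from its permuted position
        c = np.uint8(p ^ keys[i] ^ prev)     # diffusion chains in the previous cipher byte
        cipher[i] = c
        prev = c
    return cipher.reshape(img.shape), order

rng = np.random.default_rng(0)
gray = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
enc, order = encrypt(gray)
print(gray.mean(), enc.mean())               # ciphertext histogram flattens toward 127.5
```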

  16. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting.

    PubMed

    Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F

    2010-07-19

    A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing logodds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic.

  17. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    PubMed Central

    2010-01-01

    Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing logodds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic. PMID:20642827

  18. Storage and computationally efficient permutations of factorized covariance and square-root information arrays

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector stored Upper triangular Diagonal factorized covariance and vector stored upper triangular Square Root Information arrays is presented. The method involves cyclic permutation of the rows and columns of the arrays and retriangularization with fast (slow) Givens rotations (reflections). Minimal computation is performed, and a one dimensional scratch array is required. To make the method efficient for large arrays on a virtual memory machine, computations are arranged so as to avoid expensive paging faults. This method is potentially important for processing large volumes of radio metric data in the Deep Space Network.
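
    The operation this record optimizes, permuting the rows and columns of a UD-factorized covariance and restoring triangular form, can be stated naively as "permute, then refactorize". The sketch below does exactly that with a small UDU^T routine; it is the baseline that the cyclic-permutation-plus-Givens approach improves upon in storage and computation.

```python
import numpy as np

def udu(P):
    """Factor a symmetric positive-definite P as U @ diag(d) @ U.T,
    with U unit upper triangular (the 'UD' form used in covariance filtering)."""
    P = np.array(P, dtype=float)
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j]
        U[:j, j] = P[:j, j] / d[j]
        P[:j, :j] -= d[j] * np.outer(U[:j, j], U[:j, j])
    return U, d

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
P = A @ A.T + 5 * np.eye(5)                  # a well-conditioned covariance
U, d = udu(P)
print(np.allclose(U @ np.diag(d) @ U.T, P))  # True

# Naive permutation: reorder rows/columns of P, then refactorize from scratch.
perm = [2, 0, 4, 1, 3]
E = np.eye(5)[perm]                          # permutation matrix
P_perm = E @ P @ E.T
U2, d2 = udu(P_perm)
print(np.allclose(U2 @ np.diag(d2) @ U2.T, P_perm))  # True
```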

  19. Influence of the input database in detecting fire space-time clusters

    NASA Astrophysics Data System (ADS)

    Pereira, Mário; Costa, Ricardo; Tonini, Marj; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Fire incidence variability is influenced by local environmental variables such as topography, land use, vegetation and weather conditions. These induce a clustered pattern in the distribution of fire events. The space-time permutation scan statistic (STPSS) method developed by Kulldorff et al. (2005) and implemented in the SaTScan™ software (http://www.satscan.org/) has proved able to detect space-time clusters in many different fields, even when using incomplete and/or inaccurate input data. Nevertheless, the dependence of the STPSS method on the different characteristics of different datasets describing the same environmental phenomenon has not been studied yet. In this sense, the objective of this study is to assess the robustness of the STPSS for detecting real clusters using different input datasets and to justify the obtained results. This study takes advantage of the existence of two very different official fire datasets currently available for Portugal, both provided by the Institute for the Conservation of Nature and Forests. The first one is the aggregated Portuguese Rural Fire Database PRFD (Pereira et al., 2011), which is based on ground measurements and provides detailed information about the ignition and extinction date/time and the area burnt by each fire in forest, scrubs and agricultural areas. However, in the PRFD the location of each fire is indicated only by the name of the smallest administrative unit (the parish) where the ignition occurred. Consequently, since the application of the STPSS requires the geographic coordinates of the events, the parish centroids were used. The second fire dataset is the national mapping of burnt areas (NMBA), which is based on satellite measurements and delivered in shapefile format. The NMBA provides detailed spatial information (shape and size of each fire) but the temporal information is restricted to the year of occurrence. Besides these differences, the two datasets cover different periods and comprise quite different numbers of fire records and lower fire-size thresholds. Therefore, it was necessary to restrict both databases to a common period and fire size range. In addition, the weather conditions during the temporal extent of the most important detected clusters were investigated since they are often very well correlated with fire incidence. Composite analysis was used to identify and characterize the synoptic patterns of large scale climatic and dynamical meteorological fields at different levels of the atmosphere. Kulldorff, M., Heffernan, R., Hartman, J., Assunção, R., Mostashari, F., 2005. A Space-Time Permutation Scan Statistic for Disease Outbreak Detection. PLoS medicine. 2(3), 216-224. http://dx.doi.org/10.1371/journal.pmed.0020059. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I., 2011. The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, http://dx.doi.org/10.5194/nhess-11-3343-2011. 
This work was supported by national funds by FCT - Portuguese Foundation for Science and Technology, under the project PEst-OE/AGR/UI4033/2014 and by the project SUSTAINSYS: Environmental Sustainable Agro-Forestry Systems (NORTE-07-0124-FEDER-000044), financed by the North Portugal Regional Operational Programme (ON.2 - O Novo Norte), under the National Strategic Reference Framework (QREN), through the European Regional Development Fund (FEDER), as well as by National Funds (PIDDAC) through the Portuguese Foundation for Science and Technology (FCT/MEC).
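
    A hedged sketch of preparing SaTScan-style input from a parish-level fire record, reflecting the workaround described above (parish centroids standing in for event coordinates). The file names, column names and output layouts are assumptions for illustration; the exact case and coordinate file formats should be taken from the SaTScan user guide.

```python
import geopandas as gpd
import pandas as pd

# hypothetical inputs: parish polygons (with a PARISH id column and a projected CRS)
# and a PRFD-style fire table with PARISH and date columns
parishes = gpd.read_file("parishes.shp")
centroids = parishes.geometry.centroid.to_crs(epsg=4326)   # assumes the shapefile declares a CRS
parishes["lat"], parishes["lon"] = centroids.y, centroids.x

fires = pd.read_csv("fires.csv", parse_dates=["date"])

# case file: one line per location and date with a case count
# (adjust the date formatting to whatever your SaTScan version expects)
cases = fires.groupby(["PARISH", "date"]).size().reset_index(name="cases")
cases[["PARISH", "cases", "date"]].to_csv("fires.cas", sep=" ", header=False, index=False)

# coordinates file: location id, latitude, longitude
parishes[["PARISH", "lat", "lon"]].to_csv("parishes.geo", sep=" ", header=False, index=False)
```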

  20. Influenza Surveillance and Incidence in a Rural Area in China during the 2009/2010 Influenza Pandemic

    PubMed Central

    Zhang, Ying; Li, Lin; Dong, Xiaochun; Kong, Mei; Gao, Lu; Dong, Xiaojing; Xu, Wenti

    2014-01-01

    Background Most influenza surveillance is based on data from urban sentinel hospitals; little is known about influenza activity in rural communities. We conducted influenza surveillance in a rural region of China with the aim of detecting influenza activity in the 2009/2010 influenza season. Methods The study was conducted from October 2009 to March 2010. Real-time polymerase chain reaction was used to confirm influenza cases. Over-the-counter (OTC) drug sales were collected daily in drugstores and hospitals/clinics. Space-time scan statistics were used to identify clusters of ILI in the community. The incidence rate of ILI/influenza was estimated on the basis of the number of ILI/influenza cases detected by the hospitals/clinics. Results A total of 434 ILI cases (3.88% of all consultations) were reported; 64.71% of these cases were influenza A (H1N1) pdm09. The estimated incidence rates of ILI and influenza were 5.19/100 and 0.40/100, respectively. The numbers of ILI cases and OTC drug purchases in the previous 7 days were strongly correlated (Spearman rank correlation coefficient [r] = 0.620, P = 0.001). Four ILI outbreaks were detected by space-time permutation analysis. Conclusions This rural community surveillance detected influenza A (H1N1) pdm09 activity and outbreaks in the 2009/2010 influenza season and enabled estimation of the incidence rate of influenza. It also provides scientific data to inform public health measures. PMID:25542003
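
    A minimal sketch of the reported correlation check, relating daily ILI counts to over-the-counter drug sales summed over the previous seven days. The two series are simulated stand-ins for the surveillance data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
days = 180
# simulated daily ILI counts with a late-season rise, and OTC sales that track them
ili = rng.poisson(3, size=days) + np.r_[np.zeros(100), np.linspace(0, 6, 80)].astype(int)
otc = 20 + 5 * np.convolve(ili, np.ones(3), mode="same") + rng.normal(0, 5, size=days)

# OTC purchases over the previous 7 days (window ending the day before)
otc_prev7 = np.array([otc[max(0, t - 7):t].sum() for t in range(days)])

rho, p = spearmanr(ili[7:], otc_prev7[7:])
print(f"Spearman r = {rho:.3f}, P = {p:.4f}")
```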

  1. Sampling solution traces for the problem of sorting permutations by signed reversals

    PubMed Central

    2012-01-01

    Background Traditional algorithms to solve the problem of sorting by signed reversals output just one optimal solution while the space of all optimal solutions can be huge. A so-called trace represents a group of solutions which share the same set of reversals that must be applied to sort the original permutation following a partial ordering. By using traces, we therefore can represent the set of optimal solutions in a more compact way. Algorithms for enumerating the complete set of traces of solutions were developed. However, due to their exponential complexity, their practical use is limited to small permutations. A partial enumeration of traces is a sampling of the complete set of traces and can be an alternative for the study of distinct evolutionary scenarios of big permutations. Ideally, the sampling should be done uniformly from the space of all optimal solutions. This is however conjectured to be ♯P-complete. Results We propose and evaluate three algorithms for producing a sampling of the complete set of traces that instead can be shown in practice to preserve some of the characteristics of the space of all solutions. The first algorithm (RA) performs the construction of traces through a random selection of reversals on the list of optimal 1-sequences. The second algorithm (DFALT) consists in a slight modification of an algorithm that performs the complete enumeration of traces. Finally, the third algorithm (SWA) is based on a sliding window strategy to improve the enumeration of traces. All proposed algorithms were able to enumerate traces for permutations with up to 200 elements. Conclusions We analysed the distribution of the enumerated traces with respect to their height and average reversal length. Various works indicate that the reversal length can be an important aspect in genome rearrangements. The algorithms RA and SWA show a tendency to lose traces with high average reversal length. Such traces are however rare, and qualitatively our results show that, for testable-sized permutations, the algorithms DFALT and SWA produce distributions which approximate the reversal length distributions observed with a complete enumeration of the set of traces. PMID:22704580

  2. Set-Based Discrete Particle Swarm Optimization Based on Decomposition for Permutation-Based Multiobjective Combinatorial Optimization Problems.

    PubMed

    Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun

    2018-07-01

    This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many commonly seen MOCOPs, e.g., multiobjective traveling salesman problem (MOTSP), multiobjective project scheduling problem (MOPSP), belong to this problem class and they can be very different. However, as the permutation-based MOCOPs share the inherent similarity that the structure of their search space is usually in the shape of a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. In order to coordinate with the property of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach. Through this, feasible solutions under constraints can be generated step by step following the permutation-tree-shaped structure. And problem-related heuristic information is introduced in the constructive approach for efficiency. In order to address the multiobjective optimization issues, the decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. Besides, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.

  3. Brain structural plasticity with spaceflight.

    PubMed

    Koppelmans, Vincent; Bloomberg, Jacob J; Mulavara, Ajitkumar P; Seidler, Rachael D

    2016-01-01

    Humans undergo extensive sensorimotor adaptation during spaceflight due to altered vestibular inputs and body unloading. No studies have yet evaluated the effects of spaceflight on human brain structure despite the fact that recently reported optic nerve structural changes are hypothesized to occur due to increased intracranial pressure occurring with microgravity. This is the first report on human brain structural changes with spaceflight. We evaluated retrospective longitudinal T2-weighted MRI scans and balance data from 27 astronauts (thirteen ~2-week shuttle crew members and fourteen ~6-month International Space Station crew members) to determine spaceflight effects on brain structure, and whether any pre to postflight brain changes are associated with balance changes. Data were obtained from the NASA Lifetime Surveillance of Astronaut Health. Brain scans were segmented into gray matter maps and normalized into MNI space using a stepwise approach through subject specific templates. Non-parametric permutation testing was used to analyze pre to postflight volumetric gray matter changes. We found extensive volumetric gray matter decreases, including large areas covering the temporal and frontal poles and around the orbits. This effect was larger in International Space Station versus shuttle crew members in some regions. There were bilateral focal gray matter increases within the medial primary somatosensory and motor cortex; i.e., the cerebral areas where the lower limbs are represented. These intriguing findings are observed in a retrospective data set; future prospective studies should probe the underlying mechanisms and behavioral consequences.
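
    A minimal sketch of one standard form of the non-parametric permutation testing mentioned above: for paired pre/post maps, the null distribution is built by randomly flipping the sign of each subject's difference map, with a max-statistic step for family-wise error control. Data shapes are simulated; registration, smoothing and the exact statistic used in the study are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_vox = 27, 5000
diff = rng.normal(0, 1, size=(n_subj, n_vox))   # pre-to-post difference maps
diff[:, :200] -= 0.8                            # simulate a region of gray matter decrease

t_obs = diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(n_subj))

n_perm = 1000
max_null = np.empty(n_perm)
for i in range(n_perm):
    signs = rng.choice([-1.0, 1.0], size=(n_subj, 1))        # flip subjects at random
    d = diff * signs
    t = d.mean(axis=0) / (d.std(axis=0, ddof=1) / np.sqrt(n_subj))
    max_null[i] = np.abs(t).max()                            # max-statistic for FWE control

# family-wise-error corrected p-value per voxel
p_fwe = (1 + (max_null[:, None] >= np.abs(t_obs)[None, :]).sum(axis=0)) / (n_perm + 1)
print("voxels significant at p_FWE < 0.05:", int((p_fwe < 0.05).sum()))
```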

  4. [The application of the prospective space-time statistic in early warning of infectious disease].

    PubMed

    Yin, Fei; Li, Xiao-Song; Feng, Zi-Jian; Ma, Jia-Qi

    2007-06-01

    To investigate the application of the prospective space-time scan statistic to the early detection of infectious disease outbreaks. The prospective space-time scan statistic was tested by mimicking daily prospective analyses of bacillary dysentery data of Chengdu city in 2005 (3212 cases in 102 towns and villages), and the results were compared with those of the purely temporal scan statistic. The prospective space-time scan statistic could give specific warning messages in both space and time. The results for June indicated that the prospective space-time scan statistic could promptly detect outbreaks that started from a local site, and the early warning signal was strong (P = 0.007), whereas the purely temporal scan statistic detected the outbreak two days later and with a weaker signal (P = 0.039). The prospective space-time scan statistic makes full use of the spatial and temporal information in infectious disease data and can detect outbreaks that start from local sites in a timely and effective manner. It could be an important tool for local and national CDC to set up early detection surveillance systems.

  5. Spatio-temporal epidemiology of the cholera outbreak in Papua New Guinea, 2009-2011.

    PubMed

    Horwood, Paul F; Karl, Stephan; Mueller, Ivo; Jonduo, Marinjho H; Pavlin, Boris I; Dagina, Rosheila; Ropa, Berry; Bieb, Sibauk; Rosewell, Alexander; Umezaki, Masahiro; Siba, Peter M; Greenhill, Andrew R

    2014-08-20

    Cholera continues to be a devastating disease in many developing countries where inadequate safe water supply and poor sanitation facilitate spread. From July 2009 until late 2011 Papua New Guinea experienced the first outbreak of cholera recorded in the country, resulting in >15,500 cases and >500 deaths. Using the national cholera database, we analysed the spatio-temporal distribution and clustering of the Papua New Guinea cholera outbreak. The Kulldorff space-time permutation scan statistic, contained in the software package SatScan v9.2 was used to describe the first 8 weeks of the outbreak in Morobe Province before cholera cases spread throughout other regions of the country. Data were aggregated at the provincial level to describe the spread of the disease to other affected provinces. Spatio-temporal and cluster analyses revealed that the outbreak was characterized by three distinct phases punctuated by explosive propagation of cases when the outbreak spread to a new region. The lack of road networks across most of Papua New Guinea is likely to have had a major influence on the slow spread of the disease during this outbreak. Identification of high risk areas and the likely mode of spread can guide government health authorities to formulate public health strategies to mitigate the spread of the disease through education campaigns, vaccination, increased surveillance in targeted areas and interventions to improve water, sanitation and hygiene.

  6. A flexibly shaped space-time scan statistic for disease outbreak detection and monitoring.

    PubMed

    Takahashi, Kunihiko; Kulldorff, Martin; Tango, Toshiro; Yih, Katherine

    2008-04-11

    Early detection of disease outbreaks enables public health officials to implement disease control and prevention measures at the earliest possible time. A time periodic geographical disease surveillance system based on a cylindrical space-time scan statistic has been used extensively for disease surveillance along with the SaTScan software. In the purely spatial setting, many different methods have been proposed to detect spatial disease clusters. In particular, some spatial scan statistics are aimed at detecting irregularly shaped clusters which may not be detected by the circular spatial scan statistic. Based on the flexible purely spatial scan statistic, we propose a flexibly shaped space-time scan statistic for early detection of disease outbreaks. The performance of the proposed space-time scan statistic is compared with that of the cylindrical scan statistic using benchmark data. In order to compare their performances, we have developed a space-time power distribution by extending the purely spatial bivariate power distribution. Daily syndromic surveillance data in Massachusetts, USA, are used to illustrate the proposed test statistic. The flexible space-time scan statistic is well suited for detecting and monitoring disease outbreaks in irregularly shaped areas.

  7. 3D PATTERN OF BRAIN ABNORMALITIES IN FRAGILE X SYNDROME VISUALIZED USING TENSOR-BASED MORPHOMETRY

    PubMed Central

    Lee, Agatha D.; Leow, Alex D.; Lu, Allen; Reiss, Allan L.; Hall, Scott; Chiang, Ming-Chang; Toga, Arthur W.; Thompson, Paul M.

    2007-01-01

    Fragile X syndrome (FraX), a genetic neurodevelopmental disorder, results in impaired cognition with particular deficits in executive function and visuo-spatial skills. Here we report the first detailed 3D maps of the effects of the Fragile X mutation on brain structure, using tensor-based morphometry. TBM visualizes structural brain deficits automatically, without time-consuming specification of regions-of-interest. We compared 36 subjects with FraX (age: 14.66 ± 1.58 SD, 18 females/18 males), and 33 age-matched healthy controls (age: 14.67 ± 2.2 SD, 17 females/16 males), using high-dimensional elastic image registration. All 69 subjects' 3D T1-weighted brain MRIs were spatially deformed to match a high-resolution single-subject average MRI scan in ICBM space, whose geometry was optimized to produce a minimal deformation target. Maps of the local Jacobian determinant (expansion factor) were computed from the deformation fields. Statistical maps showed increased caudate (10% higher; p=0.001) and lateral ventricle volumes (19% higher; p=0.003), and trend-level parietal and temporal white matter excesses (10% higher locally; p=0.04). In affected females, volume abnormalities correlated with reduction in systemically measured levels of the fragile X mental retardation protein (FMRP; Spearman's r < −0.5 locally). Decreased FMRP correlated with ventricular expansion (p=0.042; permutation test), and anterior cingulate tissue reductions (p=0.0026; permutation test) supporting theories that FMRP is required for normal dendritic pruning in fronto-striatal-limbic pathways. No sex differences were found; findings were confirmed using traditional volumetric measures in regions of interest. Deficit patterns were replicated using Lie group statistics optimized for tensor-valued data. Investigation of how these anomalies emerge over time will accelerate our understanding of FraX and its treatment. PMID:17161622
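
    A minimal sketch of the core tensor-based morphometry quantity used above: the voxelwise Jacobian determinant (local expansion factor) of a deformation field, computed here by finite differences on a synthetic field.

```python
import numpy as np

nx, ny, nz = 32, 32, 32
x, y, z = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz), indexing="ij")

# deformation phi(x) = x + u(x): a smooth synthetic displacement u
u = 0.5 * np.stack([np.sin(2 * np.pi * x / nx),
                    np.cos(2 * np.pi * y / ny),
                    np.zeros_like(z, dtype=float)], axis=0)
phi = np.stack([x, y, z], axis=0).astype(float) + u        # shape (3, nx, ny, nz)

# Jacobian matrix J[i, j] = d phi_i / d x_j at every voxel, via finite differences
grads = [np.gradient(phi[i], axis=(0, 1, 2)) for i in range(3)]
J = np.empty((nx, ny, nz, 3, 3))
for i in range(3):
    for j in range(3):
        J[..., i, j] = grads[i][j]

det = np.linalg.det(J)          # >1: local expansion, <1: local contraction
print("mean Jacobian determinant:", det.mean())
```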

  8. Space-time clusters for early detection of grizzly bear predation.

    PubMed

    Kermish-Wells, Joseph; Massolo, Alessandro; Stenhouse, Gordon B; Larsen, Terrence A; Musiani, Marco

    2018-01-01

    Accurate detection and classification of predation events is important to determine predation and consumption rates by predators. However, obtaining this information for large predators is constrained by the speed at which carcasses disappear and the cost of field data collection. To accurately detect predation events, researchers have used GPS collar technology combined with targeted site visits. However, kill sites are often investigated well after the predation event due to limited data retrieval options on GPS collars (VHF or UHF downloading) and to ensure crew safety when working with large predators. This can lead to missing information from small-prey (including young ungulates) kill sites due to scavenging and general site deterioration (e.g., vegetation growth). We used a space-time permutation scan statistic (STPSS) clustering method (SaTScan) to detect predation events of grizzly bears ( Ursus arctos ) fitted with satellite transmitting GPS collars. We used generalized linear mixed models to verify predation events and the size of carcasses using spatiotemporal characteristics as predictors. STPSS uses a probability model to compare expected cluster size (space and time) with the observed size. We applied this method retrospectively to data from 2006 to 2007 to compare our method to random GPS site selection. In 2013-2014, we applied our detection method to visit sites one week after their occupation. Both datasets were collected in the same study area. Our approach detected 23 of 27 predation sites verified by visiting 464 random grizzly bear locations in 2006-2007, 187 of which were within space-time clusters and 277 outside. Predation site detection increased by 2.75 times (54 predation events of 335 visited clusters) using 2013-2014 data. Our GLMMs showed that cluster size and duration predicted predation events and carcass size with high sensitivity (0.72 and 0.94, respectively). Coupling GPS satellite technology with clusters using a program based on space-time probability models allows for prompt visits to predation sites. This enables accurate identification of the carcass size and increases fieldwork efficiency in predation studies.
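
    A minimal sketch of the second modelling step described above, predicting whether a GPS cluster is a predation site from its spatiotemporal characteristics. The paper fits generalized linear mixed models; for brevity this sketch uses plain logistic regression on simulated cluster radius and duration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
n = 300
radius = rng.gamma(shape=2.0, scale=60.0, size=n)           # cluster size (m), simulated
duration = rng.gamma(shape=2.0, scale=8.0, size=n)          # hours spent at the cluster
logit = -4.0 + 0.01 * radius + 0.15 * duration              # longer stays -> more likely a kill
is_kill = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([radius, duration])
model = LogisticRegression(max_iter=1000).fit(X, is_kill)
sensitivity = recall_score(is_kill, model.predict(X))        # proportion of kills detected
print("coefficients:", model.coef_, "sensitivity:", round(sensitivity, 2))
```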

  9. The coupling analysis between stock market indices based on permutation measures

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Xia, Jianan; Yeh, Chien-Hung

    2016-04-01

    Many information-theoretic methods have been proposed for analyzing the coupling dependence between time series. And it is significant to quantify the correlation relationship between financial sequences since the financial market is a complex evolved dynamic system. Recently, we developed a new permutation-based entropy, called cross-permutation entropy (CPE), to detect the coupling structures between two synchronous time series. In this paper, we extend the CPE method to weighted cross-permutation entropy (WCPE), to address some of CPE's limitations, mainly its inability to differentiate between distinct patterns of a certain motif and the sensitivity of patterns close to the noise floor. It shows more stable and reliable results than CPE does when applied it to spiky data and AR(1) processes. Besides, we adapt the CPE method to infer the complexity of short-length time series by freely changing the time delay, and test it with Gaussian random series and random walks. The modified method shows the advantages in reducing deviations of entropy estimation compared with the conventional one. Finally, the weighted cross-permutation entropy of eight important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.

  10. An analog scrambler for speech based on sequential permutations in time and frequency

    NASA Astrophysics Data System (ADS)

    Cox, R. V.; Jayant, N. S.; McDermott, B. J.

    Permutation of speech segments is an operation that is frequently used in the design of scramblers for analog speech privacy. In this paper, a sequential procedure for segment permutation is considered. This procedure can be extended to two dimensional permutation of time segments and frequency bands. By subjective testing it is shown that this combination gives a residual intelligibility for spoken digits of 20 percent with a delay of 256 ms. (A lower bound for this test would be 10 percent). The complexity of implementing such a system is considered and the issues of synchronization and channel equalization are addressed. The computer simulation results for the system using both real and simulated channels are examined.
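
    A minimal sketch of the two-dimensional permutation idea described above: the signal is cut into short time segments, each segment's spectrum is split into frequency bands, and both dimensions are permuted with a keyed pseudo-random permutation. Segment length, band count and key handling are illustrative choices, not the paper's parameters.

```python
import numpy as np

def scramble(x, seg_len, n_bands, key):
    """Scramble x by permuting time segments and, within each segment's spectrum,
    permuting equal-width frequency bands (both permutations keyed by `key`)."""
    rng = np.random.default_rng(key)
    n_seg = len(x) // seg_len
    segs = x[: n_seg * seg_len].reshape(n_seg, seg_len).astype(float)

    spec = np.fft.rfft(segs, axis=1)
    band_len = spec.shape[1] // n_bands
    head = spec[:, : n_bands * band_len].reshape(n_seg, n_bands, band_len)
    tail = spec[:, n_bands * band_len:]                     # leftover bins, left in place

    time_perm = rng.permutation(n_seg)                      # permute time segments
    band_perm = rng.permutation(n_bands)                    # permute frequency bands
    mixed = head[time_perm][:, band_perm, :]
    out = np.concatenate([mixed.reshape(n_seg, -1), tail[time_perm]], axis=1)
    return np.fft.irfft(out, n=seg_len, axis=1).ravel(), (time_perm, band_perm)

# 1 s of a synthetic "speech-like" signal at 8 kHz, 32 ms segments, 8 bands
fs = 8000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 300 * t) * (1 + 0.5 * np.sin(2 * np.pi * 3 * t))
scrambled, perms = scramble(signal, seg_len=256, n_bands=8, key=1234)
```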

  11. Permutation glass.

    PubMed

    Williams, Mobolaji

    2018-01-01

    The field of disordered systems in statistical physics provides many simple models in which the competing influences of thermal and nonthermal disorder lead to new phases and nontrivial thermal behavior of order parameters. In this paper, we add a model to the subject by considering a disordered system where the state space consists of various orderings of a list. As in spin glasses, the disorder of such "permutation glasses" arises from a parameter in the Hamiltonian being drawn from a distribution of possible values, thus allowing nominally "incorrect orderings" to have lower energies than "correct orderings" in the space of permutations. We analyze a Gaussian, uniform, and symmetric Bernoulli distribution of energy costs, and, by employing Jensen's inequality, derive a simple condition requiring the permutation glass to always transition to the correctly ordered state at a temperature lower than that of the nondisordered system, provided that this correctly ordered state is accessible. We in turn find that in order for the correctly ordered state to be accessible, the probability that an incorrectly ordered component is energetically favored must be less than the inverse of the number of components in the system. We show that all of these results are consistent with a replica symmetric ansatz of the system. We conclude by arguing that there is no distinct permutation glass phase for the simplest model considered here and by discussing how to extend the analysis to more complex Hamiltonians capable of novel phase behavior and replica symmetry breaking. Finally, we outline an apparent correspondence between the presented system and a discrete-energy-level fermion gas. In all, the investigation introduces a class of exactly soluble models into statistical mechanics and provides a fertile ground to investigate statistical models of disorder.

  12. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2017-07-01

    Microseismic monitoring is an effective means for providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, although low SNR microseismic signals often cannot effectively be detected by routine methods. To solve this problem, this paper presents permutation entropy and a support vector machine to detect low SNR microseismic events. First, an extraction method of signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, a detection model for low SNR microseismic events based on the least squares support vector machine is built by computing multi-scale permutation entropy for the collected vibration signals and constructing a feature vector set from the signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment proves that the different characteristics of the two can be fully expressed by using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which offers high classification accuracy and fast real-time operation, can meet the requirements of online, real-time extraction of microseismic events.
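
    A minimal sketch of the feature pipeline this record describes: multi-scale permutation entropy of each vibration window as the feature vector, fed to a support vector machine. The paper uses a least-squares SVM; the standard sklearn SVC stands in here, and the signals are simulated.

```python
import numpy as np
from math import factorial
from itertools import permutations
from sklearn.svm import SVC

def permutation_entropy(x, m=4, delay=1):
    """Normalized permutation entropy of order m."""
    counts = {p: 0 for p in permutations(range(m))}
    n = len(x) - (m - 1) * delay
    for i in range(n):
        window = x[i: i + m * delay: delay]
        counts[tuple(np.argsort(window).tolist())] += 1     # ordinal pattern of the window
    p = np.array([c for c in counts.values() if c > 0], dtype=float) / n
    return -(p * np.log(p)).sum() / np.log(factorial(m))

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def mpe_features(x, scales=(1, 2, 3, 4), m=4):
    return np.array([permutation_entropy(coarse_grain(x, s), m) for s in scales])

rng = np.random.default_rng(0)
def make_signal(event):
    noise = rng.normal(0, 1, 1200)
    if event:                                       # add a short damped oscillation
        t = np.arange(400)
        noise[400:800] += 3 * np.exp(-t / 120) * np.sin(2 * np.pi * t / 25)
    return noise

labels = [1] * 40 + [0] * 40
X = np.array([mpe_features(make_signal(lbl)) for lbl in labels])
y = np.array(labels)
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```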

  13. Visual recognition of permuted words

    NASA Astrophysics Data System (ADS)

    Rashid, Sheikh Faisal; Shafait, Faisal; Breuel, Thomas M.

    2010-02-01

    In current study we examine how letter permutation affects in visual recognition of words for two orthographically dissimilar languages, Urdu and German. We present the hypothesis that recognition or reading of permuted and non-permuted words are two distinct mental level processes, and that people use different strategies in handling permuted words as compared to normal words. A comparison between reading behavior of people in these languages is also presented. We present our study in context of dual route theories of reading and it is observed that the dual-route theory is consistent with explanation of our hypothesis of distinction in underlying cognitive behavior for reading permuted and non-permuted words. We conducted three experiments in lexical decision tasks to analyze how reading is degraded or affected by letter permutation. We performed analysis of variance (ANOVA), distribution free rank test, and t-test to determine the significance differences in response time latencies for two classes of data. Results showed that the recognition accuracy for permuted words is decreased 31% in case of Urdu and 11% in case of German language. We also found a considerable difference in reading behavior for cursive and alphabetic languages and it is observed that reading of Urdu is comparatively slower than reading of German due to characteristics of cursive script.

  14. Sorting signed permutations by inversions in O(n log n) time.

    PubMed

    Swenson, Krister M; Rajan, Vaibhav; Lin, Yu; Moret, Bernard M E

    2010-03-01

    The study of genomic inversions (or reversals) has been a mainstay of computational genomics for nearly 20 years. After the initial breakthrough of Hannenhalli and Pevzner, who gave the first polynomial-time algorithm for sorting signed permutations by inversions, improved algorithms have been designed, culminating with an optimal linear-time algorithm for computing the inversion distance and a subquadratic algorithm for providing a shortest sequence of inversions--also known as sorting by inversions. Remaining open was the question of whether sorting by inversions could be done in O(n log n) time. In this article, we present a qualified answer to this question, by providing two new sorting algorithms, a simple and fast randomized algorithm and a deterministic refinement. The deterministic algorithm runs in time O(n log n + kn), where k is a data-dependent parameter. We provide the results of extensive experiments showing that both the average and the standard deviation for k are small constants, independent of the size of the permutation. We conclude (but do not prove) that almost all signed permutations can be sorted by inversions in O(n log n) time.
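
    A minimal sketch of two bookkeeping primitives used in reversal analysis of signed permutations, applying a signed reversal and counting breakpoints. The O(n log n) sorting algorithm of the paper relies on far more elaborate data structures, which are not reproduced here.

```python
def reverse(perm, i, j):
    """Reverse the segment perm[i..j] (0-based, inclusive) and flip its signs."""
    return perm[:i] + [-x for x in reversed(perm[i:j + 1])] + perm[j + 1:]

def breakpoints(perm):
    """Count broken adjacencies, with the usual 0 and n+1 framing elements."""
    ext = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if b - a != 1)

p = [3, -1, 2, -4]
print(breakpoints(p))                 # breakpoints of the original permutation
p = reverse(p, 1, 2)                  # apply one signed reversal
print(p, breakpoints(p))
```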

  15. Permuting input for more effective sampling of 3D conformer space

    NASA Astrophysics Data System (ADS)

    Carta, Giorgio; Onnis, Valeria; Knox, Andrew J. S.; Fayne, Darren; Lloyd, David G.

    2006-03-01

    SMILES strings and other classic 2D structural formats offer a convenient way to represent molecules as a simplistic connection table, with the inherent advantages of ease of handling and storage. In the context of virtual screening, chemical databases to be screened are often initially represented by canonicalised SMILES strings that can be filtered and pre-processed in a number of ways, resulting in molecules that occupy similar regions of chemical space to active compounds of a therapeutic target. A wide variety of software exists to convert molecules into SMILES format, namely, Mol2smi (Daylight Inc.), MOE (Chemical Computing Group) and Babel (Openeye Scientific Software). Depending on the algorithm employed, the atoms of a SMILES string defining a molecule can be ordered differently. Upon conversion to 3D coordinates, these orderings yield ostensibly the same molecule. In this work we show how different permutations of a SMILES string can affect conformer generation, affecting the reliability and repeatability of the results. Furthermore, we propose a novel procedure for conformer generation that takes advantage of permutations of the input strings (both SMILES and other 2D formats), leading to more effective sampling of conformational space in the output, and that adds a fingerprint and principal component analysis step to post-process and visualise the results.
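
    A hedged sketch of the core idea, written against the RDKit toolkit (an assumption; it is not the software used in the paper): the same molecule written with differently ordered atoms can seed different conformer ensembles.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem

rng = np.random.default_rng(0)
smiles = "CC(=O)Oc1ccccc1C(=O)O"            # aspirin, as an example input

mol = Chem.MolFromSmiles(smiles)
conformer_counts = []
for _ in range(5):
    # permute the atom order, write a non-canonical SMILES, and re-embed
    order = [int(i) for i in rng.permutation(mol.GetNumAtoms())]
    permuted = Chem.RenumberAtoms(mol, order)
    variant = Chem.AddHs(Chem.MolFromSmiles(Chem.MolToSmiles(permuted, canonical=False)))
    cids = AllChem.EmbedMultipleConfs(variant, numConfs=20, randomSeed=42)
    conformer_counts.append(len(cids))

print("conformers generated per permuted input:", conformer_counts)
```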

  16. Permutation entropy of fractional Brownian motion and fractional Gaussian noise

    NASA Astrophysics Data System (ADS)

    Zunino, L.; Pérez, D. G.; Martín, M. T.; Garavaglia, M.; Plastino, A.; Rosso, O. A.

    2008-06-01

    We have worked out theoretical curves for the permutation entropy of the fractional Brownian motion and fractional Gaussian noise by using the Bandt and Shiha [C. Bandt, F. Shiha, J. Time Ser. Anal. 28 (2007) 646] theoretical predictions for their corresponding relative frequencies. Comparisons with numerical simulations show an excellent agreement. Furthermore, the entropy-gap in the transition between these processes, observed previously via numerical results, has been here theoretically validated. Also, we have analyzed the behaviour of the permutation entropy of the fractional Gaussian noise for different time delays.

  17. Successful attack on permutation-parity-machine-based neural cryptography.

    PubMed

    Seoane, Luís F; Ruttor, Andreas

    2012-02-01

    An algorithm is presented which implements a probabilistic attack on the key-exchange protocol based on permutation parity machines. Instead of imitating the synchronization of the communicating partners, the strategy consists of a Monte Carlo method to sample the space of possible weights during inner rounds and an analytic approach to convey the extracted information from one outer round to the next one. The results show that the protocol under attack fails to synchronize faster than an eavesdropper using this algorithm.

  18. Statistical physics of the symmetric group.

    PubMed

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.

  19. Statistical physics of the symmetric group

    NASA Astrophysics Data System (ADS)

    Williams, Mobolaji

    2017-04-01

    Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.
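
    A minimal sketch of Monte Carlo sampling over the state space discussed above: states are orderings of a list, the energy counts components sitting away from their correct positions, and Metropolis moves are transpositions. The Hamiltonian is an illustrative choice, not the mean-field model analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20

def energy(perm):
    """Number of components away from their 'correct' position."""
    return int(np.sum(perm != np.arange(N)))

def metropolis(beta, n_steps=20000):
    perm = rng.permutation(N)
    E = energy(perm)
    trace = []
    for _ in range(n_steps):
        i, j = rng.integers(0, N, size=2)
        perm[i], perm[j] = perm[j], perm[i]                  # propose a transposition
        E_new = energy(perm)
        if E_new <= E or rng.uniform() < np.exp(-beta * (E_new - E)):
            E = E_new                                        # accept
        else:
            perm[i], perm[j] = perm[j], perm[i]              # reject: undo the swap
        trace.append(E)
    return np.mean(trace[n_steps // 2:])                     # average over the second half

for beta in (0.1, 1.0, 3.0):
    print(f"beta = {beta}: mean energy = {metropolis(beta):.1f}")
```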

  20. Precursor and Neutral Loss Scans in an RF Scanning Linear Quadrupole Ion Trap

    NASA Astrophysics Data System (ADS)

    Snyder, Dalton T.; Szalwinski, Lucas J.; Schrader, Robert L.; Pirro, Valentina; Hilger, Ryan; Cooks, R. Graham

    2018-03-01

    Methodology for performing precursor and neutral loss scans in an RF scanning linear quadrupole ion trap is described and compared to the unconventional ac frequency scan technique. In the RF scanning variant, precursor ions are mass selectively excited by a fixed frequency resonance excitation signal at low Mathieu q while the RF amplitude is ramped linearly to pass ions through the point of excitation such that the excited ion's m/z varies linearly with time. Ironically, a nonlinear ac frequency scan is still required for ejection of the product ions since their frequencies vary nonlinearly with the linearly varying RF amplitude. In the case of the precursor scan, the ejection frequency must be scanned so that it is fixed on a product ion m/z throughout the RF scan, whereas in the neutral loss scan, it must be scanned to maintain a constant mass offset from the excited precursor ions. Both simultaneous and sequential permutation scans are possible; only the former are demonstrated here. The scans described are performed on a variety of samples using different ionization sources: protonated amphetamine ions generated by nanoelectrospray ionization (nESI), explosives ionized by low-temperature plasma (LTP), and chemical warfare agent simulants sampled from a surface and analyzed with swab touch spray (TS). We lastly conclude that the ac frequency scan variant of these MS/MS scans is preferred due to electronic simplicity. In an accompanying manuscript, we thus describe the implementation of orthogonal double resonance precursor and neutral loss scans on the Mini 12 using constant RF voltage.

  1. Permutational symmetries for coincidence rates in multimode multiphotonic interferometry

    NASA Astrophysics Data System (ADS)

    Khalid, Abdullah; Spivak, Dylan; Sanders, Barry C.; de Guise, Hubert

    2018-06-01

    We obtain coincidence rates for passive optical interferometry by exploiting the permutational symmetries of partially distinguishable input photons, and our approach elucidates qualitative features of multiphoton coincidence landscapes. We treat the interferometer input as a product state of any number of photons in each input mode with photons distinguished by their arrival time. Detectors at the output of the interferometer count photons from each output mode over a long integration time. We generalize and prove the claim of Tillmann et al. [Phys. Rev. X 5, 041015 (2015), 10.1103/PhysRevX.5.041015] that coincidence rates can be elegantly expressed in terms of immanants. Immanants are functions of matrices that exhibit permutational symmetries and the immanants appearing in our coincidence-rate expressions share permutational symmetries with the input state. Our results are obtained by employing representation theory of the symmetric group to analyze systems of an arbitrary number of photons in arbitrarily sized interferometers.
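
    A minimal sketch of one member of the immanant family mentioned above: the permanent, built from the trivial character, which governs coincidence rates for fully indistinguishable photons. The brute-force sum over the symmetric group is only practical for small interferometers.

```python
import numpy as np
from itertools import permutations

def permanent(M: np.ndarray) -> complex:
    """Permanent of a square matrix by explicit summation over the symmetric group."""
    n = M.shape[0]
    return sum(np.prod([M[i, s[i]] for i in range(n)]) for s in permutations(range(n)))

# 3x3 submatrix of a random unitary (scattering between chosen input/output modes)
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(A)                      # random unitary via QR
print("|perm(U)|^2 =", abs(permanent(U)) ** 2)
```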

  2. Estimation of absolute solvent and solvation shell entropies via permutation reduction

    NASA Astrophysics Data System (ADS)

    Reinhard, Friedemann; Grubmüller, Helmut

    2007-01-01

    Despite its prominent contribution to the free energy of solvated macromolecules such as proteins or DNA, and although principally contained within molecular dynamics simulations, the entropy of the solvation shell is inaccessible to straightforward application of established entropy estimation methods. The complication is twofold. First, the configurational space density of such systems is too complex for a sufficiently accurate fit. Second, and in contrast to the internal macromolecular dynamics, the configurational space volume explored by the diffusive motion of the solvent molecules is too large to be exhaustively sampled by current simulation techniques. Here, we develop a method to overcome the second problem and to significantly alleviate the first one. We propose to exploit the permutation symmetry of the solvent by transforming the trajectory in a way that renders established estimation methods applicable, such as the quasiharmonic approximation or principal component analysis. Our permutation-reduced approach involves a combinatorial problem, which is solved through its equivalence with the linear assignment problem, for which O(N³) methods exist. From test simulations of dense Lennard-Jones gases, enhanced convergence and improved entropy estimates are obtained. Moreover, our approach renders diffusive systems accessible to improved fit functions.
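
    A minimal sketch of the permutation-reduction step described above, using the linear assignment solver in SciPy: within each frame, solvent molecules are relabelled onto a reference configuration so that covariance-based entropy estimators no longer have to sample the full permutation symmetry. Coordinates are random stand-ins for trajectory data.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
n_solvent, n_frames = 50, 100
reference = rng.uniform(0, 3.0, size=(n_solvent, 3))        # reference positions

relabelled = np.empty((n_frames, n_solvent, 3))
for f in range(n_frames):
    frame = reference + rng.normal(0, 0.5, size=(n_solvent, 3))   # diffusive scatter
    frame = rng.permutation(frame)                                 # arbitrary molecule labels
    cost = cdist(reference, frame) ** 2                            # squared distances
    _, frame_idx = linear_sum_assignment(cost)                     # optimal relabelling
    relabelled[f] = frame[frame_idx]

# after relabelling, each column follows one "effective" molecule, so
# covariance-based estimators (quasiharmonic, PCA) can be applied directly
flat = relabelled.reshape(n_frames, -1)
cov = np.cov(flat, rowvar=False)
print("covariance matrix shape:", cov.shape)
```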

  3. A Comparison of Techniques for Scheduling Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

    2004-01-01

    Scheduling observations by coordinated fleets of Earth Observing Satellites (EOS) involves large search spaces, complex constraints and poorly understood bottlenecks, conditions where evolutionary and related algorithms are often effective. However, there are many such algorithms and the best one to use is not clear. Here we compare multiple variants of the genetic algorithm: stochastic hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on ten realistically-sized EOS scheduling problems. Schedules are represented by a permutation (non-temporal ordering) of the observation requests. A simple deterministic scheduler assigns times and resources to each observation request in the order indicated by the permutation, discarding those that violate the constraints created by previously scheduled observations. Simulated annealing performs best. Random mutation outperforms a more 'intelligent' mutator. Furthermore, the best mutator, by a small margin, was a novel approach we call temperature dependent random sampling that makes large changes in the early stages of evolution and smaller changes towards the end of search.
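
    A minimal sketch of the search scheme described above: candidate schedules are permutations of the observation requests, a greedy deterministic scheduler turns a permutation into a feasible schedule by discarding clashing requests, and simulated annealing searches permutation space. The single "one observation per time slot" constraint is a stand-in for the real EOS resource and timing constraints.

```python
import math
import random

random.seed(0)
n_requests = 40
# each request has a few feasible time slots and a priority weight
slots = [random.sample(range(20), k=random.randint(1, 4)) for _ in range(n_requests)]
priority = [random.uniform(0.5, 2.0) for _ in range(n_requests)]

def schedule_value(perm):
    """Greedy scheduler: walk the permutation, give each request its first free slot."""
    used, value = set(), 0.0
    for r in perm:
        for s in slots[r]:
            if s not in used:
                used.add(s)
                value += priority[r]
                break                      # scheduled; conflicting requests are skipped
    return value

def anneal(n_iter=20000, t0=2.0, t1=0.01):
    perm = list(range(n_requests))
    best = cur = schedule_value(perm)
    for k in range(n_iter):
        t = t0 * (t1 / t0) ** (k / n_iter)                 # geometric cooling
        i, j = random.sample(range(n_requests), 2)
        perm[i], perm[j] = perm[j], perm[i]                 # swap two requests
        val = schedule_value(perm)
        if val >= cur or random.random() < math.exp((val - cur) / t):
            cur, best = val, max(best, val)
        else:
            perm[i], perm[j] = perm[j], perm[i]             # undo the swap
    return best

print("best schedule value found:", round(anneal(), 2))
```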

  4. Color image encryption based on color blend and chaos permutation in the reality-preserving multiple-parameter fractional Fourier transform domain

    NASA Astrophysics Data System (ADS)

    Lang, Jun

    2015-03-01

    In this paper, we propose a novel color image encryption method by using Color Blend (CB) and Chaos Permutation (CP) operations in the reality-preserving multiple-parameter fractional Fourier transform (RPMPFRFT) domain. The original color image is first exchanged and mixed randomly from the standard red-green-blue (RGB) color space to R‧G‧B‧ color space by rotating the color cube with a random angle matrix. Then the RPMPFRFT is employed to change the pixel values of the color image: the three components of the scrambled RGB color space are converted by the RPMPFRFT with three different transform pairs, respectively. Compared to transforms with complex-valued output, the RPMPFRFT ensures that the output is real, which saves image storage space and is convenient for transmission in practical applications. To further enhance the security of the encryption system, the output of the former steps is scrambled by juxtaposition of sections of the image in the reality-preserving multiple-parameter fractional Fourier domains, and the alignment of sections is determined by two coupled chaotic logistic maps. The parameters in the Color Blend, Chaos Permutation and the RPMPFRFT transform are regarded as the key in the encryption algorithm. The proposed color image encryption can also be applied to encrypt three gray images by transforming the gray images into three RGB color components of a specially constructed color image. Numerical simulations are performed to demonstrate that the proposed algorithm is feasible, secure, sensitive to keys and robust to noise attack and data loss.
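
    A minimal sketch of the Chaos Permutation ingredient described above: a logistic map seeded by the key generates a chaotic orbit whose sort order defines a pixel permutation, and descrambling inverts that permutation. The colour blending and the reality-preserving multiple-parameter FRFT stages are not reproduced here.

```python
import numpy as np

def logistic_permutation(n, x0=0.3731, mu=3.9999):
    """Permutation of range(n) from the sort order of a logistic-map orbit."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = mu * x[i - 1] * (1.0 - x[i - 1])
    return np.argsort(x)

def scramble(img, key=(0.3731, 3.9999)):
    flat = img.reshape(-1, img.shape[-1])
    perm = logistic_permutation(flat.shape[0], *key)
    return flat[perm].reshape(img.shape), perm

def unscramble(scrambled, perm):
    flat = scrambled.reshape(-1, scrambled.shape[-1])
    out = np.empty_like(flat)
    out[perm] = flat                                      # invert the permutation
    return out.reshape(scrambled.shape)

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
enc, perm = scramble(image)
assert np.array_equal(unscramble(enc, perm), image)
```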

  5. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from the next generation sequencing technology based studies. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.

  6. A space-time scan statistic for detecting emerging outbreaks.

    PubMed

    Tango, Toshiro; Takahashi, Kunihiko; Kohriyama, Kazuaki

    2011-03-01

    As a major analytical method for outbreak detection, Kulldorff's space-time scan statistic (2001, Journal of the Royal Statistical Society, Series A 164, 61-72) has been implemented in many syndromic surveillance systems. Since, however, it is based on circular windows in space, it has difficulty correctly detecting actual noncircular clusters. Takahashi et al. (2008, International Journal of Health Geographics 7, 14) proposed a flexible space-time scan statistic with the capability of detecting noncircular areas. It seems to us, however, that the detection of the most likely cluster defined in these space-time scan statistics is not the same as the detection of localized emerging disease outbreaks because the former compares the observed number of cases with the conditional expected number of cases. In this article, we propose a new space-time scan statistic which compares the observed number of cases with the unconditional expected number of cases, takes a time-to-time variation of Poisson mean into account, and implements an outbreak model to capture localized emerging disease outbreaks more timely and correctly. The proposed models are illustrated with data from weekly surveillance of the number of absentees in primary schools in Kitakyushu-shi, Japan, 2006. © 2010, The International Biometric Society.

  7. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs either at par or better than the state-of-the art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
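
    A minimal sketch of the permutation-test idea described above, reduced to a single candidate change point: running correlations are computed in sliding windows, the best split is the one minimizing within-phase variance, and the observed variance drop is compared against time-permuted data. The full KCP method uses a kernel-based criterion and handles a general number of change points.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
x = rng.normal(size=(T, 2))
x[100:, 1] = 0.8 * x[100:, 0] + 0.6 * rng.normal(size=100)   # correlation jumps at t = 100

def running_corr(data, w=20):
    """Correlation of the two variables in sliding windows of length w."""
    return np.array([np.corrcoef(data[t:t + w].T)[0, 1] for t in range(len(data) - w)])

def variance_drop(r):
    """Drop in (weighted) within-phase variance when the best single split is allowed."""
    total = np.var(r)
    best = min(len(r[:c]) / len(r) * np.var(r[:c]) + len(r[c:]) / len(r) * np.var(r[c:])
               for c in range(10, len(r) - 10))
    return total - best

drop_obs = variance_drop(running_corr(x))

# permute time points to destroy change points while keeping the overall correlation
n_perm = 200
null = np.array([variance_drop(running_corr(x[rng.permutation(T)])) for _ in range(n_perm)])
p = (1 + np.sum(null >= drop_obs)) / (n_perm + 1)
print(f"variance drop = {drop_obs:.4f}, permutation p = {p:.3f}")
```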

  8. Longitudinal stability of MRI for mapping brain change using tensor-based morphometry.

    PubMed

    Leow, Alex D; Klunder, Andrea D; Jack, Clifford R; Toga, Arthur W; Dale, Anders M; Bernstein, Matt A; Britson, Paula J; Gunter, Jeffrey L; Ward, Chadwick P; Whitwell, Jennifer L; Borowski, Bret J; Fleisher, Adam S; Fox, Nick C; Harvey, Danielle; Kornak, John; Schuff, Norbert; Studholme, Colin; Alexander, Gene E; Weiner, Michael W; Thompson, Paul M

    2006-06-01

    Measures of brain change can be computed from sequential MRI scans, providing valuable information on disease progression, e.g., for patient monitoring and drug trials. Tensor-based morphometry (TBM) creates maps of these brain changes, visualizing the 3D profile and rates of tissue growth or atrophy, but its sensitivity depends on the contrast and geometric stability of the images. As part of the Alzheimer's Disease Neuroimaging Initiative (ADNI), 17 normal elderly subjects were scanned twice (at a 2-week interval) with several 3D 1.5 T MRI pulse sequences: high and low flip angle SPGR/FLASH (from which Synthetic T1 images were generated), MP-RAGE, IR-SPGR (N = 10) and MEDIC (N = 7) scans. For each subject and scan type, a 3D deformation map aligned baseline and follow-up scans, computed with a nonlinear, inverse-consistent elastic registration algorithm. Voxelwise statistics, in ICBM stereotaxic space, visualized the profile of mean absolute change and its cross-subject variance; these maps were then compared using permutation testing. Image stability depended on: (1) the pulse sequence; (2) the transmit/receive coil type (birdcage versus phased array); (3) spatial distortion corrections (using MEDIC sequence information); (4) B1-field intensity inhomogeneity correction (using N3). SPGR/FLASH images acquired using a birdcage coil had least overall deviation. N3 correction reduced coil type and pulse sequence differences and improved scan reproducibility, except for Synthetic T1 images (which were intrinsically corrected for B1-inhomogeneity). No strong evidence favored B0 correction. Although SPGR/FLASH images showed least deviation here, pulse sequence selection for the ADNI project was based on multiple additional image analyses, to be reported elsewhere.

  9. Longitudinal stability of MRI for mapping brain change using tensor-based morphometry

    PubMed Central

    Leow, Alex D.; Klunder, Andrea D.; Jack, Clifford R.; Toga, Arthur W.; Dale, Anders M.; Bernstein, Matt A.; Britson, Paula J.; Gunter, Jeffrey L.; Ward, Chadwick P.; Whitwell, Jennifer L.; Borowski, Bret J.; Fleisher, Adam S.; Fox, Nick C.; Harvey, Danielle; Kornak, John; Schuff, Norbert; Studholme, Colin; Alexander, Gene E.; Weiner, Michael W.; Thompson, Paul M.

    2007-01-01

    Measures of brain change can be computed from sequential MRI scans, providing valuable information on disease progression, e.g., for patient monitoring and drug trials. Tensor-based morphometry (TBM) creates maps of these brain changes, visualizing the 3D profile and rates of tissue growth or atrophy, but its sensitivity depends on the contrast and geometric stability of the images. As part of the Alzheimer’s Disease Neuroimaging Initiative (ADNI), 17 normal elderly subjects were scanned twice (at a 2-week interval) with several 3D 1.5 T MRI pulse sequences: high and low flip angle SPGR/FLASH (from which Synthetic T1 images were generated), MP-RAGE, IR-SPGR (N = 10) and MEDIC (N = 7) scans. For each subject and scan type, a 3D deformation map aligned baseline and follow-up scans, computed with a nonlinear, inverse-consistent elastic registration algorithm. Voxelwise statistics, in ICBM stereotaxic space, visualized the profile of mean absolute change and its cross-subject variance; these maps were then compared using permutation testing. Image stability depended on: (1) the pulse sequence; (2) the transmit/receive coil type (birdcage versus phased array); (3) spatial distortion corrections (using MEDIC sequence information); (4) B1-field intensity inhomogeneity correction (using N3). SPGR/FLASH images acquired using a birdcage coil had least overall deviation. N3 correction reduced coil type and pulse sequence differences and improved scan reproducibility, except for Synthetic T1 images (which were intrinsically corrected for B1-inhomogeneity). No strong evidence favored B0 correction. Although SPGR/FLASH images showed least deviation here, pulse sequence selection for the ADNI project was based on multiple additional image analyses, to be reported elsewhere. PMID:16480900

  10. A faster 1.375-approximation algorithm for sorting by transpositions.

    PubMed

    Cunha, Luís Felipe I; Kowada, Luis Antonio B; Hausen, Rodrigo de A; de Figueiredo, Celina M H

    2015-11-01

    Sorting by Transpositions is an NP-hard problem for which several polynomial-time approximation algorithms have been developed. Hartman and Shamir (2006) developed a 1.5-approximation algorithm, whose running time was improved to O(n log n) by Feng and Zhu (2007) with a data structure they defined, the permutation tree. Elias and Hartman (2006) developed a 1.375-approximation O(n²) algorithm, and Firoz et al. (2011) claimed an improvement to the running time, from O(n²) to O(n log n), by using the permutation tree. We provide counter-examples to the correctness of Firoz et al.'s strategy, showing that it is not possible to reach a component by sufficient extensions using the method proposed by them. In addition, we propose a 1.375-approximation algorithm, modifying Elias and Hartman's approach with the use of permutation trees and achieving O(n log n) time.

  11. Hippocampal Structure and Human Cognition: Key Role of Spatial Processing and Evidence Supporting the Efficiency Hypothesis in Females

    ERIC Educational Resources Information Center

    Colom, Roberto; Stein, Jason L.; Rajagopalan, Priya; Martinez, Kenia; Hermel, David; Wang, Yalin; Alvarez-Linera, Juan; Burgaleta, Miguel; Quiroga, Ma. Angeles; Shih, Pei Chun; Thompson, Paul M.

    2013-01-01

    Here we apply a method for automated segmentation of the hippocampus in 3D high-resolution structural brain MRI scans. One hundred and four healthy young adults completed twenty one tasks measuring abstract, verbal, and spatial intelligence, along with working memory, executive control, attention, and processing speed. After permutation tests…

  12. EXPLICIT SYMPLECTIC-LIKE INTEGRATORS WITH MIDPOINT PERMUTATIONS FOR SPINNING COMPACT BINARIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Junjie; Wu, Xin; Huang, Guoqing

    2017-01-01

    We refine the recently developed fourth-order extended phase space explicit symplectic-like methods for inseparable Hamiltonians using Yoshida’s triple product combined with a midpoint permuted map. The midpoint between the original variables and their corresponding extended variables at every integration step is readjusted as the initial values of the original variables and their corresponding extended ones at the next step integration. The triple-product construction is apparently superior to the composition of two triple products in computational efficiency. Above all, the new midpoint permutations are more effective in restraining the equality of the original variables and their corresponding extended ones at each integration step than the existing sequent permutations of momenta and coordinates. As a result, our new construction shares the benefit of implicit symplectic integrators in the conservation of the second post-Newtonian Hamiltonian of spinning compact binaries. Especially for the chaotic case, it can work well, but the existing sequent permuted algorithm cannot. When dissipative effects from the gravitational radiation reaction are included, the new symplectic-like method has a secular drift in the energy error of the dissipative system for the orbits that are regular in the absence of radiation, as an implicit symplectic integrator does. In spite of this, it is superior to the same-order implicit symplectic integrator in accuracy and efficiency. The new method is particularly useful in discussing the long-term evolution of inseparable Hamiltonian problems.

  13. Laboratory-Based Prospective Surveillance for Community Outbreaks of Shigella spp. in Argentina

    PubMed Central

    Viñas, María R.; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P.; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I.; Kulldorff, Martin; Galas, Marcelo

    2013-01-01

    Background To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. Methodology To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed field gel electrophoresis (PFGE) using PulseNet protocols. Principal Findings In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters and events important for their duration or size, which were reported to local public health authorities. Conclusions/Significance The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks. PMID:24349586

  14. Laboratory-based prospective surveillance for community outbreaks of Shigella spp. in Argentina.

    PubMed

    Viñas, María R; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I; Kulldorff, Martin; Galas, Marcelo

    2013-01-01

    To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed field gel electrophoresis (PFGE) using PulseNet protocols. In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters and events important for their duration or size, which were reported to local public health authorities. The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks.

  15. Physical Connectivity Mapping by Circular Permutation of Human Telomerase RNA Reveals New Regions Critical for Activity and Processivity.

    PubMed

    Mefford, Melissa A; Zappulla, David C

    2016-01-15

    Telomerase is a specialized ribonucleoprotein complex that extends the 3' ends of chromosomes to counteract telomere shortening. However, increased telomerase activity is associated with ∼90% of human cancers. The telomerase enzyme minimally requires an RNA (hTR) and a specialized reverse transcriptase protein (TERT) for activity in vitro. Understanding the structure-function relationships within hTR has important implications for human disease. For the first time, we have tested the physical-connectivity requirements in the 451-nucleotide hTR RNA using circular permutations, which reposition the 5' and 3' ends. Our extensive in vitro analysis identified three classes of hTR circular permutants with altered function. First, circularly permuting 3' of the template causes specific defects in repeat-addition processivity, revealing that the template recognition element found in ciliates is conserved in human telomerase RNA. Second, seven circular permutations residing within the catalytically important core and CR4/5 domains completely abolish telomerase activity, unveiling mechanistically critical portions of these domains. Third, several circular permutations between the core and CR4/5 significantly increase telomerase activity. Our extensive circular permutation results provide insights into the architecture and coordination of human telomerase RNA and highlight where the RNA could be targeted for the development of antiaging and anticancer therapeutics. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  16. Cipher image damage and decisions in real time

    NASA Astrophysics Data System (ADS)

    Silva-García, Victor Manuel; Flores-Carapia, Rolando; Rentería-Márquez, Carlos; Luna-Benoso, Benjamín; Jiménez-Vázquez, Cesar Antonio; González-Ramírez, Marlon David

    2015-01-01

    This paper proposes a method for constructing permutations on arrangements of m positions. Our objective is to encrypt color images using the Advanced Encryption Standard (AES) with variable permutations, meaning a different permutation for each 128-bit block in the first round after the x-or operation is applied. Furthermore, this research offers the possibility of recovering the original image when the encrypted image has been damaged, whether by an attack or not. This is achieved by permuting the original image pixel positions before encryption with AES variable permutations, which means building pseudorandom permutations of arrays of 250,000 positions or more. To this end, an algorithm that defines a bijective function between the set of nonnegative integers and the set of permutations is built. From this algorithm, the way to build permutations on the array 0,1,…,m-1, knowing m-1 constants, is presented. Transcendental numbers are used to select these m-1 constants in a pseudorandom way. The quality of the proposed encryption is evaluated according to the following criteria: the correlation coefficient, the entropy, and the discrete Fourier transform. A goodness-of-fit test for each basic color image is proposed to measure the degree of randomness of the bits of the encrypted image. Moreover, cipher images are obtained in a lossless way, i.e., no JPEG file formats are used.
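
    The abstract mentions a bijection between nonnegative integers and permutations used to build position permutations. As a hedged illustration of what such a bijection can look like, the sketch below unranks an integer into a permutation via the factorial number system (Lehmer code); this is a standard construction and not necessarily the one the authors define from their m-1 transcendental-number constants.

    ```python
    # One standard integer-to-permutation bijection (factorial number system /
    # Lehmer code); shown only to fix ideas, not the paper's own construction.

    def unrank_permutation(rank, m):
        """Return the permutation of 0..m-1 with the given rank (0 <= rank < m!)."""
        elements = list(range(m))
        perm = []
        for i in range(m, 0, -1):
            fact = 1
            for j in range(1, i):
                fact *= j                    # (i-1)!
            idx, rank = divmod(rank, fact)   # which remaining element comes next
            perm.append(elements.pop(idx))
        return perm

    print(unrank_permutation(0, 4))      # [0, 1, 2, 3]  (identity)
    print(unrank_permutation(23, 4))     # [3, 2, 1, 0]  (last of the 24 permutations)
    ```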

  17. Physical Connectivity Mapping by Circular Permutation of Human Telomerase RNA Reveals New Regions Critical for Activity and Processivity

    PubMed Central

    Mefford, Melissa A.

    2015-01-01

    Telomerase is a specialized ribonucleoprotein complex that extends the 3′ ends of chromosomes to counteract telomere shortening. However, increased telomerase activity is associated with ∼90% of human cancers. The telomerase enzyme minimally requires an RNA (hTR) and a specialized reverse transcriptase protein (TERT) for activity in vitro. Understanding the structure-function relationships within hTR has important implications for human disease. For the first time, we have tested the physical-connectivity requirements in the 451-nucleotide hTR RNA using circular permutations, which reposition the 5′ and 3′ ends. Our extensive in vitro analysis identified three classes of hTR circular permutants with altered function. First, circularly permuting 3′ of the template causes specific defects in repeat-addition processivity, revealing that the template recognition element found in ciliates is conserved in human telomerase RNA. Second, seven circular permutations residing within the catalytically important core and CR4/5 domains completely abolish telomerase activity, unveiling mechanistically critical portions of these domains. Third, several circular permutations between the core and CR4/5 significantly increase telomerase activity. Our extensive circular permutation results provide insights into the architecture and coordination of human telomerase RNA and highlight where the RNA could be targeted for the development of antiaging and anticancer therapeutics. PMID:26503788

  18. Evaluation of morphological changes in the adult skull with age and sex.

    PubMed

    Urban, Jillian E; Weaver, Ashley A; Lillie, Elizabeth M; Maldjian, Joseph A; Whitlow, Christopher T; Stitzel, Joel D

    2016-12-01

    The morphology of the brain and skull are important in the evaluation of the aging human; however, little is known about how the skull may change with age. The objective of this study was to evaluate the morphological changes of the adult skull using three-dimensional geometric morphometric analysis of thousands of landmarks with the focus on anatomic regions that may be correlated with brain atrophy and head injury. Computed tomography data were collected between ages 20 and 100. Each scan was segmented using thresholding techniques. An atlas image of a 50th percentile skull was registered to each subject scan by computing a series of rigid, affine, and non-linear transformations between atlas space and subject space. Landmarks on the atlas skull were transformed to each subject and partitioned into the inner and outer cranial vault and the cranial fossae. A generalized Procrustes analysis was completed for the landmark sets. The coordinate locations describing the shape of each region were regressed with age to generate a model predicting the landmark location with age. Permutation testing was performed to assess significant changes with age. For the males, all anatomic regions reveal significant changes in shape with age except for the posterior cranial fossa. For the females, only the middle cranial fossa and anterior cranial fossa were found to change significantly in shape. Results of this study are important for understanding the adult skull and how shape changes may pertain to brain atrophy, aging, and injury. © 2014 Anatomical Society.

  19. Permutation approach, high frequency trading and variety of micro patterns in financial time series

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Cina; Ebrahimian, Mehran; Tahmooresi, Hamed

    2014-11-01

    A permutation approach is suggested as a method to investigate financial time series on micro scales. The method is used to see how high-frequency trading in recent years has affected the micro patterns which may be seen in financial time series. Tick-to-tick exchange rates are considered as examples. It is seen that a variety of patterns evolves through time, and that the scale over which the target markets have no dominant patterns has decreased steadily over time with the emergence of higher-frequency trading.

  20. Weighted multiscale Rényi permutation entropy of nonlinear time series

    NASA Astrophysics Data System (ADS)

    Chen, Shijian; Shang, Pengjian; Wu, Yue

    2018-04-01

    In this paper, based on Rényi permutation entropy (RPE), which has been recently suggested as a relative measure of complexity in nonlinear systems, we propose multiscale Rényi permutation entropy (MRPE) and weighted multiscale Rényi permutation entropy (WMRPE) to quantify the complexity of nonlinear time series over multiple time scales. First, we apply MRPE and WMRPE to synthetic data and compare the modified methods with RPE. Meanwhile, the influence of changing the parameters is discussed. Besides, we explain the necessity of considering not only multiple scales but also weights that take the amplitude into account. Then the MRPE and WMRPE methods are applied to the closing prices of financial stock markets from different areas. By observing the curves of WMRPE and analyzing the common statistics, the stock markets are divided into 4 groups: (1) DJI, S&P500, and HSI, (2) NASDAQ and FTSE100, (3) DAX40 and CAC40, and (4) ShangZheng and ShenCheng. Results show that the standard deviations of the weighted methods are smaller, showing that WMRPE yields more robust results. Besides, WMRPE can provide abundant dynamical properties of complex systems and demonstrate the intrinsic mechanism.
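
    As background for the variants proposed above, the sketch below computes a plain single-scale Rényi permutation entropy from ordinal patterns, without the amplitude weighting or the coarse-graining over multiple scales; parameter names are generic.

    ```python
    # Single-scale, unweighted Rényi permutation entropy from ordinal patterns;
    # the multiscale and weighted refinements of the paper are not reproduced.
    import math
    from collections import Counter

    def renyi_permutation_entropy(series, order=3, alpha=2.0):
        patterns = Counter()
        for i in range(len(series) - order + 1):
            window = series[i:i + order]
            # ordinal pattern = ranking of the values inside the window
            patterns[tuple(sorted(range(order), key=lambda k: window[k]))] += 1
        total = sum(patterns.values())
        probs = [c / total for c in patterns.values()]
        if alpha == 1.0:                                    # Shannon limit
            h = -sum(p * math.log(p) for p in probs)
        else:
            h = math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)
        return h / math.log(math.factorial(order))          # normalise to [0, 1]

    print(renyi_permutation_entropy([4, 7, 9, 10, 6, 11, 3, 5, 8], order=3, alpha=2.0))
    ```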

  1. Confidence intervals and hypothesis testing for the Permutation Entropy with an application to epilepsy

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; O. Redelico, Francisco

    2018-04-01

    In nonlinear dynamics, and to a lesser extent in other fields, a widely used measure of complexity is the Permutation Entropy. But there is still no known method to determine the accuracy of this measure. There has been little research on the statistical properties of this quantity that characterizes time series. The literature describes some resampling methods for quantities used in nonlinear dynamics - such as the largest Lyapunov exponent - but these seem to fail. In this contribution, we propose a parametric bootstrap methodology using a symbolic representation of the time series to obtain the distribution of the Permutation Entropy estimator. We perform several time series simulations given by well-known stochastic processes: the 1/f^α noise family, and show in each case that the proposed accuracy measure is as efficient as the one obtained by the frequentist approach of repeating the experiment. The complexity of brain electrical activity, measured by the Permutation Entropy, has been extensively used in epilepsy research for detecting dynamical changes in the electroencephalogram (EEG) signal with no consideration of the variability of this complexity measure. An application of the parametric bootstrap methodology is used to compare normal and pre-ictal EEG signals.

  2. Exploiting Lipid Permutation Symmetry to Compute Membrane Remodeling Free Energies.

    PubMed

    Bubnis, Greg; Risselada, Herre Jelger; Grubmüller, Helmut

    2016-10-28

    A complete physical description of membrane remodeling processes, such as fusion or fission, requires knowledge of the underlying free energy landscapes, particularly in barrier regions involving collective shape changes, topological transitions, and high curvature, where Canham-Helfrich (CH) continuum descriptions may fail. To calculate these free energies using atomistic simulations, one must address not only the sampling problem due to high free energy barriers, but also an orthogonal sampling problem of combinatorial complexity stemming from the permutation symmetry of identical lipids. Here, we solve the combinatorial problem with a permutation reduction scheme to map a structural ensemble into a compact, nondegenerate subregion of configuration space, thereby permitting straightforward free energy calculations via umbrella sampling. We applied this approach, using a coarse-grained lipid model, to test the CH description of bending and found sharp increases in the bending modulus for curvature radii below 10 nm. These deviations suggest that an anharmonic bending term may be required for CH models to give quantitative energetics of highly curved states.

  3. A permutationally invariant full-dimensional ab initio potential energy surface for the abstraction and exchange channels of the H + CH₄ system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jun, E-mail: jli15@cqu.edu.cn, E-mail: zhangdh@dicp.ac.cn; Department of Chemistry and Chemical Biology, University of New Mexico, Albuquerque, New Mexico 87131; Chen, Jun

    2015-05-28

    We report a permutationally invariant global potential energy surface (PES) for the H + CH₄ system based on ∼63 000 data points calculated at a high ab initio level (UCCSD(T)-F12a/AVTZ) using the recently proposed permutation invariant polynomial-neural network method. The small fitting error (5.1 meV) indicates a faithful representation of the ab initio points over a large configuration space. The rate coefficients calculated on the PES using tunneling corrected transition-state theory and quasi-classical trajectory are found to agree well with the available experimental and previous quantum dynamical results. The calculated total reaction probabilities (J_tot = 0), including the abstraction and exchange channels, obtained on the new potential with a reduced-dimensional quantum dynamics method are essentially the same as those on the Xu-Chen-Zhang PES [Chin. J. Chem. Phys. 27, 373 (2014)].

  4. Using permutations to detect dependence between time series

    NASA Astrophysics Data System (ADS)

    Cánovas, Jose S.; Guillamón, Antonio; Ruíz, María del Carmen

    2011-07-01

    In this paper, we propose an independence test between two time series which is based on permutations. The proposed test can be carried out by means of different common statistics such as Pearson’s chi-square or the likelihood ratio. We also point out why an exact test is necessary. Simulated and real data (return exchange rates between several currencies) reveal the capacity of this test to detect linear and nonlinear dependences.
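
    The sketch below only illustrates the data layout such a test works with: both series are mapped to ordinal (permutation) patterns and the paired patterns are tabulated in a contingency table. It then applies SciPy's asymptotic chi-square contingency test, whereas the paper argues that an exact test is needed; the data and parameter choices are illustrative.

    ```python
    # Illustrative layout only: ordinal patterns of two series paired in a
    # contingency table, tested with the asymptotic chi-square (the paper uses
    # an exact test instead).
    import numpy as np
    from scipy.stats import chi2_contingency

    def ordinal_patterns(series, order=3):
        return [tuple(np.argsort(series[i:i + order])) for i in range(len(series) - order + 1)]

    def pattern_contingency_test(x, y, order=3):
        px, py = ordinal_patterns(x, order), ordinal_patterns(y, order)
        labels = sorted(set(px) | set(py))
        index = {pat: k for k, pat in enumerate(labels)}
        table = np.zeros((len(labels), len(labels)))
        for a, b in zip(px, py):
            table[index[a], index[b]] += 1
        table = table[table.sum(axis=1) > 0][:, table.sum(axis=0) > 0]  # drop empty rows/columns
        chi2, p, _, _ = chi2_contingency(table)
        return chi2, p

    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    y = 0.8 * x + 0.2 * rng.normal(size=500)   # dependent by construction
    print(pattern_contingency_test(x, y))
    ```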

  5. Spatio-temporal cluster detection of chickenpox in Valencia, Spain in the period 2008-2012.

    PubMed

    Iftimi, Adina; Martínez-Ruiz, Francisco; Míguez Santiyán, Ana; Montes, Francisco

    2015-05-18

    Chickenpox is a highly contagious airborne disease caused by Varicella zoster, which affects nearly all non-immune children worldwide with an annual incidence estimated at 80-90 million cases. To analyze the spatiotemporal pattern of the chickenpox incidence in the city of Valencia, Spain two complementary statistical approaches were used. First, we evaluated the existence of clusters and spatio-temporal interaction; secondly, we used this information to find the locations of the spatio-temporal clusters via the space-time permutation model. The first method used detects any aggregation in our data but does not provide the spatial and temporal information. The second method gives the locations, areas and time-frame for the spatio-temporal clusters. An overall decreasing time trend, a pronounced 12-monthly periodicity and two complementary periods were observed. Several areas with high incidence, surrounding the center of the city were identified. The existence of aggregation in time and space was observed, and a number of spatio-temporal clusters were located.

  6. Timely detection of localized excess influenza activity in Northern California across patient care, prescription, and laboratory data.

    PubMed

    Greene, Sharon K; Kulldorff, Martin; Huang, Jie; Brand, Richard J; Kleinman, Kenneth P; Hsu, John; Platt, Richard

    2011-02-28

    Timely detection of clusters of localized influenza activity in excess of background seasonal levels could improve situational awareness for public health officials and health systems. However, no single data type may capture influenza activity with optimal sensitivity, specificity, and timeliness, and it is unknown which data types could be most useful for surveillance. We compared the performance of 10 types of electronic clinical data for timely detection of influenza clusters throughout the 2007/08 influenza season in northern California. Kaiser Permanente Northern California generated zip code-specific daily episode counts for: influenza-like illness (ILI) diagnoses in ambulatory care (AC) and emergency departments (ED), both with and without regard to fever; hospital admissions and discharges for pneumonia and influenza; antiviral drugs dispensed (Rx); influenza laboratory tests ordered (Tests); and tests positive for influenza type A (FluA) and type B (FluB). Four credible events of localized excess illness were identified. Prospective surveillance was mimicked within each data stream using a space-time permutation scan statistic, analyzing only data available as of each day, to evaluate the ability and timeliness to detect the credible events. AC without fever and Tests signaled during all four events and, along with Rx, had the most timely signals. FluA had less timely signals. ED, hospitalizations, and FluB did not signal reliably. When fever was included in the ILI definition, signals were either delayed or missed. Although limited to one health plan, location, and year, these results can inform the choice of data streams for public health surveillance of influenza. Copyright © 2010 John Wiley & Sons, Ltd.

  7. Inferring the Presence of Reverse Proxies Through Timing Analysis

    DTIC Science & Technology

    2015-06-01

    [Extraction fragments from the report: Figure 3.2, "The three different instances of timing measurement configurations"; Figure 3.3, "Permutation of a web request iteration"; a table of city latitude/longitude pairs. Recoverable text: "Their data showed that they could detect at least 6 bits of entropy between unlike devices and that it was enough to determine that they are in fact…"; "…depending on the permutation being executed so that every iteration was conducted under the same distance."]

  8. Generalized permutation entropy analysis based on the two-index entropic form S_{q,δ}

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian

    2015-05-01

    Permutation entropy (PE) is a novel measure to quantify the complexity of nonlinear time series. In this paper, we propose a generalized permutation entropy (PE_{q,δ}) based on the recently postulated entropic form S_{q,δ}, which was proposed as a unification of the well-known S_q of nonextensive statistical mechanics and S_δ, a possibly appropriate candidate for the black-hole entropy. We find that PE_{q,δ} with appropriate parameters can amplify minor changes and trends of complexity in comparison to PE. Experiments with this generalized permutation entropy method are performed on both synthetic and stock data, showing its power. Results show that PE_{q,δ} is an exponential function of q and that the power k(δ) is a constant once δ is fixed. Some discussion of k(δ) is provided. Besides, we also find some interesting results about power laws.

  9. Space-Time Cluster Analysis to Detect Innovative Clinical Practices: A Case Study of Aripiprazole in the Department of Veterans Affairs.

    PubMed

    Penfold, Robert B; Burgess, James F; Lee, Austin F; Li, Mingfei; Miller, Christopher J; Nealon Seibert, Marjorie; Semla, Todd P; Mohr, David C; Kazis, Lewis E; Bauer, Mark S

    2018-02-01

    To identify space-time clusters of changes in prescribing aripiprazole for bipolar disorder among providers in the VA. VA administrative data from 2002 to 2010 were used to identify prescriptions of aripiprazole for bipolar disorder. Prescriber characteristics were obtained using the Personnel and Accounting Integrated Database. We conducted a retrospective space-time cluster analysis using the space-time permutation statistic. All VA service users with a diagnosis of bipolar disorder were included in the patient population. Individuals with any schizophrenia spectrum diagnoses were excluded. We also identified all clinicians who wrote a prescription for any bipolar disorder medication. The study population included 32,630 prescribers. Of these, 8,643 wrote qualifying prescriptions. We identified three clusters of aripiprazole prescribing centered in Massachusetts, Ohio, and the Pacific Northwest. Clusters were associated with prescribing by VA-employed (vs. contracted) prescribers. Nurses with prescribing privileges were more likely to make a prescription for aripiprazole in cluster locations compared with psychiatrists. Primary care physicians were less likely. Early prescribing of aripiprazole for bipolar disorder clustered geographically and was associated with prescriber subgroups. These methods support prospective surveillance of practice changes and identification of associated health system characteristics. © Health Research and Educational Trust.

  10. Unifying the rotational and permutation symmetry of nuclear spin states: Schur-Weyl duality in molecular physics.

    PubMed

    Schmiedt, Hanno; Jensen, Per; Schlemmer, Stephan

    2016-08-21

    In modern physics and chemistry concerned with many-body systems, one of the mainstays is identical-particle-permutation symmetry. In particular, both the intra-molecular dynamics of a single molecule and the inter-molecular dynamics associated, for example, with reactive molecular collisions are strongly affected by selection rules originating in nuclear-permutation symmetry operations being applied to the total internal wavefunctions, including nuclear spin, of the molecules involved. We propose here a general tool to determine coherently the permutation symmetry and the rotational symmetry (associated with the group of arbitrary rotations of the entire molecule in space) of molecular wavefunctions, in particular the nuclear-spin functions. Thus far, these two symmetries were believed to be mutually independent and it has even been argued that under certain circumstances, it is impossible to establish a one-to-one correspondence between them. However, using the Schur-Weyl duality theorem we show that the two types of symmetry are inherently coupled. In addition, we use the ingenious representation-theory technique of Young tableaus to represent the molecular nuclear-spin degrees of freedom in terms of well-defined mathematical objects. This simplifies the symmetry classification of the nuclear wavefunction even for large molecules. Also, the application to reactive collisions is very straightforward and provides a much simplified approach to obtaining selection rules.

  11. Unifying the rotational and permutation symmetry of nuclear spin states: Schur-Weyl duality in molecular physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmiedt, Hanno; Schlemmer, Stephan; Jensen, Per, E-mail: jensen@uni-wuppertal.de

    In modern physics and chemistry concerned with many-body systems, one of the mainstays is identical-particle-permutation symmetry. In particular, both the intra-molecular dynamics of a single molecule and the inter-molecular dynamics associated, for example, with reactive molecular collisions are strongly affected by selection rules originating in nuclear-permutation symmetry operations being applied to the total internal wavefunctions, including nuclear spin, of the molecules involved. We propose here a general tool to determine coherently the permutation symmetry and the rotational symmetry (associated with the group of arbitrary rotations of the entire molecule in space) of molecular wavefunctions, in particular the nuclear-spin functions. Thus far, these two symmetries were believed to be mutually independent and it has even been argued that under certain circumstances, it is impossible to establish a one-to-one correspondence between them. However, using the Schur-Weyl duality theorem we show that the two types of symmetry are inherently coupled. In addition, we use the ingenious representation-theory technique of Young tableaus to represent the molecular nuclear-spin degrees of freedom in terms of well-defined mathematical objects. This simplifies the symmetry classification of the nuclear wavefunction even for large molecules. Also, the application to reactive collisions is very straightforward and provides a much simplified approach to obtaining selection rules.

  12. Spatial Epidemiology and Risk Factor Analysis of White Spot Disease in the Shrimp Farming Industry of Sinaloa, Mexico, from 2005 to 2011.

    PubMed

    Muniesa, A; Mardones, F O; Chávez, M C; Montoya, L; Cabanillas, J A; de Blas, I; Martínez-López, B

    2017-10-01

    White spot disease (WSD), caused by the white spot syndrome virus, is currently one of the primary causes of mortality and economic losses in the shrimp farming industry worldwide. In Mexico, shrimp production is one of the most important primary activities, generating an annual income of USD 711 million. However, WSD introduction in 1999 had a devastating impact on the Mexican shrimp industry. The aim of this study was to characterize the WSD spatio-temporal patterns and to identify the primary risk factors contributing to WSD occurrence from 2005 to 2011 in Sinaloa, Mexico. We used data collected by the 'Comité Estatal de Sanidad Acuícola de Sinaloa' from 2005 to 2011 regarding WSD outbreaks as well as environmental, production and husbandry factors at farm level. The spatio-temporal patterns of WSD were described using space-time scan statistics. The effect of 52 variables on the time to WSD outbreak occurrence was assessed using a multivariable Cox proportional hazards model. Results reveal that WSD risk and survival time were not homogeneously distributed, as suggested by the significant clusters obtained using the space-time permutation model and the space-time exponential model, respectively. The Cox model revealed that the first production cycle [hazard ratio (HR) = 11.31], changes from 1 to 1.4°C of temperature oscillation caused by 'El Niño'/'La Niña' events (HR = 1.44) and high average daily growths (HR = 1.26) were significantly associated with lower survival (i.e. shorter time to WSD outbreak) on farm. Conversely, shrimp weight at the moment of the outbreak (HR = 0.159), changes from -0.9 to -0.5°C of temperature oscillation caused by 'El Niño'/'La Niña' events (HR = 0.540), high surface water temperature during pond stocking (HR = 0.823) and a high (>100) number of days of culture (HR = 0.830) were factors associated with higher survival. Results are expected to inform the design of risk-based intervention strategies to minimize the impact of WSD in Mexico. © 2016 Blackwell Verlag GmbH.

  13. Sorting signed permutations by short operations.

    PubMed

    Galvão, Gustavo Rodrigues; Lee, Orlando; Dias, Zanoni

    2015-01-01

    During evolution, global mutations may alter the order and the orientation of the genes in a genome. Such mutations are referred to as rearrangement events, or simply operations. In unichromosomal genomes, the most common operations are reversals, which are responsible for reversing the order and orientation of a sequence of genes, and transpositions, which are responsible for switching the location of two contiguous portions of a genome. The problem of computing the minimum sequence of operations that transforms one genome into another - which is equivalent to the problem of sorting a permutation into the identity permutation - is a well-studied problem that finds application in comparative genomics. There are a number of works concerning this problem in the literature, but they generally do not take into account the length of the operations (i.e. the number of genes affected by the operations). Since it has been observed that short operations are prevalent in the evolution of some species, algorithms that efficiently solve this problem in the special case of short operations are of interest. In this paper, we investigate the problem of sorting a signed permutation by short operations. More precisely, we study four flavors of this problem: (i) the problem of sorting a signed permutation by reversals of length at most 2; (ii) the problem of sorting a signed permutation by reversals of length at most 3; (iii) the problem of sorting a signed permutation by reversals and transpositions of length at most 2; and (iv) the problem of sorting a signed permutation by reversals and transpositions of length at most 3. We present polynomial-time solutions for problems (i) and (iii), a 5-approximation for problem (ii), and a 3-approximation for problem (iv). Moreover, we show that the expected approximation ratio of the 5-approximation algorithm is not greater than 3 for random signed permutations with more than 12 elements. Finally, we present experimental results that show that the approximation ratios of the approximation algorithms cannot be smaller than 3. In particular, this means that the approximation ratio of the 3-approximation algorithm is tight.
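
    For concreteness, the sketch below implements the two "short" reversal operations on a signed permutation (length 1 flips one sign; length 2 swaps two neighbours and flips both signs) and a naive, non-optimal sort that uses only those moves. It illustrates the operations studied in the paper, not its polynomial-time or approximation algorithms.

    ```python
    # The two short reversal operations on a signed permutation, plus a naive
    # (non-optimal) sort using only those moves; not the paper's algorithms.

    def rev1(perm, i):
        """Length-1 reversal: flip the sign of element i."""
        perm = perm[:]
        perm[i] = -perm[i]
        return perm

    def rev2(perm, i):
        """Length-2 reversal: swap neighbours i and i+1 and flip both signs."""
        perm = perm[:]
        perm[i], perm[i + 1] = -perm[i + 1], -perm[i]
        return perm

    def naive_short_sort(perm):
        """Sort to the identity +1..+n with length<=2 reversals, counting moves."""
        perm, moves = perm[:], 0
        n = len(perm)
        for target in range(1, n + 1):               # selection sort by absolute value
            j = next(k for k in range(n) if abs(perm[k]) == target)
            while j > target - 1:
                perm, j, moves = rev2(perm, j - 1), j - 1, moves + 1
            if perm[target - 1] < 0:
                perm, moves = rev1(perm, target - 1), moves + 1
        return perm, moves

    print(naive_short_sort([+3, -1, +2]))   # ([1, 2, 3], 3)
    ```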

  14. Design of an image encryption scheme based on a multiple chaotic map

    NASA Astrophysics Data System (ADS)

    Tong, Xiao-Jun

    2013-07-01

    In order to solve the problems that chaos degenerates in limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, and its chaotic characteristics are proved by the Devaney definition. In order to produce a large key space, a Cat map named the block Cat map is also designed for the permutation process based on multiple-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by different chaotic maps. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on the key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with the AES, DES and Logistic encryption methods, we come to the conclusion that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and has higher speed and higher security.
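
    The permutation stage of such ciphers can be illustrated with the classical 2-D Arnold cat map, which bijectively scrambles the pixel coordinates of a square image. This is a stand-in for intuition only; the paper's block Cat map and its topologically conjugate chaotic map are not reproduced here.

    ```python
    # Classical Arnold cat map as a pixel-position permutation on a square
    # image; illustrative only, not the paper's block Cat map.
    import numpy as np

    def cat_map_permute(img, iterations=1):
        n = img.shape[0]                      # assumes a square n x n image
        out = img.copy()
        for _ in range(iterations):
            permuted = np.empty_like(out)
            for x in range(n):
                for y in range(n):
                    # (x, y) -> (x + y, x + 2y) mod n is a bijection (determinant 1)
                    permuted[(x + y) % n, (x + 2 * y) % n] = out[x, y]
            out = permuted
        return out

    img = np.arange(16).reshape(4, 4)
    print(cat_map_permute(img, iterations=2))
    ```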

  15. Long Term Performance Metrics of the GD SDR on the SCaN Testbed: The First Year on the ISS

    NASA Technical Reports Server (NTRS)

    Nappier, Jennifer; Wilson, Molly C.

    2014-01-01

    The General Dynamics (GD) S-Band software defined radio (SDR) in the Space Communications and Navigation (SCaN) Testbed on the International Space Station (ISS) provides experimenters an opportunity to develop and demonstrate experimental waveforms in space. The SCaN Testbed was installed on the ISS in August of 2012. After installation, the initial checkout and commissioning phases were completed and experimental operations commenced. One goal of the SCaN Testbed is to collect long term performance metrics for SDRs operating in space in order to demonstrate long term reliability. These metrics include the time the SDR powered on, the time the power amplifier (PA) is powered on, temperature trends, error detection and correction (EDAC) behavior, and waveform operational usage time. This paper describes the performance of the GD SDR over the first year of operations on the ISS.

  16. Altered resting-state connectivity within default mode network associated with late chronotype.

    PubMed

    Horne, Charlotte Mary; Norbury, Ray

    2018-04-20

    Current evidence suggests late chronotype individuals have an increased risk of developing depression. However, the underlying neural mechanisms of this association are not fully understood. Forty-six healthy, right-handed individuals free of current or previous diagnosis of depression, family history of depression or sleep disorder underwent resting-state functional Magnetic Resonance Imaging (rsFMRI). Using an Independent Component Analysis (ICA) approach, the Default Mode Network (DMN) was identified based on a well validated template. Linear effects of chronotype on DMN connectivity were tested for significance using non-parametric permutation tests (applying 5000 permutations). Sleep quality, age, gender, measures of mood and anxiety, time of scan and cortical grey matter volume were included as covariates in the regression model. A significant positive correlation between chronotype and functional connectivity within nodes of the DMN was observed, including; bilateral PCC and precuneus, such that later chronotype (participants with lower rMEQ scores) was associated with decreased connectivity within these regions. The current results appear consistent with altered DMN connectivity in depressed patients and weighted evidence towards reduced DMN connectivity in other at-risk populations which may, in part, explain the increased vulnerability for depression in late chronotype individuals. The effect may be driven by self-critical thoughts associated with late chronotype although future studies are needed to directly investigate this. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. A new Nawaz-Enscore-Ham-based heuristic for permutation flow-shop problems with bicriteria of makespan and machine idle time

    NASA Astrophysics Data System (ADS)

    Liu, Weibo; Jin, Yan; Price, Mark

    2016-10-01

    A new heuristic based on the Nawaz-Enscore-Ham algorithm is proposed in this article for solving a permutation flow-shop scheduling problem. A new priority rule is proposed by accounting for the average, mean absolute deviation, skewness and kurtosis, in order to fully describe the distribution style of processing times. A new tie-breaking rule is also introduced for achieving effective job insertion with the objective of minimizing both makespan and machine idle time. Statistical tests illustrate better solution quality of the proposed algorithm compared to existing benchmark heuristics.
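
    For reference, a plain NEH insertion heuristic minimizing makespan only is sketched below; the article's refined priority rule (mean, mean absolute deviation, skewness, kurtosis) and its idle-time-aware tie-breaking are not reproduced, and the processing-time matrix is illustrative.

    ```python
    # Plain Nawaz-Enscore-Ham (NEH) insertion heuristic for the permutation
    # flow shop, makespan objective only; the article's refinements are omitted.

    def makespan(sequence, p):
        """p[j][m] = processing time of job j on machine m."""
        machines = len(p[0])
        completion = [0.0] * machines
        for j in sequence:
            for m in range(machines):
                ready = completion[m - 1] if m > 0 else 0.0   # this job on previous machine
                completion[m] = max(completion[m], ready) + p[j][m]
        return completion[-1]

    def neh(p):
        jobs = sorted(range(len(p)), key=lambda j: -sum(p[j]))   # decreasing total work
        sequence = [jobs[0]]
        for j in jobs[1:]:
            # insert job j in the position that gives the smallest makespan
            sequence = min((sequence[:k] + [j] + sequence[k:] for k in range(len(sequence) + 1)),
                           key=lambda s: makespan(s, p))
        return sequence, makespan(sequence, p)

    times = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]   # 4 jobs x 3 machines, illustrative
    print(neh(times))
    ```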

  18. [Local fractal analysis of noise-like time series by all permutations method for 1-115 min periods].

    PubMed

    Panchelyuga, V A; Panchelyuga, M S

    2015-01-01

    Results of local fractal analysis of 329-per-day time series of 239Pu alpha-decay rate fluctuations by means of all permutations method (APM) are presented. The APM-analysis reveals in the time series some steady frequency set. The coincidence of the frequency set with the Earth natural oscillations was demonstrated. A short review of works by different authors who analyzed the time series of fluctuations in processes of different nature is given. We have shown that the periods observed in those works correspond to the periods revealed in our study. It points to a common mechanism of the phenomenon observed.

  19. A Computationally Efficient Hypothesis Testing Method for Epistasis Analysis using Multifactor Dimensionality Reduction

    PubMed Central

    Pattin, Kristine A.; White, Bill C.; Barney, Nate; Gui, Jiang; Nelson, Heather H.; Kelsey, Karl R.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.

    2008-01-01

    Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free data mining method for detecting, characterizing, and interpreting epistasis in the absence of significant main effects in genetic and epidemiologic studies of complex traits such as disease susceptibility. The goal of MDR is to change the representation of the data using a constructive induction algorithm to make nonadditive interactions easier to detect using any classification method such as naïve Bayes or logistic regression. Traditionally, MDR constructed variables have been evaluated with a naïve Bayes classifier that is combined with 10-fold cross validation to obtain an estimate of predictive accuracy or generalizability of epistasis models. Traditionally, we have used permutation testing to statistically evaluate the significance of models obtained through MDR. The advantage of permutation testing is that it controls for false-positives due to multiple testing. The disadvantage is that permutation testing is computationally expensive. This is an important issue that arises in the context of detecting epistasis on a genome-wide scale. The goal of the present study was to develop and evaluate several alternatives to large-scale permutation testing for assessing the statistical significance of MDR models. Using data simulated from 70 different epistasis models, we compared the power and type I error rate of MDR using a 1000-fold permutation test with hypothesis testing using an extreme value distribution (EVD). We find that this new hypothesis testing method provides a reasonable alternative to the computationally expensive 1000-fold permutation test and is 50 times faster. We then demonstrate this new method by applying it to a genetic epidemiology study of bladder cancer susceptibility that was previously analyzed using MDR and assessed using a 1000-fold permutation test. PMID:18671250
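
    The general idea of replacing a large permutation test with an extreme-value fit can be sketched as follows: run a modest number of permutations, fit a generalized extreme value distribution to the null statistics, and read the observed statistic's p-value from the fitted tail. The numbers are synthetic and this is not the authors' exact MDR procedure.

    ```python
    # General idea only: fit a GEV to a small permutation null and read off a
    # p-value; synthetic numbers, not the authors' exact MDR procedure.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    observed_accuracy = 0.62                                   # hypothetical MDR testing accuracy
    null_accuracies = 0.5 + 0.03 * rng.standard_normal(20)     # stand-in for 20 permutation replicates

    shape, loc, scale = genextreme.fit(null_accuracies)        # fit the extreme value distribution
    p_value = genextreme.sf(observed_accuracy, shape, loc=loc, scale=scale)
    print(f"EVD-based p-value: {p_value:.4g}")
    ```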

  20. Relative risk estimates from spatial and space-time scan statistics: Are they biased?

    PubMed Central

    Prates, Marcos O.; Kulldorff, Martin; Assunção, Renato M.

    2014-01-01

    The purely spatial and space-time scan statistics have been successfully used by many scientists to detect and evaluate geographical disease clusters. Although the scan statistic has high power in correctly identifying a cluster, no study has considered the estimates of the cluster relative risk in the detected cluster. In this paper we evaluate whether there is any bias in these estimated relative risks. Intuitively, one may expect the estimated relative risks to have an upward bias, since the scan statistic cherry-picks high-rate areas to include in the cluster. We show that this intuition is correct for clusters with low statistical power, but with medium to high power the bias becomes negligible. The same behaviour is not observed for the prospective space-time scan statistic, where there is an increasing conservative downward bias of the relative risk as the power to detect the cluster increases. PMID:24639031

  1. Novel permutation measures for image encryption algorithms

    NASA Astrophysics Data System (ADS)

    Abd-El-Hafiz, Salwa K.; AbdElHaleem, Sherif H.; Radwan, Ahmed G.

    2016-10-01

    This paper proposes two measures for the evaluation of permutation techniques used in image encryption. First, a general mathematical framework for describing the permutation phase used in image encryption is presented. Using this framework, six different permutation techniques, based on chaotic and non-chaotic generators, are described. The two new measures are then introduced to evaluate the effectiveness of permutation techniques. These measures are (1) Percentage of Adjacent Pixels Count (PAPC) and (2) Distance Between Adjacent Pixels (DBAP). The proposed measures are used to evaluate and compare the six permutation techniques in different scenarios. The permutation techniques are applied on several standard images and the resulting scrambled images are analyzed. Moreover, the new measures are used to compare the permutation algorithms on different matrix sizes irrespective of the actual parameters used in each algorithm. The analysis results show that the proposed measures are good indicators of the effectiveness of the permutation technique.

  2. Blocks in cycles and k-commuting permutations.

    PubMed

    Moreno, Rutilo; Rivera, Luis Manuel

    2016-01-01

    We introduce and study k-commuting permutations. One of our main results is a characterization of permutations that k-commute with a given permutation. Using this characterization, we obtain formulas for the number of permutations that k-commute with a permutation [Formula: see text], for some cycle types of [Formula: see text]. Our enumerative results are related with integer sequences in "The On-line Encyclopedia of Integer Sequences", and in some cases provide new interpretations for such sequences.

  3. A Random Variable Related to the Inversion Vector of a Partial Random Permutation

    ERIC Educational Resources Information Center

    Laghate, Kavita; Deshpande, M. N.

    2005-01-01

    In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.
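
    One common convention for the inversion vector is sketched below: entry k counts how many larger elements appear to the left of k, so the entries sum to the total number of inversions. Conventions differ between texts, so this is shown only to fix ideas, not as the article's exact definition.

    ```python
    # One common convention for the inversion vector of a permutation of 1..n;
    # shown only to fix ideas (definitions vary between texts).

    def inversion_vector(perm):
        position = {value: idx for idx, value in enumerate(perm)}
        return [sum(1 for bigger in range(k + 1, len(perm) + 1)
                    if position[bigger] < position[k])
                for k in range(1, len(perm) + 1)]

    perm = [3, 1, 4, 2]
    vec = inversion_vector(perm)
    print(vec, sum(vec))   # [1, 2, 0, 0] and 3 = total number of inversions
    ```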

  4. A transposase strategy for creating libraries of circularly permuted proteins.

    PubMed

    Mehta, Manan M; Liu, Shirley; Silberg, Jonathan J

    2012-05-01

    A simple approach for creating libraries of circularly permuted proteins is described that is called PERMutation Using Transposase Engineering (PERMUTE). In PERMUTE, the transposase MuA is used to randomly insert a minitransposon that can function as a protein expression vector into a plasmid that contains the open reading frame (ORF) being permuted. A library of vectors that express different permuted variants of the ORF-encoded protein is created by: (i) using bacteria to select for target vectors that acquire an integrated minitransposon; (ii) excising the ensemble of ORFs that contain an integrated minitransposon from the selected vectors; and (iii) circularizing the ensemble of ORFs containing integrated minitransposons using intramolecular ligation. Construction of a Thermotoga neapolitana adenylate kinase (AK) library using PERMUTE revealed that this approach produces vectors that express circularly permuted proteins with distinct sequence diversity from existing methods. In addition, selection of this library for variants that complement the growth of Escherichia coli with a temperature-sensitive AK identified functional proteins with novel architectures, suggesting that PERMUTE will be useful for the directed evolution of proteins with new functions.

  5. A transposase strategy for creating libraries of circularly permuted proteins

    PubMed Central

    Mehta, Manan M.; Liu, Shirley; Silberg, Jonathan J.

    2012-01-01

    A simple approach for creating libraries of circularly permuted proteins is described that is called PERMutation Using Transposase Engineering (PERMUTE). In PERMUTE, the transposase MuA is used to randomly insert a minitransposon that can function as a protein expression vector into a plasmid that contains the open reading frame (ORF) being permuted. A library of vectors that express different permuted variants of the ORF-encoded protein is created by: (i) using bacteria to select for target vectors that acquire an integrated minitransposon; (ii) excising the ensemble of ORFs that contain an integrated minitransposon from the selected vectors; and (iii) circularizing the ensemble of ORFs containing integrated minitransposons using intramolecular ligation. Construction of a Thermotoga neapolitana adenylate kinase (AK) library using PERMUTE revealed that this approach produces vectors that express circularly permuted proteins with distinct sequence diversity from existing methods. In addition, selection of this library for variants that complement the growth of Escherichia coli with a temperature-sensitive AK identified functional proteins with novel architectures, suggesting that PERMUTE will be useful for the directed evolution of proteins with new functions. PMID:22319214

  6. Spatial and temporal dynamics of the cardiac mitochondrial proteome.

    PubMed

    Lau, Edward; Huang, Derrick; Cao, Quan; Dincer, T Umut; Black, Caitie M; Lin, Amanda J; Lee, Jessica M; Wang, Ding; Liem, David A; Lam, Maggie P Y; Ping, Peipei

    2015-04-01

    Mitochondrial proteins alter in their composition and quantity drastically through time and space in correspondence to changing energy demands and cellular signaling events. The integrity and permutations of this dynamism are increasingly recognized to impact the functions of the cardiac proteome in health and disease. This article provides an overview on recent advances in defining the spatial and temporal dynamics of mitochondrial proteins in the heart. Proteomics techniques to characterize dynamics on a proteome scale are reviewed and the physiological consequences of altered mitochondrial protein dynamics are discussed. Lastly, we offer our perspectives on the unmet challenges in translating mitochondrial dynamics markers into the clinic.

  7. The Lottery Is a Mathematics Powerball

    ERIC Educational Resources Information Center

    Lim, Vivian; Rubel, Laurie; Shookhoff, Lauren; Sullivan, Mathew; Williams, Sarah

    2016-01-01

    The lottery has rich potential for mathematical explorations. It serves as a real-world context to explore concepts of permutations, combinations, sample space, and probability in terms of making sense of the lottery games. The lottery offers additional possibilities in terms of scaling, data analysis, and spatial analysis. Finally, by readily…
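
    A minimal example of the sample-space computations such classroom activities build on: the number of equally likely outcomes in a Powerball-style draw of 5 white balls from 69 plus 1 red ball from 26 (these parameters are used only as an example).

    ```python
    # Sample-space size for a Powerball-style draw; parameters are illustrative.
    from math import comb

    white = comb(69, 5)            # ways to choose the five white balls
    red = 26                       # choices for the single red ball
    jackpot_odds = white * red
    print(f"1 in {jackpot_odds:,}")   # 1 in 292,201,338
    ```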

  8. Ensembles of physical states and random quantum circuits on graphs

    NASA Astrophysics Data System (ADS)

    Hamma, Alioscia; Santra, Siddhartha; Zanardi, Paolo

    2012-11-01

    In this paper we continue and extend the investigations of the ensembles of random physical states introduced in Hamma et al. [Phys. Rev. Lett. 109, 040502 (2012)]. These ensembles are constructed by finite-length random quantum circuits (RQC) acting on the (hyper)edges of an underlying (hyper)graph structure. The latter encodes the locality structure associated with finite-time quantum evolutions generated by physical, i.e., local, Hamiltonians. Our goal is to analyze physical properties of typical states in these ensembles; in particular, here we focus on proxies of quantum entanglement such as purity and α-Rényi entropies. The problem is formulated in terms of matrix elements of superoperators which depend on the graph structure, the choice of probability measure over the local unitaries, and the circuit length. In the α=2 case these superoperators act on a restricted multiqubit space generated by permutation operators associated to the subsets of vertices of the graph. For permutationally invariant interactions the dynamics can be further restricted to an exponentially smaller subspace. We consider different families of RQCs and study their typical entanglement properties for finite time as well as their asymptotic behavior. We find that the area law holds on average and that the volume law is a typical property of physical states (that is, it holds on average and the fluctuations around the average vanish for large systems). The area law arises when the evolution time is O(1) with respect to the size L of the system, while the volume law arises, as is typical, when the evolution time scales like O(L).

  9. Genome Scan Meta-Analysis of Schizophrenia and Bipolar Disorder, Part II: Schizophrenia

    PubMed Central

    Lewis, Cathryn M.; Levinson, Douglas F.; Wise, Lesley H.; DeLisi, Lynn E.; Straub, Richard E.; Hovatta, Iiris; Williams, Nigel M.; Schwab, Sibylle G.; Pulver, Ann E.; Faraone, Stephen V.; Brzustowicz, Linda M.; Kaufmann, Charles A.; Garver, David L.; Gurling, Hugh M. D.; Lindholm, Eva; Coon, Hilary; Moises, Hans W.; Byerley, William; Shaw, Sarah H.; Mesen, Andrea; Sherrington, Robin; O’Neill, F. Anthony; Walsh, Dermot; Kendler, Kenneth S.; Ekelund, Jesper; Paunio, Tiina; Lönnqvist, Jouko; Peltonen, Leena; O’Donovan, Michael C.; Owen, Michael J.; Wildenauer, Dieter B.; Maier, Wolfgang; Nestadt, Gerald; Blouin, Jean-Louis; Antonarakis, Stylianos E.; Mowry, Bryan J.; Silverman, Jeremy M.; Crowe, Raymond R.; Cloninger, C. Robert; Tsuang, Ming T.; Malaspina, Dolores; Harkavy-Friedman, Jill M.; Svrakic, Dragan M.; Bassett, Anne S.; Holcomb, Jennifer; Kalsi, Gursharan; McQuillin, Andrew; Brynjolfson, Jon; Sigmundsson, Thordur; Petursson, Hannes; Jazin, Elena; Zoëga, Tomas; Helgason, Tomas

    2003-01-01

    Schizophrenia is a common disorder with high heritability and a 10-fold increase in risk to siblings of probands. Replication has been inconsistent for reports of significant genetic linkage. To assess evidence for linkage across studies, rank-based genome scan meta-analysis (GSMA) was applied to data from 20 schizophrenia genome scans. Each marker for each scan was assigned to 1 of 120 30-cM bins, with the bins ranked by linkage scores (1 = most significant) and the ranks averaged across studies (Ravg) and then weighted for sample size (√N[affected cases]). A permutation test was used to compute the probability of observing, by chance, each bin’s average rank (P_AvgRnk) or of observing it for a bin with the same place (first, second, etc.) in the order of average ranks in each permutation (P_ord). The GSMA produced significant genomewide evidence for linkage on chromosome 2q (P_AvgRnk < .000417). Two aggregate criteria for linkage were also met (clusters of nominally significant P values that did not occur in 1,000 replicates of the entire data set with no linkage present): 12 consecutive bins with both P_AvgRnk and P_ord < .05, including regions of chromosomes 5q, 3p, 11q, 6p, 1q, 22q, 8p, 20q, and 14p, and 19 consecutive bins with P_ord < .05, additionally including regions of chromosomes 16q, 18q, 10p, 15q, 6q, and 17q. There is greater consistency of linkage results across studies than has been previously recognized. The results suggest that some or all of these regions contain loci that increase susceptibility to schizophrenia in diverse populations. PMID:12802786

  10. Finite state model and compatibility theory - New analysis tools for permutation networks

    NASA Technical Reports Server (NTRS)

    Huang, S.-T.; Tripathi, S. K.

    1986-01-01

    A simple model to describe the fundamental operation theory of shuffle-exchange-type permutation networks, the finite permutation machine (FPM), is described, and theorems which transform the control matrix result to a continuous compatible vector result are developed. It is found that only 2n-1 shuffle exchange passes are necessary, and that 3n-3 passes are sufficient, to realize all permutations, reducing the sufficient number of passes by two from previous results. The flexibility of the approach is demonstrated by the description of a stack permutation machine (SPM) which can realize all permutations, and by showing that the FPM corresponding to the Benes (1965) network belongs to the SPM. The FPM corresponding to the network with two cascaded reverse-exchange networks is found to realize all permutations, and a simple mechanism to verify several equivalence relationships of various permutation networks is discussed.

  11. Sorting permutations by prefix and suffix rearrangements.

    PubMed

    Lintzmayer, Carla Negri; Fertin, Guillaume; Dias, Zanoni

    2017-02-01

    Some interesting combinatorial problems have been motivated by genome rearrangements, which are mutations that affect large portions of a genome. When we represent genomes as permutations, the goal is to transform a given permutation into the identity permutation with the minimum number of rearrangements. When they affect segments from the beginning (respectively end) of the permutation, they are called prefix (respectively suffix) rearrangements. This paper presents results for rearrangement problems that involve prefix and suffix versions of reversals and transpositions considering unsigned and signed permutations. We give 2-approximation and ([Formula: see text])-approximation algorithms for these problems, where [Formula: see text] is a constant divided by the number of breakpoints (pairs of consecutive elements that should not be consecutive in the identity permutation) in the input permutation. We also give bounds for the diameters concerning these problems and provide ways of improving the practical results of our algorithms.
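
    The prefix operations discussed above can be written down directly; the sketch below shows a prefix reversal, a prefix transposition, and the breakpoint count in which the approximation ratios are expressed. The approximation algorithms themselves are not reproduced.

    ```python
    # Prefix reversal, prefix transposition, and the breakpoint count for an
    # unsigned permutation; the paper's algorithms are not reproduced.

    def prefix_reversal(perm, i):
        """Reverse the first i elements."""
        return perm[:i][::-1] + perm[i:]

    def prefix_transposition(perm, i, j):
        """Move the block perm[0:i] to just before position j (0 < i < j <= len(perm))."""
        return perm[i:j] + perm[:i] + perm[j:]

    def breakpoints(perm):
        """Pairs of consecutive elements that should not be consecutive in the identity."""
        framed = [0] + perm + [len(perm) + 1]
        return sum(1 for a, b in zip(framed, framed[1:]) if b - a != 1)

    p = [3, 1, 2, 4]
    print(prefix_reversal(p, 3))         # [2, 1, 3, 4]
    print(prefix_transposition(p, 1, 3)) # [1, 2, 3, 4]
    print(breakpoints(p))                # 3
    ```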

  12. Convergence to equilibrium under a random Hamiltonian.

    PubMed

    Brandão, Fernando G S L; Ćwikliński, Piotr; Horodecki, Michał; Horodecki, Paweł; Korbicz, Jarosław K; Mozrzymas, Marek

    2012-09-01

    We analyze equilibration times of subsystems of a larger system under a random total Hamiltonian, in which the basis of the Hamiltonian is drawn from the Haar measure. We obtain that the time of equilibration is of the order of the inverse of the arithmetic average of the Bohr frequencies. To compute the average over a random basis, we compute the inverse of a matrix of overlaps of operators which permute four systems. We first obtain results on such a matrix for a representation of an arbitrary finite group and then apply it to the particular representation of the permutation group under consideration.

  13. Convergence to equilibrium under a random Hamiltonian

    NASA Astrophysics Data System (ADS)

    Brandão, Fernando G. S. L.; Ćwikliński, Piotr; Horodecki, Michał; Horodecki, Paweł; Korbicz, Jarosław K.; Mozrzymas, Marek

    2012-09-01

    We analyze equilibration times of subsystems of a larger system under a random total Hamiltonian, in which the basis of the Hamiltonian is drawn from the Haar measure. We obtain that the time of equilibration is of the order of the inverse of the arithmetic average of the Bohr frequencies. To compute the average over a random basis, we compute the inverse of a matrix of overlaps of operators which permute four systems. We first obtain results on such a matrix for a representation of an arbitrary finite group and then apply it to the particular representation of the permutation group under consideration.

  14. A Reversible Logical Circuit Synthesis Algorithm Based on Decomposition of Cycle Representations of Permutations

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Li, Zhiqiang; Zhang, Gaoman; Pan, Suhan; Zhang, Wei

    2018-05-01

    A reversible function is isomorphic to a permutation, and an arbitrary permutation can be represented by a series of cycles. A new synthesis algorithm for 3-qubit reversible circuits is presented. It consists of two parts: the first uses the Number of Different Bits (NDB) of the reversible function to decide whether NOT gates should be added to decrease the Hamming distance between the input and output vectors; the second exploits properties of the cycle representation of permutations, decomposing the cycles so that the permutation moves progressively closer to, and finally becomes, the identity permutation. The decomposition is realized using fully controlled Toffoli gates with positive and negative controls.
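    As a small aid to the cycle-based part of this approach, the sketch below (a generic illustration, not the paper's synthesis procedure) decomposes a permutation given as a 0-indexed mapping into disjoint cycles; a 3-qubit reversible function corresponds to such a permutation of its 8 input patterns.

```python
def cycle_decomposition(perm):
    """Decompose a permutation (perm[i] is the image of i) into disjoint
    cycles, omitting fixed points."""
    seen, cycles = [False] * len(perm), []
    for start in range(len(perm)):
        if seen[start] or perm[start] == start:
            seen[start] = True
            continue
        cycle, i = [], start
        while not seen[i]:
            seen[i] = True
            cycle.append(i)
            i = perm[i]
        cycles.append(tuple(cycle))
    return cycles

# Hypothetical truth table of a 3-qubit reversible function, written as a
# permutation of the input indices 0..7.
print(cycle_decomposition([0, 2, 1, 4, 3, 5, 7, 6]))  # [(1, 2), (3, 4), (6, 7)]
```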

  15. Chaotic Image Encryption Algorithm Based on Bit Permutation and Dynamic DNA Encoding.

    PubMed

    Zhang, Xuncai; Han, Feng; Niu, Ying

    2017-01-01

    Exploiting the sensitivity of chaos to initial conditions and its pseudorandomness, together with the spatial configurations of the DNA molecule and its inherent, unique information-processing ability, a novel image encryption algorithm based on bit permutation and dynamic DNA encoding is proposed here. The algorithm first uses Keccak to calculate the hash value of a given DNA sequence as the initial value of a chaotic map; second, it uses a chaotic sequence to scramble the image pixel locations, and a butterfly network is used to implement the bit permutation. Then, the image is dynamically encoded into a DNA matrix, and an algebraic operation is performed with the DNA sequence to realize pixel substitution, which further improves the security of the encryption. Finally, the confusion and diffusion properties of the algorithm are further enhanced by operations on the DNA sequence and by ciphertext feedback. The results of the experiment and security analysis show that the algorithm not only has a large key space and strong key sensitivity but can also effectively resist attacks such as statistical analysis and exhaustive search.
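    To make the position-scrambling step concrete, here is a minimal sketch (with hypothetical parameters; it is not the full Keccak/DNA scheme described above) that derives a pixel-position permutation from a logistic-map sequence and applies it to a flattened image.

```python
import numpy as np

def logistic_sequence(x0, r, n):
    """Iterate the logistic map x -> r*x*(1-x) and return n values."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return np.array(xs)

def scramble(image, x0=0.3141, r=3.9999):
    """Permute pixel positions using the ranking of a chaotic sequence."""
    flat = image.ravel()
    order = np.argsort(logistic_sequence(x0, r, flat.size))  # chaotic permutation
    return flat[order].reshape(image.shape), order

def unscramble(scrambled, order):
    """Invert the position permutation."""
    flat = np.empty(scrambled.size, dtype=scrambled.dtype)
    flat[order] = scrambled.ravel()
    return flat.reshape(scrambled.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
enc, order = scramble(img)
assert np.array_equal(unscramble(enc, order), img)
```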

  16. Chaotic Image Encryption Algorithm Based on Bit Permutation and Dynamic DNA Encoding

    PubMed Central

    2017-01-01

    Exploiting the sensitivity of chaos to initial conditions and its pseudorandomness, together with the spatial configurations of the DNA molecule and its inherent, unique information-processing ability, a novel image encryption algorithm based on bit permutation and dynamic DNA encoding is proposed here. The algorithm first uses Keccak to calculate the hash value of a given DNA sequence as the initial value of a chaotic map; second, it uses a chaotic sequence to scramble the image pixel locations, and a butterfly network is used to implement the bit permutation. Then, the image is dynamically encoded into a DNA matrix, and an algebraic operation is performed with the DNA sequence to realize pixel substitution, which further improves the security of the encryption. Finally, the confusion and diffusion properties of the algorithm are further enhanced by operations on the DNA sequence and by ciphertext feedback. The results of the experiment and security analysis show that the algorithm not only has a large key space and strong key sensitivity but can also effectively resist attacks such as statistical analysis and exhaustive search. PMID:28912802

  17. Conditional Bounds on Polarization Transfer

    NASA Astrophysics Data System (ADS)

    Nielsen, N. C.; Sorensen, O. W.

    The implications of constraints on unitary transformations of spin operators with respect to the accessible regions of Liouville space are analyzed. Specifically, the effects of spin-permutation symmetry on the unitary propagators are investigated. The influence of $S_2$ and $S_3$ propagator symmetry on two-dimensional bounds for $F_z = \sum_{i=1}^{N} I_{iz} \leftrightarrow G_z = \sum_{j=1}^{M} S_{jz}$ polarization transfer in IS and I$_2$S spin-1/2 systems is examined in detail. One result is that the maximum achievable $F_z \leftrightarrow G_z$ polarization transfer is not reduced by permutation symmetry among the spins. For I$_2$S spin systems, $S_3$ symmetry in the unitary propagator is shown to significantly reduce the accessible region in the 2D $F_z$-$S_z$ Liouville subspace compared to the case restricted by unitarity alone. That result is compared with transformations under symmetric dipolar and scalar J couplings as well as shift and RF interactions. An important practical implication is that the refined spin thermodynamic theory of Levitt, Suter, and Ernst (J. Chem. Phys. 84, 4243, 1986) for cross polarization in solid-state NMR does not predict experimental outcomes incompatible with the constraints of unitarity and spin-permutation symmetry.

  18. A hybrid quantum-inspired genetic algorithm for multiobjective flow shop scheduling.

    PubMed

    Li, Bin-Bin; Wang, Ling

    2007-06-01

    This paper proposes a hybrid quantum-inspired genetic algorithm (HQGA) for the multiobjective flow shop scheduling problem (FSSP), which is a typical NP-hard combinatorial optimization problem with strong engineering backgrounds. On the one hand, a quantum-inspired GA (QGA) based on Q-bit representation is applied for exploration in the discrete 0-1 hyperspace by using the updating operator of quantum gate and genetic operators of Q-bit. Moreover, random-key representation is used to convert the Q-bit representation to job permutation for evaluating the objective values of the schedule solution. On the other hand, permutation-based GA (PGA) is applied for both performing exploration in permutation-based scheduling space and stressing exploitation for good schedule solutions. To evaluate solutions in multiobjective sense, a randomly weighted linear-sum function is used in QGA, and a nondominated sorting technique including classification of Pareto fronts and fitness assignment is applied in PGA with regard to both proximity and diversity of solutions. To maintain the diversity of the population, two trimming techniques for population are proposed. The proposed HQGA is tested based on some multiobjective FSSPs. Simulation results and comparisons based on several performance metrics demonstrate the effectiveness of the proposed HQGA.
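    The random-key decoding mentioned above is simple to state: each job receives a real-valued key, and sorting the keys yields a job permutation. The sketch below (hypothetical processing times, not the HQGA itself) decodes a key vector and evaluates the makespan of the resulting permutation flow shop schedule.

```python
import numpy as np

def keys_to_permutation(keys):
    """Random-key decoding: the job order is the argsort of the key vector."""
    return list(np.argsort(keys))

def makespan(perm, proc):
    """Completion time of the last job on the last machine of a permutation
    flow shop; proc[j][m] is the processing time of job j on machine m."""
    finish = [0.0] * len(proc[0])
    for j in perm:
        for m in range(len(finish)):
            start = max(finish[m], finish[m - 1] if m > 0 else 0.0)
            finish[m] = start + proc[j][m]
    return finish[-1]

proc_times = [[3, 2, 4], [1, 4, 3], [2, 2, 2]]  # 3 jobs x 3 machines (toy data)
keys = np.array([0.62, 0.11, 0.48])             # e.g. evolved by the GA
perm = keys_to_permutation(keys)                # -> [1, 2, 0]
print(perm, makespan(perm, proc_times))
```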

  19. Spatio-temporal pattern analysis for evaluation of the spread of human infections with avian influenza A(H7N9) virus in China, 2013-2014.

    PubMed

    Dong, Wen; Yang, Kun; Xu, Quanli; Liu, Lin; Chen, Juan

    2017-10-24

    A large number (n = 460) of A(H7N9) human infections have been reported in China from March 2013 through December 2014, and H7N9 outbreaks in humans became an emerging issue for China health, which have caused numerous disease outbreaks in domestic poultry and wild bird populations, and threatened human health severely. The aims of this study were to investigate the directional trend of the epidemic and to identify the significant presence of spatial-temporal clustering of influenza A(H7N9) human cases between March 2013 and December 2014. Three distinct epidemic phases of A(H7N9) human infections were identified in this study. In each phase, standard deviational ellipse analysis was conducted to examine the directional trend of disease spreading, and retrospective space-time permutation scan statistic was then used to identify the spatio-temporal cluster patterns of H7N9 outbreaks in humans. The ever-changing location and the increasing size of the three identified standard deviational ellipses showed that the epidemic moved from east to southeast coast, and hence to some central regions, with a future epidemiological trend of continue dispersing to more central regions of China, and a few new human cases might also appear in parts of the western China. Furthermore, A(H7N9) human infections were clustering in space and time in the first two phases with five significant spatio-temporal clusters (p < 0.05), but there was no significant cluster identified in phase III. There was a new epidemiologic pattern that the decrease in significant spatio-temporal cluster of A(H7N9) human infections was accompanied with an obvious spatial expansion of the outbreaks during the study period, and identification of the spatio-temporal patterns of the epidemic can provide valuable insights for better understanding the spreading dynamics of the disease in China.

  20. Spatial autocorrelation in growth of undisturbed natural pine stands across Georgia

    Treesearch

    Raymond L. Czaplewski; Robin M. Reich; William A. Bechtold

    1994-01-01

    Moran's I statistic measures the spatial autocorrelation in a random variable measured at discrete locations in space. Permutation procedures test the null hypothesis that the observed Moran's I value is no greater than that expected by chance. The spatial autocorrelation of gross basal area increment is analyzed for undisturbed, naturally regenerated stands...
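    A minimal sketch of the permutation procedure described here, assuming a handful of locations with a user-supplied binary neighbour matrix (all values hypothetical):

```python
import numpy as np

def morans_i(x, w):
    """Moran's I for values x and a spatial weights matrix w (zero diagonal)."""
    z = x - x.mean()
    return (len(x) / w.sum()) * (z @ w @ z) / (z @ z)

def permutation_pvalue(x, w, n_perm=9999, seed=0):
    """One-sided p-value: how often does randomly relabelling the values
    across locations give an I at least as large as the observed one?"""
    rng = np.random.default_rng(seed)
    observed = morans_i(x, w)
    count = sum(morans_i(rng.permutation(x), w) >= observed for _ in range(n_perm))
    return observed, (count + 1) / (n_perm + 1)

# Toy example: four plots along a line; neighbours share an edge.
x = np.array([1.0, 1.2, 3.5, 3.8])
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(permutation_pvalue(x, w))
```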

  1. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective: Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characters of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Methods: Six MSPE algorithms—derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving-average (MA) analysis—were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. Results: CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC) state significantly. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this aspect. MA-TPE outperformed other measures with faster tracking speed of the loss of unconsciousness. Conclusions: MA-based multiscale permutation entropies have the potential for on-line anesthesia EEG analysis with their simple computation and sensitivity to drug-effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803

  2. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characters of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving-average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC) state significantly. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this aspect. MA-TPE outperformed other measures with faster tracking speed of the loss of unconsciousness. MA-based multiscale permutation entropies have the potential for on-line anesthesia EEG analysis with their simple computation and sensitivity to drug-effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
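    For readers unfamiliar with the underlying measure, the sketch below implements plain Shannon permutation entropy with coarse-graining decomposition (order, delay and scales chosen arbitrarily); it is only the generic CG-based building block, not the six MSPE variants compared in the paper.

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized Shannon permutation entropy of a 1-D signal."""
    patterns = Counter(
        tuple(np.argsort(x[i:i + order * delay:delay]))
        for i in range(len(x) - (order - 1) * delay)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

signal = np.random.default_rng(1).standard_normal(2000)
for s in (1, 2, 5):
    print(s, permutation_entropy(coarse_grain(signal, s)))
```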

  3. permGPU: Using graphics processing units in RNA microarray association studies.

    PubMed

    Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros

    2010-06-16

    Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA-based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.

  4. A new EEG synchronization strength analysis method: S-estimator based normalized weighted-permutation mutual information.

    PubMed

    Cui, Dong; Pu, Weiting; Liu, Jing; Bian, Zhijie; Li, Qiuli; Wang, Lei; Gu, Guanghua

    2016-10-01

    Synchronization is an important mechanism for understanding information processing in normal or abnormal brains. In this paper, we propose a new method called normalized weighted-permutation mutual information (NWPMI) for two-variable signal synchronization analysis and combine NWPMI with the S-estimator measure to generate a new method named S-estimator based normalized weighted-permutation mutual information (SNWPMI) for analyzing multi-channel electroencephalographic (EEG) synchronization strength. The performance of the NWPMI, including the effects of time delay, embedding dimension, coupling coefficients, signal-to-noise ratios (SNRs) and data length, is evaluated using a coupled Hénon map model. The results show that the NWPMI is superior in describing the synchronization compared with the normalized permutation mutual information (NPMI). Furthermore, the proposed SNWPMI method is applied to analyze scalp EEG data from 26 amnestic mild cognitive impairment (aMCI) subjects and 20 age-matched controls with normal cognitive function, all of whom suffer from type 2 diabetes mellitus (T2DM). The proposed NWPMI and SNWPMI methods are suggested to be effective indices for estimating synchronization strength. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Polarization-resolved time-delay signatures of chaos induced by FBG-feedback in VCSEL.

    PubMed

    Zhong, Zhu-Qiang; Li, Song-Sui; Chan, Sze-Chun; Xia, Guang-Qiong; Wu, Zheng-Mao

    2015-06-15

    Polarization-resolved chaotic emission intensities from a vertical-cavity surface-emitting laser (VCSEL) subject to feedback from a fiber Bragg grating (FBG) are numerically investigated. Time-delay (TD) signatures of the feedback are examined through various means, including self-correlations of the intensity time series of the individual polarizations, the cross-correlation of the intensity time series between the two polarizations, and permutation entropies calculated for the individual polarizations. The results show that the TD signatures can be clearly suppressed by selecting suitable operation parameters such as the feedback strength, FBG bandwidth, and Bragg frequency. Also, in the operational parameter space, numerical maps of TD signatures and effective bandwidths are obtained, which show regions of chaotic signals with both wide bandwidths and weak TD signatures. Finally, by comparing with a VCSEL subject to feedback from a mirror, the VCSEL subject to feedback from the FBG generally shows better concealment of the TD signatures with similar, or even wider, bandwidths.

  6. Refined composite multiscale weighted-permutation entropy of financial time series

    NASA Astrophysics Data System (ADS)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values show large fluctuations for slight variations in data location and differ significantly only with the length of the time series. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). Comparing the RCMWPE results with those of other methods on both synthetic data and financial time series shows that RCMWPE not only inherits the advantages of MWPE but is also less sensitive to data location, more stable, and much less dependent on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on the daily price return series from Asian and European stock markets. There are significant differences between Asian markets and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method has been supported by simulations on generated and real data. It could be applied to a variety of fields to quantify the complexity of systems over multiple scales more accurately.

  7. Decryption of pure-position permutation algorithms.

    PubMed

    Zhao, Xiao-Yu; Chen, Gang; Zhang, Dan; Wang, Xiao-Hong; Dong, Guang-Chang

    2004-07-01

    Pure position permutation algorithms, commonly used for image encryption and investigated in this work, are unfortunately frail under known-text attack. In view of the weakness of pure position permutation algorithms, we put forward an effective decryption algorithm for all pure-position permutation algorithms. First, a summary of the pure position permutation image encryption algorithms is given by introducing the concept of ergodic matrices. Then, by using probability theory and algebraic principles, the decryption probability of pure-position permutation algorithms is verified theoretically; next, by defining the operation system of fuzzy ergodic matrices, we improve a specific decryption algorithm. Finally, some simulation results are shown.

  8. Weight distributions for turbo codes using random and nonrandom permutations

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Divsalar, D.

    1995-01-01

    This article takes a preliminary look at the weight distributions achievable for turbo codes using random, nonrandom, and semirandom permutations. Due to the recursiveness of the encoders, it is important to distinguish between self-terminating and non-self-terminating input sequences. The non-self-terminating sequences have little effect on decoder performance, because they accumulate high encoded weight until they are artificially terminated at the end of the block. From probabilistic arguments based on selecting the permutations randomly, it is concluded that the self-terminating weight-2 data sequences are the most important consideration in the design of constituent codes; higher-weight self-terminating sequences have successively decreasing importance. Also, increasing the number of codes and, correspondingly, the number of permutations makes it more and more likely that the bad input sequences will be broken up by one or more of the permuters. It is possible to design nonrandom permutations that ensure that the minimum distance due to weight-2 input sequences grows roughly as the square root of (2N), where N is the block length. However, these nonrandom permutations amplify the bad effects of higher-weight inputs, and as a result they are inferior in performance to randomly selected permutations. But there are 'semirandom' permutations that perform nearly as well as the designed nonrandom permutations with respect to weight-2 input sequences and are not as susceptible to being foiled by higher-weight inputs.
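    One commonly cited way to build such a semirandom permutation is the S-random construction: draw indices at random, accepting an index only if it differs by more than S from the indices placed in the previous S positions. The sketch below is a simplified version of that idea (it restarts on dead ends; parameters are illustrative only).

```python
import random

def s_random_interleaver(n, s, max_restarts=1000, seed=0):
    """Greedy S-random permutation of 0..n-1; restart if a dead end is hit."""
    rng = random.Random(seed)
    for _ in range(max_restarts):
        pool, perm = list(range(n)), []
        while pool:
            candidates = [v for v in pool
                          if all(abs(v - u) > s for u in perm[-s:])]
            if not candidates:
                break                      # dead end: start a fresh attempt
            choice = rng.choice(candidates)
            perm.append(choice)
            pool.remove(choice)
        if not pool:
            return perm
    raise RuntimeError("no S-random permutation found; try a smaller s")

print(s_random_interleaver(32, 3))
```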

  9. PERMutation Using Transposase Engineering (PERMUTE): A Simple Approach for Constructing Circularly Permuted Protein Libraries.

    PubMed

    Jones, Alicia M; Atkinson, Joshua T; Silberg, Jonathan J

    2017-01-01

    Rearrangements that alter the order of a protein's sequence are used in the lab to study protein folding, improve activity, and build molecular switches. One of the simplest ways to rearrange a protein sequence is through random circular permutation, where native protein termini are linked together and new termini are created elsewhere through random backbone fission. Transposase mutagenesis has emerged as a simple way to generate libraries encoding different circularly permuted variants of proteins. With this approach, a synthetic transposon (called a permuteposon) is randomly inserted throughout a circularized gene to generate vectors that express different permuted variants of a protein. In this chapter, we outline the protocol for constructing combinatorial libraries of circularly permuted proteins using transposase mutagenesis, and we describe the different permuteposons that have been developed to facilitate library construction.

  10. Analysis of crude oil markets with improved multiscale weighted permutation entropy

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun; Liu, Cheng

    2018-03-01

    Entropy measures have recently been used extensively to study the complexity of nonlinear systems. Weighted permutation entropy (WPE) overcomes the neglect of amplitude information in PE and shows a distinctive ability to extract complexity information from data with abrupt changes in magnitude. The improved (sometimes called composite) multiscale (MS) method has the advantage of reducing errors and improving accuracy when used to evaluate multiscale entropy values of time series that are not sufficiently long. In this paper, we combine the merits of WPE and the improved MS method to propose the improved multiscale weighted permutation entropy (IMWPE) method for investigating the complexity of a time series. It is then validated on artificial data (white noise and 1/f noise) and on real market data for Brent and Daqing crude oil. Meanwhile, the complexity properties of crude oil markets are explored for the return series, for volatility series with multiple exponents, and for the EEMD-produced intrinsic mode functions (IMFs) that represent different frequency components of the return series. Moreover, the instantaneous amplitude and frequency of Brent and Daqing crude oil are analyzed by applying the Hilbert transform to each IMF.

  11. A space efficient flexible pivot selection approach to evaluate determinant and inverse of a matrix.

    PubMed

    Jafree, Hafsa Athar; Imtiaz, Muhammad; Inayatullah, Syed; Khan, Fozia Hanif; Nizami, Tajuddin

    2014-01-01

    This paper presents new, simple approaches for evaluating the determinant and inverse of a matrix. The choice of pivot is kept arbitrary, which reduces the error when solving an ill-conditioned system. Computation of the determinant of a matrix is made more efficient by avoiding unnecessary data storage and by reducing the order of the matrix at each iteration, while dictionary notation [1] is incorporated for computing the matrix inverse, thereby saving unnecessary calculations. These algorithms are highly classroom oriented and easy for students to use and implement. By taking advantage of the flexibility in pivot selection, one may easily avoid the development of fractions in most cases. Unlike the matrix inversion methods of [2] and [3], the presented algorithms obviate the use of permutations and inverse permutations.

  12. Opposition-Based Memetic Algorithm and Hybrid Approach for Sorting Permutations by Reversals.

    PubMed

    Soncco-Álvarez, José Luis; Muñoz, Daniel M; Ayala-Rincón, Mauricio

    2018-02-21

    Sorting unsigned permutations by reversals is a difficult problem; indeed, it was proved to be NP-hard by Caprara (1997). Because of its high complexity, many approximation algorithms to compute the minimal reversal distance were proposed until reaching the nowadays best-known theoretical ratio of 1.375. In this article, two memetic algorithms to compute the reversal distance are proposed. The first one uses the technique of opposition-based learning leading to an opposition-based memetic algorithm; the second one improves the previous algorithm by applying the heuristic of two breakpoint elimination leading to a hybrid approach. Several experiments were performed with one-hundred randomly generated permutations, single benchmark permutations, and biological permutations. Results of the experiments showed that the proposed OBMA and Hybrid-OBMA algorithms achieve the best results for practical cases, that is, for permutations of length up to 120. Also, Hybrid-OBMA showed to improve the results of OBMA for permutations greater than or equal to 60. The applicability of our proposed algorithms was checked processing permutations based on biological data, in which case OBMA gave the best average results for all instances.

  13. SiGN-SSM: open source parallel software for estimating gene networks with state space models.

    PubMed

    Tamada, Yoshinori; Yamaguchi, Rui; Imoto, Seiya; Hirose, Osamu; Yoshida, Ryo; Nagasaki, Masao; Miyano, Satoru

    2011-04-15

    SiGN-SSM is an open-source gene network estimation software able to run in parallel on PCs and massively parallel supercomputers. The software estimates a state space model (SSM), a statistical dynamic model suitable for analyzing short time and/or replicated time series gene expression profiles. SiGN-SSM implements a novel parameter constraint effective for stabilizing the estimated models. Also, by using a supercomputer, it is able to determine the gene network structure by a statistical permutation test in a practical time. SiGN-SSM is applicable not only to analyzing temporal regulatory dependencies between genes, but also to extracting the differentially regulated genes from time series expression profiles. SiGN-SSM is distributed under the GNU Affero General Public Licence (GNU AGPL) version 3 and can be downloaded at http://sign.hgc.jp/signssm/. The pre-compiled binaries for some architectures are available in addition to the source code. The pre-installed binaries are also available on the Human Genome Center supercomputer system. The online manual and the supplementary information of SiGN-SSM are available on our web site. tamada@ims.u-tokyo.ac.jp.

  14. Four applications of permutation methods to testing a single-mediator model.

    PubMed

    Taylor, Aaron B; MacKinnon, David P

    2012-09-01

    Four applications of permutation tests to the single-mediator model are described and evaluated in this study. Permutation tests work by rearranging data in many possible ways in order to estimate the sampling distribution for the test statistic. The four applications to mediation evaluated here are the permutation test of ab, the permutation joint significance test, and the noniterative and iterative permutation confidence intervals for ab. A Monte Carlo simulation study was used to compare these four tests with the four best available tests for mediation found in previous research: the joint significance test, the distribution of the product test, and the percentile and bias-corrected bootstrap tests. We compared the different methods on Type I error, power, and confidence interval coverage. The noniterative permutation confidence interval for ab was the best performer among the new methods. It successfully controlled Type I error, had power nearly as good as the most powerful existing methods, and had better coverage than any existing method. The iterative permutation confidence interval for ab had lower power than do some existing methods, but it performed better than any other method in terms of coverage. The permutation confidence interval methods are recommended when estimating a confidence interval is a primary concern. SPSS and SAS macros that estimate these confidence intervals are provided.
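    The basic resampling idea, sketched here for a single regression slope rather than the mediated effect ab (toy data; the four mediation-specific procedures in the paper build on the same principle):

```python
import numpy as np

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

def permutation_test(x, y, n_perm=9999, seed=0):
    """Two-sided p-value: shuffle y to break any x-y association and see how
    often the resampled slope is as extreme as the observed one."""
    rng = np.random.default_rng(seed)
    observed = slope(x, y)
    count = sum(abs(slope(x, rng.permutation(y))) >= abs(observed)
                for _ in range(n_perm))
    return observed, (count + 1) / (n_perm + 1)

rng = np.random.default_rng(42)
x = rng.standard_normal(50)
y = 0.5 * x + rng.standard_normal(50)
print(permutation_test(x, y))
```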

  15. [Space-time suicide clustering in the community of Antequera (Spain)].

    PubMed

    Pérez-Costillas, Lucía; Blasco-Fontecilla, Hilario; Benítez, Nicolás; Comino, Raquel; Antón, José Miguel; Ramos-Medina, Valentín; Lopez, Amalia; Palomo, José Luis; Madrigal, Lucía; Alcalde, Javier; Perea-Millá, Emilio; Artieda-Urrutia, Paula; de León-Martínez, Victoria; de Diego Otero, Yolanda

    2015-01-01

    Approximately 3,500 people commit suicide every year in Spain. The main aim of this study is to explore whether spatial and temporal clustering of suicide exists in the region of Antequera (Málaga, Spain). Sample and procedure: All suicides from January 1, 2004 to December 31, 2008 were identified using data from the Forensic Pathology Department of the Institute of Legal Medicine, Málaga (Spain). Geolocalisation: Google Earth was used to calculate the coordinates for each suicide decedent's address. Statistical analysis: A spatiotemporal permutation scan statistic and Ripley's K function were used to explore spatiotemporal clustering. Pearson's chi-squared test was used to determine whether there were differences between suicides inside and outside the spatiotemporal clusters. A total of 120 individuals committed suicide within the region of Antequera, of which 96 (80%) were included in our analyses. Statistically significant evidence for 7 spatiotemporal suicide clusters emerged within critical limits for the 0-2.5 km distance and for the first and second weeks after a suicide (P<.05 in both cases). Among suicides within clusters, there was not a single subject diagnosed with a current psychotic disorder, whereas outside the clusters 20% had this diagnosis (χ2=4.13; df=1; P<.05). There are spatiotemporal suicide clusters in the area surrounding Antequera. Patients diagnosed with a current psychotic disorder are less likely to be influenced by the factors explaining suicide clustering. Copyright © 2013 SEP y SEPB. Published by Elsevier España. All rights reserved.

  16. A simplified formalism of the algebra of partially transposed permutation operators with applications

    NASA Astrophysics Data System (ADS)

    Mozrzymas, Marek; Studziński, Michał; Horodecki, Michał

    2018-03-01

    Herein we continue the study of the representation theory of the algebra of permutation operators acting on the n -fold tensor product space, partially transposed on the last subsystem. We develop the concept of partially reduced irreducible representations, which allows us to significantly simplify previously proved theorems and, most importantly, derive new results for irreducible representations of the mentioned algebra. In our analysis we are able to reduce the complexity of the central expressions by getting rid of sums over all permutations from the symmetric group, obtaining equations which are much more handy in practical applications. We also find relatively simple matrix representations for the generators of the underlying algebra. The obtained simplifications and developments are applied to derive the characteristics of a deterministic port-based teleportation scheme written purely in terms of irreducible representations of the studied algebra. We solve an eigenproblem for the generators of the algebra, which is the first step towards a hybrid port-based teleportation scheme and gives us new proofs of the asymptotic behaviour of teleportation fidelity. We also show a connection between the density operator characterising port-based teleportation and a particular matrix composed of an irreducible representation of the symmetric group, which encodes properties of the investigated algebra.

  17. The structure of a thermophilic kinase shapes fitness upon random circular permutation

    PubMed Central

    Jones, Alicia M.; Mehta, Manan M.; Thomas, Emily E.; Atkinson, Joshua T.; Segall-Shapiro, Thomas H.; Liu, Shirley; Silberg, Jonathan J.

    2016-01-01

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement where native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein’s functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AK with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and they reveal a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection. PMID:26976658

  18. The Structure of a Thermophilic Kinase Shapes Fitness upon Random Circular Permutation.

    PubMed

    Jones, Alicia M; Mehta, Manan M; Thomas, Emily E; Atkinson, Joshua T; Segall-Shapiro, Thomas H; Liu, Shirley; Silberg, Jonathan J

    2016-05-20

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement in which native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein's functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AKs with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection.

  19. Teaching Tip: When a Matrix and Its Inverse Are Stochastic

    ERIC Educational Resources Information Center

    Ding, J.; Rhee, N. H.

    2013-01-01

    A stochastic matrix is a square matrix with nonnegative entries and row sums 1. The simplest example is a permutation matrix, whose rows permute the rows of an identity matrix. A permutation matrix and its inverse are both stochastic. We prove the converse, that is, if a matrix and its inverse are both stochastic, then it is a permutation matrix.
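    A short argument along the lines of the stated result (a sketch, not necessarily the authors' proof):

```latex
Let $A$ and $B=A^{-1}$ both be stochastic, so $A,B\ge 0$ entrywise and $AB=BA=I$.
Suppose some row $i$ of $A$ had two positive entries $a_{ik}>0$ and $a_{il}>0$ with
$k\ne l$. For every $j\ne i$ we have $0=(AB)_{ij}\ge a_{ik}b_{kj}\ge 0$, hence
$b_{kj}=0$ for all $j\ne i$; since row $k$ of $B$ sums to $1$, row $k$ of $B$ must be
$e_i^{\top}$. The same argument makes row $l$ of $B$ equal to $e_i^{\top}$, so $B$ has
two identical rows and is singular, a contradiction. Hence every row of $A$ has
exactly one nonzero entry, which equals $1$ because the row sums to $1$; invertibility
rules out a zero column, so these ones occupy distinct columns and $A$ is a
permutation matrix.
```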

  20. Permutation-based inference for the AUC: A unified approach for continuous and discontinuous data.

    PubMed

    Pauly, Markus; Asendorf, Thomas; Konietschke, Frank

    2016-11-01

    We investigate rank-based studentized permutation methods for the nonparametric Behrens-Fisher problem, that is, inference methods for the area under the ROC curve. We prove that the studentized permutation distribution of the Brunner-Munzel rank statistic is asymptotically standard normal, even under the alternative, thus incidentally providing the hitherto missing theoretical foundation for the Neubert and Brunner studentized permutation test. In particular, we not only show its consistency, but also show that confidence intervals for the underlying treatment effects can be computed by inverting this permutation test. In addition, we derive permutation-based range-preserving confidence intervals. Extensive simulation studies show that the permutation-based confidence intervals appear to maintain the preassigned coverage probability quite accurately (even for rather small sample sizes). For a convenient application of the proposed methods, a freely available software package for the statistical software R has been developed. A real data example illustrates the application. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Efficient Blockwise Permutation Tests Preserving Exchangeability

    PubMed Central

    Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.

    2014-01-01

    In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into certain exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through a simulated experiment on magnetic resonance imaging (MRI) brain data, specifically a multi-site voxel-based morphometry analysis from structural MRI (sMRI). PMID:25289113
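    A minimal sketch of permuting only within exchangeability blocks (labels and values hypothetical; the moments-based Pearson approximation of the paper is not shown):

```python
import numpy as np

def blockwise_permutation(values, blocks, seed=None):
    """Shuffle observations only within their exchangeability block, so the
    block structure of the data is preserved by every permutation."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float).copy()
    for b in np.unique(blocks):
        idx = np.flatnonzero(blocks == b)
        values[idx] = values[rng.permutation(idx)]
    return values

values = np.array([2.1, 1.9, 2.5, 7.3, 7.8, 6.9])
blocks = np.array([0, 0, 0, 1, 1, 1])   # e.g. two acquisition sites
print(blockwise_permutation(values, blocks, seed=0))
```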

  2. Optimization and experimental realization of the quantum permutation algorithm

    NASA Astrophysics Data System (ADS)

    Yalçınkaya, I.; Gedik, Z.

    2017-12-01

    The quantum permutation algorithm provides computational speed-up over classical algorithms for determining the parity of a given cyclic permutation. For its n-qubit implementations, the number of required quantum gates scales quadratically with n due to the quantum Fourier transforms included. We show here for the n-qubit case that the algorithm can be simplified so that it requires only O(n) quantum gates, which theoretically reduces the complexity of the implementation. To test our results experimentally, we utilize IBM's 5-qubit quantum processor to realize the algorithm by using the original and simplified recipes for the 2-qubit case. It turns out that the latter results in a significantly higher success probability, which allows us to verify the algorithm more precisely than the previous experimental realizations. We also verify the algorithm for the first time for the 3-qubit case with a considerable success probability by taking advantage of our simplified scheme.

  3. A ripple-spreading genetic algorithm for the aircraft sequencing problem.

    PubMed

    Hu, Xiao-Bing; Di Paolo, Ezequiel A

    2011-01-01

    When genetic algorithms (GAs) are applied to combinatorial problems, permutation representations are usually adopted. As a result, such GAs are often confronted with feasibility and memory-efficiency problems. With the aircraft sequencing problem (ASP) as a study case, this paper reports on a novel binary-representation-based GA scheme for combinatorial problems. Unlike existing GAs for the ASP, which typically use permutation representations based on aircraft landing order, the new GA introduces a novel ripple-spreading model which transforms the original landing-order-based ASP solutions into value-based ones. In the new scheme, arriving aircraft are projected as points into an artificial space. A deterministic method inspired by the natural phenomenon of ripple-spreading on liquid surfaces is developed, which uses a few parameters as input to connect points on this space to form a landing sequence. A traditional GA, free of feasibility and memory-efficiency problems, can then be used to evolve the ripple-spreading related parameters in order to find an optimal sequence. Since the ripple-spreading model is the centerpiece of the new algorithm, it is called the ripple-spreading GA (RSGA). The advantages of the proposed RSGA are illustrated by extensive comparative studies for the case of the ASP.

  4. Voronoi distance based prospective space-time scans for point data sets: a dengue fever cluster analysis in a southeast Brazilian town

    PubMed Central

    2011-01-01

    Background: The Prospective Space-Time scan statistic (PST) is widely used for the evaluation of space-time clusters of point event data. Usually a window of cylindrical shape is employed, with a circular or elliptical base in the space domain. Recently, the concept of Minimum Spanning Tree (MST) was applied to specify the set of potential clusters, through the Density-Equalizing Euclidean MST (DEEMST) method, for the detection of arbitrarily shaped clusters. The original map is cartogram transformed, such that the control points are spread uniformly. That method is quite effective, but the cartogram construction is computationally expensive and complicated. Results: A fast method for the detection and inference of point data set space-time disease clusters is presented, the Voronoi Based Scan (VBScan). A Voronoi diagram is built for points representing population individuals (cases and controls). The number of Voronoi cell boundaries intercepted by the line segment joining two case points defines the Voronoi distance between those points. That distance is used to approximate the density of the heterogeneous population and build the Voronoi distance MST linking the cases. The successive removal of edges from the Voronoi distance MST generates sub-trees which are the potential space-time clusters. Finally, those clusters are evaluated through the scan statistic. Monte Carlo replications of the original data are used to evaluate the significance of the clusters. An application for dengue fever in a small Brazilian city is presented. Conclusions: Prompt detection of space-time disease outbreak clusters, even when the number of individuals is large, was shown to be feasible due to the reduced computational load of VBScan. Instead of changing the map, VBScan modifies the metric used to define the distance between cases, without requiring the cartogram construction. Numerical simulations showed that VBScan has higher power of detection, sensitivity and positive predictive value than the Elliptic PST. Furthermore, as VBScan also incorporates topological information from the point neighborhood structure, in addition to the usual geometric information, it is more robust than purely geometric methods such as the elliptic scan. Those advantages were illustrated in a real setting for dengue fever space-time clusters. PMID:21513556

  5. Companion animal disease surveillance: a new solution to an old problem?

    PubMed

    Ward, M P; Kelman, M

    2011-09-01

    Infectious disease surveillance in companion animals has a long history. However, it has mostly taken the form of ad hoc surveys, or has focused on adverse reactions to pharmaceuticals. In 2006 a Blue Ribbon Panel was convened by the U.S. White House Office of Science and Technology Policy to discuss the potential utility of a national companion animal health surveillance system. Such a system could provide fundamental information about disease occurrence, transmission and risk factors; and could facilitate industry-supported pharmaco-epidemiological studies and post-market surveillance. Disease WatchDog, a prospective national disease surveillance project, was officially launched in January 2010 to capture data on diseases in dogs and cats throughout Australia. Participation is encouraged by providing registrants real-time disease maps and material for improved communication between veterinarians and clients. From January to mid-November 2010, an estimated 31% of veterinary clinics Australia-wide joined the project. Over 1300 disease cases - including Canine Parvovirus (CPV), Canine Distemper, Canine Hepatitis, Feline Calicivirus, Feline Herpesvirus, and Tick Paralysis - were reported. In New South Wales alone, 552 CPV cases in dogs were reported from 89 postcode locations. New South Wales data was scanned using the space-time permutation test. Up to 24 clusters (P<0.01) were identified, occurring in all months except March. The greatest number of clusters (n=6) were identified in April. The most likely cluster was identified in western Sydney, where 36 cases of CPV were reported from a postcode in February. Although the project is still in its infancy, already new information on disease distribution has been produced. Disease information generated could facilitate targeted control and prevention programs. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Molecular symmetry: Why permutation-inversion (PI) groups don't render the point groups obsolete

    NASA Astrophysics Data System (ADS)

    Groner, Peter

    2018-01-01

    The analysis of spectra of molecules with internal large-amplitude motions (LAMs) requires molecular symmetry (MS) groups that are larger than and significantly different from the more familiar point groups. MS groups are often described by the permutation-inversion (PI) group method. It is shown that point groups still can and should play a significant role together with the PI groups for a class of molecules with internal rotors. In molecules of this class, several simple internal rotors are attached to a rigid molecular frame. The PI groups for this class are semidirect products of the form H ⋊ F, where the invariant subgroup H is a direct product of cyclic groups and F is a point group. This result is used to derive meaningful labels for MS groups, and to derive correlation tables between MS groups and point groups. MS groups of this class have many parallels to space groups of crystalline solids.

  7. Chaotic reconfigurable ZCMT precoder for OFDM data encryption and PAPR reduction

    NASA Astrophysics Data System (ADS)

    Chen, Han; Yang, Xuelin; Hu, Weisheng

    2017-12-01

    A secure orthogonal frequency division multiplexing (OFDM) transmission scheme precoded by chaotic Zadoff-Chu matrix transform (ZCMT) is proposed and demonstrated. It is proved that the reconfigurable ZCMT matrices after row/column permutations can be applied as an alternative precoder for peak-to-average power ratio (PAPR) reduction. The permutations and the reconfigurable parameters in the ZCMT matrix are generated by a hyper digital chaos, in which a huge key space of $\sim 10^{800}$ is created for physical-layer OFDM data encryption. An encrypted data transmission of 8.9 Gb/s optical OFDM signals is successfully demonstrated over 20 km standard single-mode fiber (SSMF) for 16-QAM. The BER performance of the encrypted signals is improved by $\sim$2 dB (BER @ $10^{-3}$), which is mainly attributed to the effective reduction of PAPR via chaotic ZCMT precoding. Moreover, the chaotic ZCMT precoding scheme requires no sideband information, thus the spectrum efficiency is enhanced during transmission.

  8. A Comparison of Techniques for Scheduling Fleets of Earth-Observing Satellites

    NASA Technical Reports Server (NTRS)

    Globus, Al; Crawford, James; Lohn, Jason; Pryor, Anna

    2003-01-01

    Earth observing satellite (EOS) scheduling is a complex real-world domain representative of a broad class of over-subscription scheduling problems. Over-subscription problems are those where requests for a facility exceed its capacity. These problems arise in a wide variety of NASA and terrestrial domains and are an important class of scheduling problems because such facilities often represent large capital investments. We have run experiments comparing multiple variants of the genetic algorithm, hill climbing, simulated annealing, squeaky wheel optimization and iterated sampling on two variants of a realistically-sized model of the EOS scheduling problem. These are implemented as permutation-based methods; methods that search in the space of priority orderings of observation requests and evaluate each permutation by using it to drive a greedy scheduler. Simulated annealing performs best and random mutation operators outperform our squeaky (more intelligent) operator. Furthermore, taking smaller steps towards the end of the search improves performance.
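    To illustrate what searching "in the space of priority orderings of observation requests" means, here is a toy permutation-evaluated greedy scheduler for a single over-subscribed resource (all request data are hypothetical; the real EOS model is far richer). A GA, simulated annealing, or squeaky wheel optimizer would search this permutation space rather than enumerate it exhaustively.

```python
import itertools

# Each request: (name, earliest start, latest end, duration) on one shared sensor.
REQUESTS = [("A", 0, 4, 3), ("B", 1, 5, 2), ("C", 2, 6, 3), ("D", 0, 8, 4)]

def greedy_schedule(priority_order):
    """Walk the requests in priority order, booking each at the earliest
    feasible time on a single resource; return the accepted bookings."""
    busy_until, scheduled = 0, []
    for i in priority_order:
        name, earliest, latest, dur = REQUESTS[i]
        start = max(busy_until, earliest)
        if start + dur <= latest:
            scheduled.append((name, start, start + dur))
            busy_until = start + dur
    return scheduled

best = max((greedy_schedule(p) for p in itertools.permutations(range(len(REQUESTS)))),
           key=len)
print(best)
```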

  9. Entanglement distillation protocols and number theory

    NASA Astrophysics Data System (ADS)

    Bombin, H.; Martin-Delgado, M. A.

    2005-09-01

    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set $\mathbb{Z}_D^n$ associated with Bell diagonal states is a module rather than a vector space. We find that a partition of $\mathbb{Z}_D^n$ into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively.

  10. An AUC-based permutation variable importance measure for random forests

    PubMed Central

    2013-01-01

    Background The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. Results We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. Conclusions The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html. PMID:23560875

  11. An AUC-based permutation variable importance measure for random forests.

    PubMed

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

    The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However, the classification performance of RF is known to be suboptimal in the case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However, to our knowledge, the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings, while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.

  12. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

    We present a novel image encryption algorithm that uses a Chebyshev polynomial for permutation and substitution and a Duffing map for substitution. A comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity testing, and speed testing. The study demonstrates that the proposed image encryption algorithm offers a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970

  13. Time Distribution Using SpaceWire in the SCaN Testbed on ISS

    NASA Technical Reports Server (NTRS)

    Lux, James P.

    2012-01-01

    A paper describes an approach for timekeeping and time transfer among the devices on the CoNNeCT project's SCaN Testbed. It also describes how the clocks may be synchronized with an external time reference; e.g., time tags from the International Space Station (ISS) or RF signals received by a radio (TDRSS time service or GPS). All the units have some sort of counter that is fed by an oscillator at some convenient frequency. The basic problem in timekeeping is relating the counter value to some external time standard such as UTC. With SpaceWire, two approaches are possible: one is simply to use SpaceWire to send a message and use an external wire for the sync signal. This is much the same as the RS-232 messages and 1 pps line from a GPS receiver. However, SpaceWire has an additional capability that was added to make this easier: it can insert and receive a special "timecode" word in the data stream.

  14. Quantum-Secret-Sharing Scheme Based on Local Distinguishability of Orthogonal Seven-Qudit Entangled States

    NASA Astrophysics Data System (ADS)

    Liu, Cheng-Ji; Li, Zhi-Hui; Bai, Chen-Ming; Si, Meng-Meng

    2018-02-01

    The concept of judgment space was proposed by Wang et al. (Phys. Rev. A 95, 022320, 2017), who used it to study some important properties of quantum entangled states based on local distinguishability. In this study, we construct 15 kinds of seven-qudit quantum entangled states in the sense of permutation, calculate their judgment space and propose a distinguishability rule to make the judgment space clearer. Based on this rule, we study the local distinguishability of the 15 kinds of seven-qudit quantum entangled states and then propose a (k, n) threshold quantum secret sharing scheme. Finally, we analyze the security of the scheme.

  15. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S [Santa Fe, NM; Cabantous, Stephanie [Los Alamos, NM

    2008-06-24

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  16. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S; Cabantous, Stephanie

    2013-02-12

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  17. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S [Santa Fe, NM; Cabantous, Stephanie [Los Alamos, NM

    2011-06-14

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  18. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S.; Cabantous, Stephanie

    2013-04-16

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  19. Deep Space Wide Area Search Strategies

    NASA Astrophysics Data System (ADS)

    Capps, M.; McCafferty, J.

    There is an urgent need to expand the space situational awareness (SSA) mission beyond catalog maintenance to providing near real-time indications and warnings of emerging events. While building and maintaining a catalog of space objects is essential to SSA, this does not address the threat of uncatalogued and uncorrelated deep space objects. The Air Force therefore has an interest in transformative technologies to scan the geostationary (GEO) belt for uncorrelated space objects. Traditional ground based electro-optical sensors are challenged in simultaneously detecting dim objects while covering large areas of the sky using current CCD technology. Time delayed integration (TDI) scanning has the potential to enable significantly larger coverage rates while maintaining sensitivity for detecting near-GEO objects. This paper investigates strategies of employing TDI sensing technology from a ground based electro-optical telescope, toward providing tactical indications and warnings of deep space threats. We present results of a notional wide area search TDI sensor that scans the GEO belt from three locations: Maui, New Mexico, and Diego Garcia. Deep space objects in the NASA 2030 debris catalog are propagated over multiple nights as an indicative data set to emulate notional uncatalogued near-GEO orbits which may be encountered by the TDI sensor. Multiple scan patterns are designed and simulated, to compare and contrast performance based on 1) efficiency in coverage, 2) number of objects detected, and 3) rate at which detections occur, to enable follow-up observations by other space surveillance network (SSN) sensors. A step-stare approach is also modeled using a dedicated, co-located sensor notionally similar to the Ground-Based Electro-Optical Deep Space Surveillance (GEODSS) tower. Equivalent sensitivities are assumed. This analysis quantifies the relative benefit of TDI scanning for the wide area search mission.

  20. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS

    PubMed Central

    Kuai, Moshen; Cheng, Gang; Li, Yong

    2018-01-01

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed and high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. A method is proposed in this paper for diagnosing faults in planetary gears based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-fuzzy Inference System (ANFIS). The original signal is decomposed into 6 intrinsic mode functions (IMF) and residual components by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is reflected by permutation entropies to quantify the fault features. The permutation entropies of each IMF component are defined as the input of the ANFIS, and its parameters and membership functions are adaptively adjusted according to training samples. Finally, the fuzzy inference rules are determined, and the optimal ANFIS is obtained. The overall recognition rate of the test sample used for the ANFIS is 90%, and the recognition rate for a gear with one missing tooth is relatively high. The recognition rates for other fault gears based on the method also achieve good results. Therefore, the proposed method can be applied to planetary gear fault diagnosis effectively. PMID:29510569
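
    For reference, a minimal Python implementation of the Bandt-Pompe permutation entropy used to quantify each IMF is sketched below; decomposing the raw signal with CEEMDAN would require an additional library (e.g., PyEMD) and is not shown. The embedding order and delay are illustrative choices.

        import math
        from itertools import permutations
        import numpy as np

        def permutation_entropy(x, order=3, delay=1, normalize=True):
            """Bandt-Pompe permutation entropy of a 1-D signal."""
            x = np.asarray(x)
            patterns = {p: 0 for p in permutations(range(order))}
            n = len(x) - (order - 1) * delay
            for i in range(n):
                window = x[i:i + order * delay:delay]
                patterns[tuple(np.argsort(window))] += 1      # ordinal pattern count
            probs = np.array([c for c in patterns.values() if c > 0], float) / n
            h = -np.sum(probs * np.log2(probs))
            return h / math.log2(math.factorial(order)) if normalize else h

        # Example: a noisy sinusoid has lower permutation entropy than white noise.
        t = np.linspace(0, 10, 2000)
        print(permutation_entropy(np.sin(2 * np.pi * t) + 0.1 * np.random.randn(t.size)))
        print(permutation_entropy(np.random.randn(t.size)))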

  1. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS.

    PubMed

    Kuai, Moshen; Cheng, Gang; Pang, Yusong; Li, Yong

    2018-03-05

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed and high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. A method is proposed in this paper for diagnosing faults in planetary gears based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-fuzzy Inference System (ANFIS). The original signal is decomposed into 6 intrinsic mode functions (IMF) and residual components by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is reflected by permutation entropies to quantify the fault features. The permutation entropies of each IMF component are defined as the input of the ANFIS, and its parameters and membership functions are adaptively adjusted according to training samples. Finally, the fuzzy inference rules are determined, and the optimal ANFIS is obtained. The overall recognition rate of the test sample used for the ANFIS is 90%, and the recognition rate for a gear with one missing tooth is relatively high. The recognition rates for other fault gears based on the method also achieve good results. Therefore, the proposed method can be applied to planetary gear fault diagnosis effectively.

  2. Deactivation of Zeolite Catalyst H-ZSM-5 during Conversion of Methanol to Gasoline: Operando Time- and Space-Resolved X-ray Diffraction.

    PubMed

    Rojo-Gama, Daniel; Mentel, Lukasz; Kalantzopoulos, Georgios N; Pappas, Dimitrios K; Dovgaliuk, Iurii; Olsbye, Unni; Lillerud, Karl Petter; Beato, Pablo; Lundegaard, Lars F; Wragg, David S; Svelle, Stian

    2018-03-15

    The deactivation of zeolite catalyst H-ZSM-5 by coking during the conversion of methanol to hydrocarbons was monitored by high-energy space- and time-resolved operando X-ray diffraction (XRD). Space resolution was achieved by continuous scanning along the axial length of a capillary fixed bed reactor with a time resolution of 10 s per scan. Using real structural parameters obtained from XRD, we can track the development of coke at different points in the reactor and link this to a kinetic model to correlate catalyst deactivation with structural changes occurring in the material. The "burning cigar" model of catalyst bed deactivation is directly observed in real time.

  3. Counting conformal correlators

    NASA Astrophysics Data System (ADS)

    Kravchuk, Petr; Simmons-Duffin, David

    2018-02-01

    We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.

  4. Permutation flow-shop scheduling problem to optimize a quadratic objective function

    NASA Astrophysics Data System (ADS)

    Ren, Tao; Zhao, Peng; Zhang, Da; Liu, Bingqian; Yuan, Huawei; Bai, Danyu

    2017-09-01

    A flow-shop scheduling model enables appropriate sequencing for each job and for processing on a set of machines in compliance with identical processing orders. The objective is to achieve a feasible schedule for optimizing a given criterion. Permutation is a special setting of the model in which the processing order of the jobs on the machines is identical for each subsequent step of processing. This article addresses the permutation flow-shop scheduling problem to minimize the criterion of total weighted quadratic completion time. With a probability hypothesis, the asymptotic optimality of the weighted shortest processing time schedule under a consistency condition (WSPT-CC) is proven for sufficiently large-scale problems. However, the worst case performance ratio of the WSPT-CC schedule is the square of the number of machines in certain situations. A discrete differential evolution algorithm, where a new crossover method with multiple-point insertion is used to improve the final outcome, is presented to obtain high-quality solutions for moderate-scale problems. A sequence-independent lower bound is designed for pruning in a branch-and-bound algorithm for small-scale problems. A set of random experiments demonstrates the performance of the lower bound and the effectiveness of the proposed algorithms.
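
    A small Python sketch of the objective and a WSPT-style ordering is given below; the processing times, weights, and the exact form of the priority rule are illustrative assumptions, not the article's instance data or its consistency condition.

        import numpy as np

        def completion_times(perm, p):
            """Completion time of each job (in permutation order) on the last machine
            of a permutation flow shop.  p[j, m] is the processing time of job j on
            machine m; the same job order is used on every machine."""
            n_mach = p.shape[1]
            C = np.zeros((len(perm), n_mach))
            for i, j in enumerate(perm):
                for m in range(n_mach):
                    prev_job = C[i - 1, m] if i > 0 else 0.0
                    prev_mach = C[i, m - 1] if m > 0 else 0.0
                    C[i, m] = max(prev_job, prev_mach) + p[j, m]
            return C[:, -1]

        def weighted_quadratic_cost(perm, p, w):
            """Total weighted quadratic completion time for a given job permutation."""
            C = completion_times(perm, p)
            return float(sum(w[j] * C[i] ** 2 for i, j in enumerate(perm)))

        rng = np.random.default_rng(0)
        p = rng.integers(1, 10, size=(8, 3)).astype(float)   # 8 jobs, 3 machines
        w = rng.integers(1, 5, size=8).astype(float)

        # A WSPT-style ordering: jobs sorted by total processing time over weight
        # (one reading of the rule discussed above).
        wspt = sorted(range(8), key=lambda j: p[j].sum() / w[j])
        print(weighted_quadratic_cost(wspt, p, w))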

  5. SO(4) algebraic approach to the three-body bound state problem in two dimensions

    NASA Astrophysics Data System (ADS)

    Dmitrašinović, V.; Salom, Igor

    2014-08-01

    We use the permutation symmetric hyperspherical three-body variables to cast the non-relativistic three-body Schrödinger equation in two dimensions into a set of (possibly decoupled) differential equations that define an eigenvalue problem for the hyper-radial wave function depending on an SO(4) hyper-angular matrix element. We express this hyper-angular matrix element in terms of SO(3) group Clebsch-Gordan coefficients and use the latter's properties to derive selection rules for potentials with different dynamical/permutation symmetries. Three-body potentials acting on three identical particles may have different dynamical symmetries, in order of increasing symmetry, as follows: (1) S3 ⊗ OL(2), the permutation times rotational symmetry, that holds in sums of pairwise potentials, (2) O(2) ⊗ OL(2), the so-called "kinematic rotations" or "democracy symmetry" times rotational symmetry, that holds in area-dependent potentials, and (3) O(4) dynamical hyper-angular symmetry, that holds in hyper-radial three-body potentials. We show how the different residual dynamical symmetries of the non-relativistic three-body Hamiltonian lead to different degeneracies of certain states within O(4) multiplets.

  6. Global Snow from Space: Development of a Satellite-based, Terrestrial Snow Mission Planning Tool

    NASA Astrophysics Data System (ADS)

    Forman, B. A.; Kumar, S.; LeMoigne, J.; Nag, S.

    2017-12-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary - or perhaps contradictory - information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  7. Margalef's mandala and phytoplankton bloom strategies

    NASA Astrophysics Data System (ADS)

    Wyatt, Timothy

    2014-03-01

    Margalef's mandala maps phytoplankton species into a phase space defined by turbulence (A) and nutrient concentrations (Ni); these are the hard axes. The permutations of high and low A and high and low Ni divide the space into four domains. Soft axes indicate some ecological dynamics. A main sequence shows the normal course of phytoplankton succession; the r-K axis of MacArthur and Wilson runs parallel to it. An alternative successional sequence leads to the low A-high Ni domain into which many red tide species are mapped. Astronomical and biological time are implicit. A mathematical transformation of the mandala (rotation) links it to the classical bloom models of Sverdrup (time) and Kierstead and Slobodkin (space). Both rarity and the propensity to form red tides are considered to be species characters, meaning that maximum population abundance can be a target of natural selection. Equally, both the unpredictable appearance of bloom species and their short-lived appearances may be species characters. There may be a correlation too between these features and long-lived dormant stages in the life-cycle; then the vegetative planktonic phase is the 'weak link' in the life-cycle. Red tides are thus due to species which have evolved suites of traits which result in specific demographic strategies.

  8. The New Weather Radar for America's Space Program in Florida: A Temperature Profile Adaptive Scan Strategy

    NASA Technical Reports Server (NTRS)

    Carey, L. D.; Petersen, W. A.; Deierling, W.; Roeder, W. P.

    2009-01-01

    A new weather radar is being acquired for use in support of America's space program at Cape Canaveral Air Force Station, NASA Kennedy Space Center, and Patrick AFB on the east coast of central Florida. This new radar replaces the modified WSR-74C at Patrick AFB that has been in use since 1984. The new radar is a Radtec TDR 43-250, which has Doppler and dual polarization capability. A new fixed scan strategy was designed to best support the space program. The fixed scan strategy represents a complex compromise between many competing factors and relies on climatological heights of various temperatures that are important for improved lightning forecasting and evaluation of Lightning Launch Commit Criteria (LCC), which are the weather rules to avoid lightning strikes to in-flight rockets. The 0 °C to -20 °C layer is vital since most generation of electric charge occurs within it, and so it is critical in evaluating Lightning LCC and in forecasting lightning. These are two of the most important duties of the 45 WS. While the fixed scan strategy that covers most of the climatological variation of the 0 °C to -20 °C levels with high resolution ensures that these critical temperatures are well covered most of the time, it also means that on any particular day the radar is spending precious time scanning at angles covering less important heights. The goal of this project is to develop a user-friendly, Interactive Data Language (IDL) computer program that will automatically generate optimized radar scan strategies that adapt to user input of the temperature profile and other important parameters. By using only the required scan angles output by the temperature profile adaptive scan strategy program, faster update times for volume scans and/or collection of more samples per gate for better data quality are possible, while maintaining high resolution at the critical temperature levels. The temperature profile adaptive technique will also take into account earth curvature and refraction when geo-locating the radar beam (i.e., beam height and arc distance), including non-standard refraction based on the user-input temperature profile. In addition to temperature profile adaptivity, this paper will also summarize the other requirements for this scan strategy program, such as detection of low-level boundaries, detection of anvil clouds, reducing the Cone Of Silence, and allowing for times when deep convective clouds will not occur. The adaptive technique will be carefully compared to and benchmarked against the new fixed scan strategy. Specific environmental scenarios in which the adaptive scan strategy is able to optimize and improve coverage and resolution at critical heights, scan time, and/or sample numbers relative to the fixed scan strategy will be presented.
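
    For the beam geo-location step mentioned above, a minimal Python sketch of the standard effective-earth-radius (4/3) beam height and arc distance formulas is given below; the adaptive program described in the paper would replace the fixed 4/3 factor with refraction derived from the user-input temperature profile, so the constant here is only an assumption for standard conditions.

        import math

        def beam_height_and_arc(range_km, elev_deg, radar_height_km=0.0, k=4.0 / 3.0):
            """Beam centre height and ground arc distance for a radar ray, using the
            effective-earth-radius model (k = 4/3 for standard refraction)."""
            Re = 6371.0 * k                     # effective earth radius, km
            theta = math.radians(elev_deg)
            h = math.sqrt(range_km**2 + Re**2 + 2 * range_km * Re * math.sin(theta)) - Re
            s = Re * math.asin(range_km * math.cos(theta) / (Re + h))
            return h + radar_height_km, s

        # Height and arc distance of the beam at 80 km range for a 2.4 degree elevation:
        print(beam_height_and_arc(80.0, 2.4))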

  9. Estimating times of surgeries with two component procedures: comparison of the lognormal and normal models.

    PubMed

    Strum, David P; May, Jerrold H; Sampson, Allan R; Vargas, Luis G; Spangler, William E

    2003-01-01

    Variability inherent in the duration of surgical procedures complicates surgical scheduling. Modeling the duration and variability of surgeries might improve time estimates. Accurate time estimates are important operationally to improve utilization, reduce costs, and identify surgeries that might be considered outliers. Surgeries with multiple procedures are difficult to model because they are difficult to segment into homogenous groups and because they are performed less frequently than single-procedure surgeries. The authors studied, retrospectively, 10,740 surgeries each with exactly two CPTs and 46,322 surgical cases with only one CPT from a large teaching hospital to determine if the distribution of dual-procedure surgery times fit more closely a lognormal or a normal model. The authors tested model goodness of fit to their data using Shapiro-Wilk tests, studied factors affecting the variability of time estimates, and examined the impact of coding permutations (ordered combinations) on modeling. The Shapiro-Wilk tests indicated that the lognormal model is statistically superior to the normal model for modeling dual-procedure surgeries. Permutations of component codes did not appear to differ significantly with respect to total procedure time and surgical time. To improve individual models for infrequent dual-procedure surgeries, permutations may be reduced and estimates may be based on the longest component procedure and type of anesthesia. The authors recommend use of the lognormal model for estimating surgical times for surgeries with two component procedures. Their results help legitimize the use of log transforms to normalize surgical procedure times prior to hypothesis testing using linear statistical models. Multiple-procedure surgeries may be modeled using the longest (statistically most important) component procedure and type of anesthesia.
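
    A minimal Python sketch of the model comparison described above (on simulated durations, not the authors' hospital data) applies the Shapiro-Wilk test to raw and log-transformed times, which contrasts the normal and lognormal models in the same spirit.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Simulated surgical durations in minutes (lognormal by construction here;
        # real data would come from the OR information system).
        durations = rng.lognormal(mean=np.log(120), sigma=0.4, size=300)

        # Shapiro-Wilk goodness of fit: testing the raw times probes the normal
        # model, testing the log-transformed times probes the lognormal model.
        w_norm, p_norm = stats.shapiro(durations)
        w_lnorm, p_lnorm = stats.shapiro(np.log(durations))
        print(f"normal model:    W={w_norm:.3f}, p={p_norm:.3g}")
        print(f"lognormal model: W={w_lnorm:.3f}, p={p_lnorm:.3g}")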

  10. Neighbourhood generation mechanism applied in simulated annealing to job shop scheduling problems

    NASA Astrophysics Data System (ADS)

    Cruz-Chávez, Marco Antonio

    2015-11-01

    This paper presents a neighbourhood generation mechanism for job shop scheduling problems (JSSPs). In order to obtain a feasible neighbour with the generation mechanism, it is only necessary to generate a permutation of an adjacent pair of operations in a schedule of the JSSP. If there is no slack time between the adjacent pair of operations that is permuted, then it is proven, through theory and experimentation, that the new neighbour (schedule) generated is feasible. It is demonstrated that the neighbourhood generation mechanism is very efficient and effective in simulated annealing.
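
    The sketch below illustrates an adjacent-pair permutation move inside a simulated annealing loop; the single-machine cost function is only a stand-in for decoding and evaluating a JSSP schedule, and the feasibility argument of the paper is not reproduced here.

        import math
        import random

        def adjacent_swap(perm):
            """Neighbourhood move: permute one adjacent pair of elements."""
            i = random.randrange(len(perm) - 1)
            nb = list(perm)
            nb[i], nb[i + 1] = nb[i + 1], nb[i]
            return nb

        def cost(perm, proc, w):
            """Toy objective (total weighted completion time on one machine); a JSSP
            cost would instead be the makespan of the schedule decoded from perm."""
            t, total = 0.0, 0.0
            for j in perm:
                t += proc[j]
                total += w[j] * t
            return total

        random.seed(0)
        proc = [random.randint(1, 9) for _ in range(15)]
        w = [random.randint(1, 5) for _ in range(15)]
        perm = list(range(15))
        best, temp = cost(perm, proc, w), 50.0
        while temp > 0.1:
            cand = adjacent_swap(perm)
            delta = cost(cand, proc, w) - cost(perm, proc, w)
            # Metropolis acceptance: always accept improvements, sometimes accept
            # worse neighbours, with probability shrinking as the temperature cools.
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                perm = cand
            best = min(best, cost(perm, proc, w))
            temp *= 0.995
        print(best)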

  11. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    PubMed Central

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

    Summary We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
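
    One way to realize a permutation-based penalty choice is sketched below. This is an approximation of the general idea, not necessarily the authors' exact procedure: the response is permuted to break all associations, and the penalty is summarised from the smallest value that excludes every predictor under the permuted data.

        import numpy as np

        def lambda_max(X, y):
            """Smallest penalty at which the LASSO selects no variables, i.e. the
            largest absolute inner product between a predictor and the centred
            response, scaled by the sample size."""
            n = len(y)
            return np.max(np.abs(X.T @ (y - y.mean()))) / n

        def permutation_penalty(X, y, n_perm=100, seed=0):
            """Permutation-based penalty choice: permute the response so that no
            predictor is truly associated with it, record the penalty that just
            excludes every variable, and summarise over permutations."""
            rng = np.random.default_rng(seed)
            lams = [lambda_max(X, rng.permutation(y)) for _ in range(n_perm)]
            return float(np.median(lams))

        rng = np.random.default_rng(2)
        X = rng.standard_normal((200, 50))
        beta = np.zeros(50)
        beta[:3] = 1.5                       # three truly associated predictors
        y = X @ beta + rng.standard_normal(200)
        print(permutation_penalty(X, y))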

  12. Automated matching of corresponding seed images of three simulator radiographs to allow 3D triangulation of implanted seeds.

    PubMed

    Altschuler, M D; Kassaee, A

    1997-02-01

    To match corresponding seed images in different radiographs so that the 3D seed locations can be triangulated automatically and without ambiguity requires (at least) three radiographs taken from different perspectives, and an algorithm that finds the proper permutations of the seed-image indices. Matching corresponding images in only two radiographs introduces inherent ambiguities which can be resolved only with the use of non-positional information obtained with intensive human effort. Matching images in three or more radiographs is an 'NP (nondeterministic polynomial time)-complete' problem. Although the matching problem is fundamental, current methods for three-radiograph seed-image matching use 'local' (seed-by-seed) methods that may lead to incorrect matchings. We describe a permutation-sampling method which not only gives good 'global' (full permutation) matches for the NP-complete three-radiograph seed-matching problem, but also determines the reliability of the radiographic data themselves, namely, whether the patient moved in the interval between radiographic perspectives.
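
    As a toy illustration of the 'global' (full permutation) search, the sketch below samples random permutations over a hypothetical three-film mismatch cost tensor and keeps the best full matching; naive random sampling is only a stand-in for the authors' permutation-sampling algorithm, and the cost values are made up.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 8                                # number of implanted seeds
        # Hypothetical mismatch costs: cost[i, j, k] is the triangulation residual of
        # pairing seed image i in film 1 with image j in film 2 and image k in film 3.
        cost = rng.random((n, n, n))

        def total_cost(sigma, tau):
            """Cost of a 'global' match given permutations for films 2 and 3."""
            return float(sum(cost[i, sigma[i], tau[i]] for i in range(n)))

        # Permutation sampling: draw random full permutations and keep the best pair.
        best = np.inf
        for _ in range(20000):
            sigma, tau = rng.permutation(n), rng.permutation(n)
            c = total_cost(sigma, tau)
            if c < best:
                best, best_match = c, (sigma.copy(), tau.copy())
        print(best)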

  13. Automated matching of corresponding seed images of three simulator radiographs to allow 3D triangulation of implanted seeds

    NASA Astrophysics Data System (ADS)

    Altschuler, Martin D.; Kassaee, Alireza

    1997-02-01

    To match corresponding seed images in different radiographs so that the 3D seed locations can be triangulated automatically and without ambiguity requires (at least) three radiographs taken from different perspectives, and an algorithm that finds the proper permutations of the seed-image indices. Matching corresponding images in only two radiographs introduces inherent ambiguities which can be resolved only with the use of non-positional information obtained with intensive human effort. Matching images in three or more radiographs is an `NP (nondeterministic polynomial time)-complete' problem. Although the matching problem is fundamental, current methods for three-radiograph seed-image matching use `local' (seed-by-seed) methods that may lead to incorrect matchings. We describe a permutation-sampling method which not only gives good `global' (full permutation) matches for the NP-complete three-radiograph seed-matching problem, but also determines the reliability of the radiographic data themselves, namely, whether the patient moved in the interval between radiographic perspectives.

  14. Classifying epileptic EEG signals with delay permutation entropy and Multi-Scale K-means.

    PubMed

    Zhu, Guohun; Li, Yan; Wen, Peng Paul; Wang, Shuaifang

    2015-01-01

    Most epileptic EEG classification algorithms are supervised and require large training datasets, which hinders their use in real-time applications. This chapter proposes an unsupervised Multi-Scale K-means (MSK-means) algorithm to distinguish epileptic EEG signals and identify epileptic zones. The random initialization of the K-means algorithm can lead to wrong clusters. Based on the characteristics of EEGs, the MSK-means algorithm initializes the coarse-scale centroid of a cluster with a suitable scale factor. In this chapter, the MSK-means algorithm is proved theoretically superior to the K-means algorithm in efficiency. In addition, three classifiers: the K-means, MSK-means and support vector machine (SVM), are used to identify seizures and localize the epileptogenic zone using delay permutation entropy features. The experimental results demonstrate that identifying seizures with the MSK-means algorithm and delay permutation entropy achieves 4.7% higher accuracy than K-means, and 0.7% higher accuracy than the SVM.

  15. A fast chaos-based image encryption scheme with a dynamic state variables selection mechanism

    NASA Astrophysics Data System (ADS)

    Chen, Jun-xin; Zhu, Zhi-liang; Fu, Chong; Yu, Hai; Zhang, Li-bo

    2015-03-01

    In recent years, a variety of chaos-based image cryptosystems have been investigated to meet the increasing demand for real-time secure image transmission. Most of them are based on permutation-diffusion architecture, in which permutation and diffusion are two independent procedures with fixed control parameters. This property results in two flaws. (1) At least two chaotic state variables are required for encrypting one plain pixel, in permutation and diffusion stages respectively. Chaotic state variables produced with high computation complexity are not sufficiently used. (2) The key stream solely depends on the secret key, and hence the cryptosystem is vulnerable against known/chosen-plaintext attacks. In this paper, a fast chaos-based image encryption scheme with a dynamic state variables selection mechanism is proposed to enhance the security and promote the efficiency of chaos-based image cryptosystems. Experimental simulations and extensive cryptanalysis have been carried out and the results prove the superior security and high efficiency of the scheme.
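
    For orientation, a minimal (and deliberately insecure) permutation-diffusion round driven by a logistic map is sketched below; as the abstract points out, a key stream generated this way depends only on the secret key, which is exactly the weakness a dynamic state-variable selection mechanism is meant to remove. The key values and image are illustrative.

        import numpy as np

        def logistic_stream(x0, r, n):
            """Generate n chaotic state variables from the logistic map."""
            xs = np.empty(n)
            x = x0
            for i in range(n):
                x = r * x * (1.0 - x)
                xs[i] = x
            return xs

        def encrypt(img, key=(0.3456, 3.99)):
            """Minimal permutation-diffusion round on a flattened 8-bit image."""
            flat = img.flatten()
            stream = logistic_stream(key[0], key[1], 2 * flat.size)
            # Permutation stage: scramble pixel positions by ranking chaotic values.
            perm = np.argsort(stream[:flat.size])
            shuffled = flat[perm]
            # Diffusion stage: chain each ciphertext pixel to the previous one and to
            # a key-stream byte derived from the second half of the chaotic sequence.
            ks = (stream[flat.size:] * 256).astype(np.uint8)
            out = np.empty_like(shuffled)
            prev = np.uint8(0)
            for i, p in enumerate(shuffled):
                out[i] = np.uint8((int(p) + int(prev)) % 256) ^ ks[i]
                prev = out[i]
            return out.reshape(img.shape), perm

        img = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy "image"
        cipher, perm = encrypt(img)
        print(cipher)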

  16. Predication-based semantic indexing: permutations as a means to encode predications in semantic space.

    PubMed

    Cohen, Trevor; Schvaneveldt, Roger W; Rindflesch, Thomas C

    2009-11-14

    Corpus-derived distributional models of semantic distance between terms have proved useful in a number of applications. For both theoretical and practical reasons, it is desirable to extend these models to encode discrete concepts and the ways in which they are related to one another. In this paper, we present a novel vector space model that encodes semantic predications derived from MEDLINE by the SemRep system into a compact spatial representation. The associations captured by this method are of a different and complementary nature to those derived by traditional vector space models, and the encoding of predication types presents new possibilities for knowledge discovery and information retrieval.
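
    A toy sketch of permutation-based encoding of predications is given below (illustrative random-indexing style vectors and hypothetical predications, not the SemRep/PSI implementation): each predicate type is a fixed coordinate permutation applied to the object's index vector before it is added to the subject's semantic vector, and the permutation is inverted at query time.

        import numpy as np

        rng = np.random.default_rng(4)
        DIM = 512
        concepts = ["aspirin", "headache", "thrombosis"]
        index = {c: rng.choice([-1.0, 1.0], size=DIM) for c in concepts}   # random index vectors
        semantic = {c: np.zeros(DIM) for c in concepts}

        # Each predicate type is assigned its own fixed permutation of coordinates.
        predicates = {"TREATS": rng.permutation(DIM), "CAUSES": rng.permutation(DIM)}

        def encode(subj, pred, obj):
            """Encode 'subj pred obj' by adding the permuted object index vector to
            the subject's semantic vector."""
            semantic[subj] += index[obj][predicates[pred]]

        encode("aspirin", "TREATS", "headache")
        encode("aspirin", "CAUSES", "thrombosis")

        def query(subj, pred):
            """Recover the most similar object for a predication by un-permuting."""
            inv = np.argsort(predicates[pred])       # inverse permutation
            probe = semantic[subj][inv]
            sims = {c: float(probe @ index[c]) / DIM for c in concepts}
            return max(sims, key=sims.get)

        print(query("aspirin", "TREATS"))   # expected: headache
        print(query("aspirin", "CAUSES"))   # expected: thrombosis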

  17. Evaluation of an accelerated 3D SPACE sequence with compressed sensing and free-stop scan mode for imaging of the knee.

    PubMed

    Henninger, B; Raithel, E; Kranewitter, C; Steurer, M; Jaschke, W; Kremser, C

    2018-05-01

    To prospectively evaluate a prototypical 3D turbo-spin-echo proton-density-weighted sequence with compressed sensing and a free-stop scan mode for preventing motion artefacts (3D-PD-CS-SPACE free-stop) for knee imaging in a clinical setting. 80 patients underwent 3T magnetic resonance imaging (MRI) of the knee with our 2D routine protocol and with 3D-PD-CS-SPACE free-stop. In the case of a scan stop caused by motion (images are calculated nevertheless), the sequence was repeated without the free-stop mode. All scans were evaluated by two radiologists with respect to the image quality of the 3D-PD-CS-SPACE (with and without free-stop). Important knee structures were further assessed in a lesion-based analysis and compared to our reference 2D-PD-fs sequences. Image quality of the 3D-PD-CS-SPACE free-stop was found optimal in 47/80, slightly compromised in 21/80, moderately compromised in 10/80, and severely compromised in 2/80. In 29/80, the free-stop scan mode stopped the 3D-PD-CS-SPACE due to subject motion, with a slight increase of image quality at longer effective acquisition times. Compared to the 3D-PD-CS-SPACE with free-stop, the image quality of the acquired 3D-PD-CS-SPACE without free-stop was found equal in 6/29, slightly improved in 13/29, improved with equal contours in 8/29, and improved with sharper contours in 2/29. The lesion-based analysis showed a high agreement between the results from the 3D-PD-CS-SPACE free-stop and our 2D-PD-fs routine protocol (overall agreement 96.25%-100%, Cohen's Kappa 0.883-1, p < 0.001). 3D-PD-CS-SPACE free-stop is a reliable alternative to standard 2D-PD-fs protocols with acceptable acquisition times. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Overlap Cycles for Permutations: Necessary and Sufficient Conditions

    DTIC Science & Technology

    2013-09-19


  19. Multi-response permutation procedure as an alternative to the analysis of variance: an SPSS implementation.

    PubMed

    Cai, Li

    2006-02-01

    A permutation test typically requires fewer assumptions than does a comparable parametric counterpart. The multi-response permutation procedure (MRPP) is a class of multivariate permutation tests of group difference useful for the analysis of experimental data. However, psychologists seldom make use of the MRPP in data analysis, in part because the MRPP is not implemented in popular statistical packages that psychologists use. A set of SPSS macros implementing the MRPP test is provided in this article. The use of the macros is illustrated by analyzing example data sets.
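
    A compact Python sketch of the MRPP idea (a group-size-weighted mean within-group pairwise distance, with a permutation p-value) is shown below; it illustrates the statistic, not the SPSS macros described in the article, and the group weighting is one common choice among several.

        import numpy as np
        from itertools import combinations

        def mrpp_delta(data, labels):
            """Weighted mean within-group pairwise distance (the MRPP test statistic),
            with groups weighted by their relative sizes."""
            delta, N = 0.0, len(labels)
            for g in np.unique(labels):
                pts = data[labels == g]
                d = [np.linalg.norm(a - b) for a, b in combinations(pts, 2)]
                delta += (len(pts) / N) * np.mean(d)
            return delta

        def mrpp_test(data, labels, n_perm=500, seed=0):
            """Permutation p-value: proportion of label shufflings giving a delta at
            least as small (i.e. groups at least as internally tight) as observed."""
            rng = np.random.default_rng(seed)
            observed = mrpp_delta(data, labels)
            perm_deltas = [mrpp_delta(data, rng.permutation(labels)) for _ in range(n_perm)]
            p = (1 + sum(d <= observed for d in perm_deltas)) / (n_perm + 1)
            return observed, p

        rng = np.random.default_rng(5)
        x = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(1.0, 1, (20, 2))])
        g = np.array([0] * 20 + [1] * 20)
        print(mrpp_test(x, g))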

  20. A Service Portal for the Integrated SCaN Network

    NASA Technical Reports Server (NTRS)

    Marx, Sarah R.

    2012-01-01

    The Space Communication and Navigation (SCaN) program office owns the assets and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and Space Network (SN). At present, these individual networks are operated by different NASA centers: JPL for the DSN, and Goddard Space Flight Center (GSFC) for the NEN and SN, with separate commitments offices for each center. In the near future, SCaN's program office would like to deploy an integrated service portal that would merge the two commitments offices, with the goal of easing the task of user planning for space missions requiring the services of two or more of these networks. Following interviews with subject matter experts in this field, use cases were created to capture the services and functionality mission users would like to see in this new integrated service portal. These use cases provide a guideline for a mock-up of the design of the user interface for the portal. This work will reduce the time required and streamline and standardize the process for planning and scheduling SCaN's services for future space missions.

  1. Permutation entropy and statistical complexity analysis of turbulence in laboratory plasmas and the solar wind.

    PubMed

    Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T

    2015-02-01

    The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
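
    A minimal sketch of the CH-plane coordinates (normalised permutation entropy and Jensen-Shannon statistical complexity of the ordinal pattern distribution) is given below, using the standard normalisation constant for the complexity; the embedding order and the test signal are illustrative assumptions.

        import math
        from itertools import permutations
        import numpy as np

        def ordinal_distribution(x, order=5, delay=1):
            """Relative frequencies of Bandt-Pompe ordinal patterns."""
            counts = {p: 0 for p in permutations(range(order))}
            n = len(x) - (order - 1) * delay
            for i in range(n):
                counts[tuple(np.argsort(x[i:i + order * delay:delay]))] += 1
            return np.array(list(counts.values()), float) / n

        def shannon(p):
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)))

        def ch_plane(x, order=5):
            """(H, C): normalised permutation entropy and Jensen-Shannon complexity."""
            P = ordinal_distribution(x, order)
            N = math.factorial(order)
            Pe = np.full(N, 1.0 / N)                 # uniform reference distribution
            H = shannon(P) / math.log(N)
            JS = shannon(0.5 * (P + Pe)) - 0.5 * shannon(P) - 0.5 * shannon(Pe)
            Q0 = -2.0 / (((N + 1.0) / N) * math.log(N + 1) - 2 * math.log(2 * N) + math.log(N))
            return H, Q0 * JS * H

        rng = np.random.default_rng(6)
        print(ch_plane(rng.standard_normal(20000)))   # white noise: H near 1, C near 0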

  2. Terrain Dynamics Analysis Using Space-Time Domain Hypersurfaces and Gradient Trajectories Derived From Time Series of 3D Point Clouds

    DTIC Science & Technology

    2015-08-01

    optimized space-time interpolation method. Tangible geospatial modeling system was further developed to support the analysis of changing elevation surfaces...

  3. A New Efficient Algorithm for the All Sorting Reversals Problem with No Bad Components.

    PubMed

    Wang, Biing-Feng

    2016-01-01

    The problem of finding all reversals that take a permutation one step closer to a target permutation is called the all sorting reversals problem (the ASR problem). For this problem, Siepel had an O(n^3)-time algorithm. Most complications of his algorithm stem from some peculiar structures called bad components. Since bad components are very rare in both real and simulated data, it is practical to study the ASR problem with no bad components. For the ASR problem with no bad components, Swenson et al. gave an O(n^2)-time algorithm. Very recently, Swenson found that their algorithm does not always work. In this paper, a new algorithm is presented for the ASR problem with no bad components. The time complexity is O(n^2) in the worst case and is linear in the size of input and output in practice.

  4. SU-E-T-510: Interplay Between Spots Sizes, Spot / Line Spacing and Motion in Spot Scanning Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, TK

    Purpose In the proton beam configuration for spot scanning proton therapy (SSPT), one can define the spacing between spots and lines of scanning as a ratio of a given spot size. If the spacing increases, the number of spots decreases, which can potentially decrease the scan time and thus the whole treatment time, and vice versa. However, if the spacing is too large, the uniformity of the scanned field decreases. Also, the field uniformity can be affected by motion during SSPT beam delivery. In the present study, the interplay between spot/line spacing and motion is investigated. Methods We used four Gaussian-shape spot sizes with 0.5 cm, 1.0 cm, 1.5 cm, and 2.0 cm FWHM, three spot/line spacings that create a uniform field profile (1/3*FWHM, σ/3*FWHM and 2/3*FWHM), and three random motion amplitudes within +/−0.3 mm, +/−0.5 mm, and +/−1.0 mm. We planned with a 2 Gy uniform single layer for 10×10 cm² and 20×20 cm² fields. Then, the mean dose within 80% of the given field area, the contributing MU per spot (assuming a 1 cGy/MU calibration for all spot sizes), the number of spots, and the uniformity were calculated. Results The plans with spot/line spacing equal to or smaller than 2/3*FWHM without motion create ∼100% uniformity. However, it was found that the uniformity decreases with increased spacing; this effect is more pronounced with smaller spot sizes but is not affected by the scanned field size. Conclusion It was found that motion during proton beam delivery can alter the dose uniformity, and the amount of alteration changes with spot size (which changes with energy) and with spot/line spacing. Currently, robust evaluation in a TPS (e.g. the Eclipse system) performs range uncertainty evaluation using isocenter shift and CT calibration error. Based on the presented study, it is recommended that interplay effect evaluation be added to the robust evaluation process. For future study, the additional interplay between the energy layers and motion is expected to present a volumetric effect.

  5. Improved Potential Energy Surface of Ozone Constructed Using the Fitting by Permutationally Invariant Polynomial Function

    DOE PAGES

    Ayouz, Mehdi; Babikov, Dmitri

    2012-01-01

    A new global potential energy surface for the ground electronic state of ozone is constructed at the complete basis set level of multireference configuration interaction theory. A method of fitting the data points by an analytical permutationally invariant polynomial function is adopted. A small set of 500 points is preoptimized using the old surface of ozone. In this procedure the positions of points in the configuration space are chosen such that the RMS deviation of the fit is minimized. New ab initio calculations are carried out at these points and are used to build the new surface. Additional points are added to the vicinity of the minimum energy path in order to improve the accuracy of the fit, particularly in the region where the surface of ozone exhibits a shallow van der Waals well. The new surface can be used to study the formation of ozone at thermal energies and its spectroscopy near the dissociation threshold.

  6. Four-point functions and the permutation group S4

    NASA Astrophysics Data System (ADS)

    Eichmann, Gernot; Fischer, Christian S.; Heupel, Walter

    2015-09-01

    Four-point functions are at the heart of many interesting physical processes. A prime example is the light-by-light scattering amplitude, which plays an important role in the calculation of hadronic contributions to the anomalous magnetic moment of the muon. In the calculation of such quantities one faces the challenge of finding a suitable and well-behaved basis of tensor structures in coordinate and/or momentum space. Provided all (or many) of the external legs represent similar particle content, a powerful tool to construct and organize such bases is the permutation group S4. We introduce an efficient notation for dealing with the irreducible multiplets of S4, and we highlight the merits of this treatment by exemplifying four-point functions with gauge-boson legs such as the four-gluon vertex and the light-by-light scattering amplitude. The multiplet analysis is also useful for isolating the important kinematic regions and the dynamical singularity content of such amplitudes. Our analysis serves as a basis for future efficient calculations of these and similar objects.

  7. Security scheme in IMDD-OFDM-PON system with the chaotic pilot interval and scrambling

    NASA Astrophysics Data System (ADS)

    Chen, Qianghua; Bi, Meihua; Fu, Xiaosong; Lu, Yang; Zeng, Ran; Yang, Guowei; Yang, Xuelin; Xiao, Shilin

    2018-01-01

    In this paper, a random chaotic pilot interval and permutation scheme without any requirement of redundant sideband information is proposed for the first time for the physical-layer security-enhanced intensity modulation direct detection orthogonal frequency division multiplexing passive optical network (IMDD-OFDM-PON) system. With the help of the position feature of the inserted pilot, a simple logistic chaos map is used to generate the random pilot interval and to scramble the chaotic subcarrier allocation of the pilot data in each column, improving the physical-layer confidentiality. Due to the dynamic chaotic permutations of the pilot data, an enhanced key space of ∼10^3303 is achieved in the OFDM-PON. Moreover, the transmission experiment of 10-Gb/s 16-QAM encrypted OFDM data is successfully demonstrated over 20-km single-mode fiber, which indicates that the proposed scheme not only improves the system security, but also achieves the same performance as the common IMDD-OFDM-PON system without an encryption scheme.

  8. Efficiency and credit ratings: a permutation-information-theory analysis

    NASA Astrophysics Data System (ADS)

    Fernandez Bariviera, Aurelio; Zunino, Luciano; Belén Guercio, M.; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-08-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty-nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity-entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.

  9. On the Shapley Value of Unrooted Phylogenetic Trees.

    PubMed

    Wicke, Kristina; Fischer, Mareike

    2018-01-17

    The Shapley value, a solution concept from cooperative game theory, has recently been considered for both unrooted and rooted phylogenetic trees. Here, we focus on the Shapley value of unrooted trees and first revisit the so-called split counts of a phylogenetic tree and the Shapley transformation matrix that allows for the calculation of the Shapley value from the edge lengths of a tree. We show that non-isomorphic trees may have permutation-equivalent Shapley transformation matrices and permutation-equivalent null spaces. This implies that estimating the split counts associated with a tree or the Shapley values of its leaves does not suffice to reconstruct the correct tree topology. We then turn to the use of the Shapley value as a prioritization criterion in biodiversity conservation and compare it to a greedy solution concept. Here, we show that for certain phylogenetic trees, the Shapley value may fail as a prioritization criterion, meaning that the diversity spanned by the top k species (ranked by their Shapley values) cannot approximate the total diversity of all n species.

  10. Using R to Simulate Permutation Distributions for Some Elementary Experimental Designs

    ERIC Educational Resources Information Center

    Eudey, T. Lynn; Kerr, Joshua D.; Trumbo, Bruce E.

    2010-01-01

    Null distributions of permutation tests for two-sample, paired, and block designs are simulated using the R statistical programming language. For each design and type of data, permutation tests are compared with standard normal-theory and nonparametric tests. These examples (often using real data) provide for classroom discussion use of metrics…
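
    The article itself works in R; a comparable simulation of the permutation null distribution for a two-sample design can be sketched in Python as follows (the data here are simulated for illustration).

        import numpy as np

        def two_sample_permutation(x, y, n_perm=10000, seed=0):
            """Simulate the permutation null distribution of the difference in means
            for a two-sample design and return a two-sided p-value."""
            rng = np.random.default_rng(seed)
            observed = np.mean(x) - np.mean(y)
            pooled = np.concatenate([x, y])
            diffs = np.empty(n_perm)
            for i in range(n_perm):
                perm = rng.permutation(pooled)          # re-randomise group labels
                diffs[i] = np.mean(perm[:len(x)]) - np.mean(perm[len(x):])
            p = (1 + np.sum(np.abs(diffs) >= abs(observed))) / (n_perm + 1)
            return observed, p, diffs

        rng = np.random.default_rng(7)
        x = rng.normal(0.5, 1, 15)
        y = rng.normal(0.0, 1, 15)
        print(two_sample_permutation(x, y)[:2])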

  11. DEEP ATTRACTOR NETWORK FOR SINGLE-MICROPHONE SPEAKER SEPARATION.

    PubMed

    Chen, Zhuo; Luo, Yi; Mesgarani, Nima

    2017-03-01

    Despite the overwhelming success of deep learning in various speech processing tasks, the problem of separating simultaneous speakers in a mixture remains challenging. Two major difficulties in such systems are the arbitrary source permutation and the unknown number of sources in the mixture. We propose a novel deep learning framework for single channel speech separation by creating attractor points in the high dimensional embedding space of the acoustic signals which pull together the time-frequency bins corresponding to each source. Attractor points in this study are created by finding the centroids of the sources in the embedding space, which are subsequently used to determine the similarity of each bin in the mixture to each source. The network is then trained to minimize the reconstruction error of each source by optimizing the embeddings. The proposed model is different from prior works in that it implements end-to-end training, and it does not depend on the number of sources in the mixture. Two strategies are explored at test time, K-means and fixed attractor points, where the latter requires no post-processing and can be implemented in real time. We evaluated our system on the Wall Street Journal dataset and show a 5.49% improvement over the previous state-of-the-art methods.

  12. Typhoid fever acquired in the United States, 1999-2010: epidemiology, microbiology, and use of a space-time scan statistic for outbreak detection.

    PubMed

    Imanishi, M; Newton, A E; Vieira, A R; Gonzalez-Aviles, G; Kendall Scott, M E; Manikonda, K; Maxwell, T N; Halpin, J L; Freeman, M M; Medalla, F; Ayers, T L; Derado, G; Mahon, B E; Mintz, E D

    2015-08-01

    Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space-time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space-time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space-time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection.

  13. Circular permutation of a WW domain: Folding still occurs after excising the turn of the folding-nucleating hairpin

    PubMed Central

    Kier, Brandon L.; Anderson, Jordan M.; Andersen, Niels H.

    2014-01-01

    A hyperstable Pin1 WW domain has been circularly permuted via excision of the fold-nucleating turn; it still folds to form the native three-strand sheet and hydrophobic core features. Multiprobe folding dynamics studies of the normal and circularly permuted sequences, as well as their constituent hairpin fragments and comparable-length β-strand-loop-β-strand models, indicate 2-state folding for all topologies. N-terminal hairpin formation is the fold nucleating event for the wild-type sequence; the slower folding circular permutant has a more distributed folding transition state. PMID:24350581

  14. A multiagent evolutionary algorithm for constraint satisfaction problems.

    PubMed

    Liu, Jing; Zhong, Weicai; Jiao, Licheng

    2006-02-01

    With the intrinsic properties of constraint satisfaction problems (CSPs) in mind, we divide CSPs into two types, namely, permutation CSPs and nonpermutation CSPs. According to their characteristics, several behaviors are designed for agents by making use of the ability of agents to sense and act on the environment. These behaviors are controlled by means of evolution, so that the multiagent evolutionary algorithm for constraint satisfaction problems (MAEA-CSPs) results. To overcome the disadvantages of the general encoding methods, the minimum conflict encoding is also proposed. Theoretical analyses show that MAEA-CSPs has a linear space complexity and converges to the global optimum. The first part of the experiments uses 250 benchmark binary CSPs and 79 graph coloring problems from the DIMACS challenge to test the performance of MAEA-CSPs for nonpermutation CSPs. MAEA-CSPs is compared with six well-defined algorithms and the effect of the parameters is analyzed systematically. The second part of the experiments uses a classical CSP, n-queen problems, and a more practical case, job-shop scheduling problems (JSPs), to test the performance of MAEA-CSPs for permutation CSPs. The scalability of MAEA-CSPs along n for n-queen problems is studied with great care. The results show that MAEA-CSPs achieves good performance when n increases from 10^4 to 10^7, and has a linear time complexity. Even for 10^7-queen problems, MAEA-CSPs finds the solutions within only 150 seconds. For JSPs, 59 benchmark problems are used, and good performance is also obtained.

  15. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    DTIC Science & Technology

    2014-07-01

    to the query graph, or subgraph permutations with the same mismatch cost (often the case for homogeneous and/or symmetrical data/query). To avoid ... decisions are generated in a bottom-up manner using the metric of entropy at the cluster level (Figure 9c). Using the definition of belief messages ... for a cluster and a set of data nodes in this cluster, we compute the entropy for forward and backward messages as H = −∑ p log p

  16. Reducible boundary conditions in coupled channels

    NASA Astrophysics Data System (ADS)

    Pankrashkin, Konstantin

    2005-10-01

    We study Hamiltonians with point interactions in spaces of vector-valued functions. Using some information from the theory of quantum graphs, we describe a class of the operators which can be reduced to the direct sum of several one-dimensional problems. It is shown that such a reduction is closely connected with the invariance under channel permutations. Examples are provided by some 'model' interactions, in particular, the so-called δ, δ' and the Kirchhoff couplings.

  17. Altering the orientation of a fused protein to the RNA-binding ribosomal protein L7Ae and its derivatives through circular permutation.

    PubMed

    Ohuchi, Shoji J; Sagawa, Fumihiko; Sakamoto, Taiichi; Inoue, Tan

    2015-10-23

    RNA-protein complexes (RNPs) are useful for constructing functional nano-objects because a variety of functional proteins can be displayed on a designed RNA scaffold. Here, we report circular permutations of an RNA-binding protein L7Ae based on the three-dimensional structure information to alter the orientation of the displayed proteins on the RNA scaffold. An electrophoretic mobility shift assay and atomic force microscopy (AFM) analysis revealed that most of the designed circular permutants formed an RNP nano-object. Moreover, the alteration of the enhanced green fluorescent protein (EGFP) orientation was confirmed with AFM by employing EGFP on the L7Ae permutant on the RNA. The results demonstrate that targeted fine-tuning of the stereo-specific fixation of a protein on a protein-binding RNA is feasible by using the circular permutation technique. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Linear models: permutation methods

    USGS Publications Warehouse

    Cade, B.S.; Everitt, B.S.; Howell, D.C.

    2005-01-01

    Permutation tests (see Permutation Based Inference) for the linear model have applications in behavioral studies when traditional parametric assumptions about the error term in a linear model are not tenable. Improved validity of Type I error rates can be achieved with properly constructed permutation tests. Perhaps more importantly, increased statistical power, improved robustness to effects of outliers, and detection of alternative distributional differences can be achieved by coupling permutation inference with alternative linear model estimators. For example, it is well known that estimates of the mean in a linear model are extremely sensitive to even a single outlying value of the dependent variable compared to estimates of the median [7, 19]. Traditionally, linear modeling focused on estimating changes in the center of distributions (means or medians). However, quantile regression allows distributional changes to be estimated in all or any selected part of a distribution of responses, providing a more complete statistical picture that has relevance to many biological questions [6]...
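
    As a concrete illustration of permutation inference for a linear model, the Python sketch below tests a regression slope by repeatedly permuting the response, which breaks any association with the predictor under the null hypothesis. It is a generic permutation test, not the specific estimators or weighting schemes discussed in the article, and the simulated data and permutation count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data with a modest linear effect and heavy-tailed errors.
n = 40
x = rng.normal(size=n)
y = 0.5 * x + rng.standard_t(df=3, size=n)

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    xc, yc = x - x.mean(), y - y.mean()
    return np.dot(xc, yc) / np.dot(xc, xc)

observed = slope(x, y)

# Permutation null: shuffling y breaks any association with x, so the permuted
# slopes approximate the sampling distribution under H0: slope = 0.
n_perm = 9999
perm_slopes = np.array([slope(x, rng.permutation(y)) for _ in range(n_perm)])
p_value = (1 + np.sum(np.abs(perm_slopes) >= abs(observed))) / (n_perm + 1)
print(f"slope = {observed:.3f}, two-sided permutation p = {p_value:.4f}")
```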

  19. Altering the orientation of a fused protein to the RNA-binding ribosomal protein L7Ae and its derivatives through circular permutation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohuchi, Shoji J.; Sagawa, Fumihiko; Sakamoto, Taiichi

    RNA-protein complexes (RNPs) are useful for constructing functional nano-objects because a variety of functional proteins can be displayed on a designed RNA scaffold. Here, we report circular permutations of an RNA-binding protein L7Ae based on the three-dimensional structure information to alter the orientation of the displayed proteins on the RNA scaffold. An electrophoretic mobility shift assay and atomic force microscopy (AFM) analysis revealed that most of the designed circular permutants formed an RNP nano-object. Moreover, the alteration of the enhanced green fluorescent protein (EGFP) orientation was confirmed with AFM by employing EGFP on the L7Ae permutant on the RNA. The results demonstrate that targeted fine-tuning of the stereo-specific fixation of a protein on a protein-binding RNA is feasible by using the circular permutation technique.

  20. Using permutation tests to enhance causal inference in interrupted time series analysis.

    PubMed

    Linden, Ariel

    2018-06-01

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. The internal validity is strengthened considerably when the treated unit is contrasted with a comparable control group. In this paper, we introduce a robustness check based on permutation tests to further improve causal inference. We evaluate the effect of California's Proposition 99 for reducing cigarette sales by iteratively casting each nontreated state into the role of "treated," creating a comparable control group using the ITSAMATCH package in Stata, and then evaluating treatment effects using ITSA regression. If statistically significant "treatment effects" are estimated for pseudotreated states, then any significant changes in the outcome of the actual treatment unit (California) cannot be attributed to the intervention. We perform these analyses setting the cutpoint significance level to P > .40 for identifying balanced matches (the highest threshold possible for which controls could still be found for California) and use the difference in differences of trends as the treatment effect estimator. Only California attained a statistically significant treatment effect, strengthening confidence in the conclusion that Proposition 99 reduced cigarette sales. The proposed permutation testing framework provides an additional robustness check to either support or refute a treatment effect identified for the true treated unit in ITSA. Given its value and ease of implementation, this framework should be considered as a standard robustness test in all multiple group interrupted time series analyses. © 2018 John Wiley & Sons, Ltd.
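
    The placebo logic described above can be sketched in a few lines: estimate the same "effect" for every untreated unit and ask how extreme the treated unit's estimate is among those placebo estimates. The Python toy below uses simulated panel data and a simplified difference-in-trends estimator; it is not the ITSAMATCH/ITSA workflow in Stata, and the number of control units, intervention year, and effect size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy panel: yearly outcome for 1 treated unit and 20 controls, with a true
# post-intervention trend change only for the treated unit.
years = np.arange(1970, 2001)
post = years >= 1989
units = {}
for i in range(20):
    units[f"control_{i}"] = 100 - 0.5 * (years - 1970) + rng.normal(0, 2, len(years))
units["treated"] = (100 - 0.5 * (years - 1970)
                    - 1.5 * np.cumsum(post) + rng.normal(0, 2, len(years)))

def trend_change(y, years, post):
    """Simplified effect estimator: difference in OLS trends (post minus pre)."""
    def ols_slope(t, v):
        tc, vc = t - t.mean(), v - v.mean()
        return np.dot(tc, vc) / np.dot(tc, tc)
    return ols_slope(years[post], y[post]) - ols_slope(years[~post], y[~post])

effects = {name: trend_change(y, years, post) for name, y in units.items()}
treated_effect = effects["treated"]

# Placebo comparison: how extreme is the treated unit's trend change relative
# to the pseudo-effects estimated for the untreated units?
placebo = np.array([v for k, v in effects.items() if k != "treated"])
rank_p = (1 + np.sum(np.abs(placebo) >= abs(treated_effect))) / (len(placebo) + 1)
print(f"treated trend change = {treated_effect:.2f}, placebo p = {rank_p:.3f}")
```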

  1. Development of a time-trend model for analyzing and predicting case-pattern of Lassa fever epidemics in Liberia, 2013-2017.

    PubMed

    Olugasa, Babasola O; Odigie, Eugene A; Lawani, Mike; Ojo, Johnson F

    2015-01-01

    The objective was to develop a case-pattern model for Lassa fever (LF) among humans and derive predictors of the time-trend point distribution of LF cases in Liberia, in view of the prevailing under-reporting and the public health challenge posed by the disease in the country. Five years of retrospective data on the countrywide distribution of LF among humans were used to train a time-trend model of the disease in Liberia. A time-trend quadratic model was selected because of its goodness-of-fit (R2 = 0.89, P < 0.05) and its better performance compared with linear and exponential models. Parameter predictors were estimated by the least-squares method to predict LF cases for a prospective 5-year period covering 2013-2017. The two-stage predictive model of LF case patterns between 2013 and 2017 was characterized by a prospective decline within the South-coast County of Grand Bassa over the forecast period and an upward case trend within the Northern County of Nimba. A case-specific exponential increase was predicted for the first 2 years (2013-2014), with a geometric increase over the next 3 years (2015-2017) in Nimba County. This paper describes a translational application of the space-time distribution pattern of LF epidemics reported in Liberia in 2008-2012, on which a predictive model was developed. We propose a computationally feasible two-stage space-time permutation approach to estimate the time-trend parameters and conduct predictive inference on LF in Liberia.
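
    A minimal illustration of fitting a quadratic time-trend model by least squares and extrapolating it five years ahead is given below; the yearly case counts are hypothetical placeholders, not the Liberian surveillance data analyzed in the study.

```python
import numpy as np

# Hypothetical annual case counts for 2008-2012 (illustrative numbers only).
years = np.array([2008, 2009, 2010, 2011, 2012])
cases = np.array([12, 18, 27, 41, 60])

# Fit a quadratic time-trend model y = a*t^2 + b*t + c by least squares.
t = years - years.min()
coeffs = np.polyfit(t, cases, deg=2)
fitted = np.polyval(coeffs, t)

# Goodness of fit (R^2) and a five-year-ahead forecast, 2013-2017.
ss_res = np.sum((cases - fitted) ** 2)
ss_tot = np.sum((cases - cases.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
future_t = np.arange(t.max() + 1, t.max() + 6)
forecast = np.polyval(coeffs, future_t)
print(f"R^2 = {r2:.3f}")
for year, value in zip(future_t + years.min(), forecast):
    print(f"{year}: predicted {value:.1f} cases")
```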

  2. Towards the Development of a Global, Satellite-Based, Terrestrial Snow Mission Planning Tool

    NASA Technical Reports Server (NTRS)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  3. Generalized composite multiscale permutation entropy and Laplacian score based rolling bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zheng, Jinde; Pan, Haiyang; Yang, Shubao; Cheng, Junsheng

    2018-01-01

    Multiscale permutation entropy (MPE) is a recently proposed nonlinear dynamic method for measuring the randomness of a time series and detecting nonlinear dynamic changes, and it can be used effectively to extract nonlinear fault features from the vibration signals of rolling bearings. To address the drawback of the coarse-graining process in MPE, an improved method called generalized composite multiscale permutation entropy (GCMPE) is proposed in this paper. The influence of its parameters and its comparison with MPE are studied by analyzing simulated data. GCMPE is then applied to fault feature extraction from rolling-bearing vibration signals, and a new fault diagnosis method for rolling bearings is put forward that combines GCMPE, the Laplacian score for feature selection, and a particle swarm optimization based support vector machine. Finally, the proposed method is applied to experimental rolling-bearing data. The analysis results show that the proposed method can effectively diagnose rolling-bearing faults and achieves a higher fault recognition rate than existing methods.
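
    For orientation, a minimal Python sketch of permutation entropy with the standard coarse-graining used by MPE is given below. It implements the basic Bandt-Pompe ordinal-pattern entropy only, not the generalized composite refinement proposed in the paper, and the embedding order, scales, and white-noise test signal are arbitrary choices.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D series (Bandt-Pompe ordinal patterns)."""
    x = np.asarray(x, dtype=float)
    n_patterns = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n_patterns):
        window = x[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))    # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values())) / n_patterns
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(order))        # normalize to [0, 1]

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale` (standard MPE coarse-graining)."""
    n = len(x) // scale
    return np.asarray(x[: n * scale]).reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(3)
signal = rng.normal(size=4000)                 # white noise: PE stays near 1 across scales
for scale in (1, 2, 4, 8):
    pe = permutation_entropy(coarse_grain(signal, scale), order=4)
    print(f"scale {scale}: PE = {pe:.3f}")
```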

  4. Permutation inference for the general linear model

    PubMed Central

    Winkler, Anderson M.; Ridgway, Gerard R.; Webster, Matthew A.; Smith, Stephen M.; Nichols, Thomas E.

    2014-01-01

    Permutation methods can provide exact control of false positives and allow the use of non-standard statistics, making only weak assumptions about the data. With the availability of fast and inexpensive computing, their main limitation would be some lack of flexibility to work with arbitrary experimental designs. In this paper we report on results on approximate permutation methods that are more flexible with respect to the experimental design and nuisance variables, and conduct detailed simulations to identify the best method for settings that are typical for imaging research scenarios. We present a generic framework for permutation inference for complex general linear models (glms) when the errors are exchangeable and/or have a symmetric distribution, and show that, even in the presence of nuisance effects, these permutation inferences are powerful while providing excellent control of false positives in a wide range of common and relevant imaging research scenarios. We also demonstrate how the inference on glm parameters, originally intended for independent data, can be used in certain special but useful cases in which independence is violated. Detailed examples of common neuroimaging applications are provided, as well as a complete algorithm – the “randomise” algorithm – for permutation inference with the glm. PMID:24530839
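
    One of the simplest settings covered by such frameworks is the one-sample test under a symmetry assumption, where the permutations take the form of random sign flips. The Python sketch below shows only that special case; it is not the "randomise" algorithm, and the simulated data and number of sign flips are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)

# One-sample setting (e.g., subject-level contrast values): test H0: mean = 0
# using sign flips, which is valid when the errors are symmetric about zero.
data = rng.normal(loc=0.4, scale=1.0, size=25)

def t_stat(x):
    return x.mean() / (x.std(ddof=1) / np.sqrt(len(x)))

observed = t_stat(data)

n_perm = 9999
flipped = np.array([
    t_stat(data * rng.choice([-1, 1], size=len(data)))   # random sign flip per observation
    for _ in range(n_perm)
])
p_value = (1 + np.sum(np.abs(flipped) >= abs(observed))) / (n_perm + 1)
print(f"t = {observed:.2f}, sign-flip permutation p = {p_value:.4f}")
```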

  5. Modulation of a protein free-energy landscape by circular permutation.

    PubMed

    Radou, Gaël; Enciso, Marta; Krivov, Sergei; Paci, Emanuele

    2013-11-07

    Circular permutations usually retain the native structure and function of a protein while inevitably perturbing its folding dynamics. By using simulations with a structure-based model and a rigorous methodology to determine free-energy surfaces from trajectories, we evaluate the effect of a circular permutation on the free-energy landscape of the protein T4 lysozyme. We observe changes which, although subtle, largely affect the cooperativity between the two subdomains. Such a change in cooperativity has been previously experimentally observed and recently also characterized using single molecule optical tweezers and the Crooks relation. The free-energy landscapes show that both the wild type and circular permutant have an on-pathway intermediate, previously experimentally characterized, in which one of the subdomains is completely formed. The landscapes, however, differ in the position of the rate-limiting step for folding, which occurs before the intermediate in the wild type and after in the circular permutant. This shift of transition state explains the observed change in the cooperativity. The underlying free-energy landscape thus provides a microscopic description of the folding dynamics and the connection between circular permutation and the loss of cooperativity experimentally observed.

  6. PNT Activities at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Sands, Obed

    2017-01-01

    This presentation provides a review of Position Navigation and Timing activities at the Glenn Research Center. Topics include 1) contributions to simulation studies for the Space Service Volume of the Global Navigation Satellite System, 2) development and integration efforts for a Software Defined Radio (SDR) waveform for the Space Communications and Navigation (SCaN) testbed, currently onboard the International Space Station and 3) a GPS L5 testbed intended to explore terrain mapping capabilities with communications signals. Future directions are included and a brief discussion of NASA, GRC and the SCAN office.

  7. ICV Echo Ultrasound Scan

    NASA Image and Video Library

    2012-12-31

    View of Integrated Cardiovascular (ICV) Echo Ultrasound Scan, in the Columbus module. ICV aims to quantify the extent, time course and clinical significance of cardiac atrophy (decrease in the size of the heart muscle) in space. Photo was taken during Expedition 34.

  8. Toward a general theory of conical intersections in systems of identical nuclei

    NASA Astrophysics Data System (ADS)

    Keating, Sean P.; Mead, C. Alden

    1987-02-01

    It has been shown previously that the Herzberg-Longuet-Higgins sign change produced in Born-Oppenheimer electronic wave functions when the nuclei traverse a closed path around a conical intersection has implications for the symmetry of wave functions under permutations of identical nuclei. For systems of three or four identical nuclei, there are special features present which have facilitated the detailed analysis. The present paper reports progress toward a general theory for systems of n nuclei. For n=3 or 4, the two key functions which locate conical intersections and define compensating phase factors can conveniently be defined so as to transform under permutations according to a two-dimensional irreducible representation of the permutation group. Since such representations do not exist for n>4, we have chosen to develop a formalism in terms of lab-fixed electronic basis functions, and we show how to define the two key functions in principle. The functions so defined both turn out to be totally symmetric under permutations. We show how they can be used to define compensating phase factors so that all modified electronic wave functions are either totally symmetric or totally antisymmetric under permutations. A detailed analysis is made of cyclic permutations in the neighborhood of Dnh symmetry, which can be extended by continuity arguments to more general configurations, and criteria are obtained for sign changes. There is a qualitative discussion of the treatment of more general permutations.

  9. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much.

    PubMed

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance.
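
    A toy comparison of the two scan orders is easy to set up for a bivariate Gaussian target, where each full conditional is itself Gaussian. The Python sketch below only illustrates the two update schedules; it does not reproduce the counterexamples or mixing-time bounds of the paper, and the correlation, chain length, and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)
rho = 0.9   # correlation of the standard bivariate normal target

def gibbs(n_steps, scan="systematic"):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Each full conditional is x_i | x_j ~ N(rho * x_j, 1 - rho**2)."""
    x = np.zeros(2)
    samples = np.empty((n_steps, 2))
    for t in range(n_steps):
        if scan == "systematic":
            order = (0, 1)                    # deterministic sweep: update x0, then x1
        else:
            order = (int(rng.integers(2)),)   # random scan: one uniformly chosen coordinate
        for i in order:
            j = 1 - i
            x[i] = rng.normal(rho * x[j], np.sqrt(1 - rho**2))
        samples[t] = x
    return samples

for scan in ("systematic", "random"):
    s = gibbs(20_000, scan)
    print(f"{scan:>10} scan: sample correlation = {np.corrcoef(s.T)[0, 1]:.3f}")
```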

  10. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much

    PubMed Central

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance. PMID:28344429

  11. Permutation parity machines for neural cryptography.

    PubMed

    Reyes, Oscar Mauricio; Zimmermann, Karl-Heinz

    2010-06-01

    Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.

  12. Inference for Distributions over the Permutation Group

    DTIC Science & Technology

    2008-05-01

    world problems, such as voting, ranking, and data association. Representing uncertainty over permutations is challenging, since there are n! possibilities ... the Kronecker (or Tensor) Product Representation. In general, the Kronecker product representation is reducible, and so it can be decomposed into a direct

  13. Students' Errors in Solving the Permutation and Combination Problems Based on Problem Solving Steps of Polya

    ERIC Educational Resources Information Center

    Sukoriyanto; Nusantara, Toto; Subanji; Chandra, Tjang Daniel

    2016-01-01

    This article was written based on the results of a study evaluating students' errors in solving permutation and combination problems in terms of Polya's problem-solving steps. Twenty-five students were asked to do four problems related to permutation and combination. The research results showed that the students still made mistakes in…

  14. Permutation parity machines for neural cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyes, Oscar Mauricio; Escuela de Ingenieria Electrica, Electronica y Telecomunicaciones, Universidad Industrial de Santander, Bucaramanga; Zimmermann, Karl-Heinz

    2010-06-15

    Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.

  15. Discrete-time Quantum Walks via Interchange Framework and Memory in Quantum Evolution

    NASA Astrophysics Data System (ADS)

    Dimcovic, Zlatko

    One of the newer and rapidly developing approaches in quantum computing is based on "quantum walks," which are quantum processes on discrete space that evolve in either discrete or continuous time and are characterized by mixing of components at each step. The idea emerged in analogy with the classical random walks and stochastic techniques, but these unitary processes are very different even as they have intriguing similarities. This thesis is concerned with the study of discrete-time quantum walks. The original motivation from classical Markov chains required, for discrete-time quantum walks, that one add an auxiliary Hilbert space, unrelated to the one in which the system evolves, in order to be able to mix components in that space and then take the evolution steps accordingly (based on the state in that space). This additional, "coin," space is very often an internal degree of freedom like spin. We have introduced a general framework for the construction of discrete-time quantum walks, in close analogy with classical random walks with memory, that is rather different from the standard "coin" approach. In this method there is no need to bring in a different degree of freedom, while the full state of the system is still described in the direct product of spaces (of states). The state can be thought of as an arrow pointing from the previous to the current site in the evolution, representing the one-step memory. The next step is then controlled by a single local operator assigned to each site in the space, acting quite like a scattering operator. This allows us to probe and solve some problems of interest that have not had successful approaches with "coined" walks. We construct and solve a walk on the binary tree, a structure of great interest that, until our result, had no explicit discrete-time quantum walk because of the difficulty of managing the coin spaces necessary in the standard approach. Beyond algorithmic interests, the model based on memory allows one to explore effects of history on the quantum evolution and the subtle emergence of classical features as "memory" is explicitly kept for additional steps. We construct and solve a walk with an additional correlation step, finding interesting new features. On the other hand, the fact that the evolution is driven entirely by a local operator, not involving additional spaces, enables us to choose the Fourier transform as an operator completely controlling the evolution. This in turn allows us to combine the quantum walk approach with Fourier transform based techniques, something decidedly not possible in classical computational physics. We are developing a formalism for building networks manageable by walks constructed with this framework, based on the surprising efficiency of our framework in discovering the internals of a simple network that we have solved so far. Finally, in line with our expectation that the field of quantum walks can take cues from the rich history of development of the classical stochastic techniques, we establish starting points for the work on non-Abelian quantum walks, with a particular quantum-walk analog of the classical "card shuffling," the walk on the permutation group. In summary, this thesis presents a new framework for the construction of discrete-time quantum walks, employing and exploring the memoried nature of unitary evolution. It is applied to fully solving the problems of a walk on the binary tree and the exploration of the quantum-to-classical transition with increased correlation length (history).
It is then used for simple network discovery, and to lay the groundwork for analysis of complex networks, based on the combined power of efficient exploration of the Hilbert space (as a walk mixing components) and Fourier transformation (since we can choose this for the evolution operator). We hope to establish this as a general technique, as its power would be unmatched by any approaches available in classical computing. We also looked at the promising and challenging prospect of walks on non-Abelian structures by setting up the problem of "quantum card shuffling," a quantum walk on the permutation group. Relation to other work is thoroughly discussed throughout, along with examination of the context of our work and overviews of our current and future work.

  16. Tropical rain mapping radar on the Space Station

    NASA Technical Reports Server (NTRS)

    Im, Eastwood; Li, Fuk

    1989-01-01

    The conceptual design for a tropical rain mapping radar for flight on the manned Space Station is discussed. In this design the radar utilizes a narrow, dual-frequency (9.7 GHz and 24.1 GHz) beam, electronically scanned antenna to achieve high spatial (4 km) and vertical (250 m) resolutions and a relatively large (800 km) cross-track swath. An adaptive scan strategy will be used for better utilization of radar energy and dwell time. Such a system can detect precipitation at rates of up to 100 mm/hr with accuracies of roughly 15 percent. With the proposed space-time sampling strategy, the monthly averaged rainfall rate can be estimated to within 8 percent, which is essential for many climatological studies.

  17. Space-Derived Sewer Monitor

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The QuadraScan Longterm Flow Monitoring System is a second generation sewer monitor developed by American Digital Systems, Inc.'s founder Peter Petroff. Petroff, a former spacecraft instrumentation designer at Marshall Space Flight Center, used expertise based on principles acquired in Apollo and other NASA programs. QuadraScan borrows even more heavily from space technology, for example in its data acquisition and memory system derived from NASA satellites. "One-time" measurements are often plagued with substantial errors due to the flow of groundwater absorbed into the system. These system sizing errors stem from a basic informational deficiency: accurate, reliable data on how much water flows through a sewer system over a long period of time is very difficult to obtain. City officials are turning to "permanent," or long-term sewer monitoring systems. QuadraScan offers many advantages to city officials such as the early warning capability to effectively plan for city growth in order to avoid the crippling economic impact of bans on new sewer connections in effect in many cities today.

  18. Alignment-Independent Comparisons of Human Gastrointestinal Tract Microbial Communities in a Multidimensional 16S rRNA Gene Evolutionary Space

    PubMed Central

    Rudi, Knut; Zimonja, Monika; Kvenshagen, Bente; Rugtveit, Jarle; Midtvedt, Tore; Eggesbø, Merete

    2007-01-01

    We present a novel approach for comparing 16S rRNA gene clone libraries that is independent of both DNA sequence alignment and definition of bacterial phylogroups. These steps are the major bottlenecks in current microbial comparative analyses. We used direct comparisons of taxon density distributions in an absolute evolutionary coordinate space. The coordinate space was generated by using alignment-independent bilinear multivariate modeling. Statistical analyses for clone library comparisons were based on multivariate analysis of variance, partial least-squares regression, and permutations. Clone libraries from both adult and infant gastrointestinal tract microbial communities were used as biological models. We reanalyzed a library consisting of 11,831 clones covering complete colons from three healthy adults in addition to a smaller 390-clone library from infant feces. We show that it is possible to extract detailed information about microbial community structures using our alignment-independent method. Our density distribution analysis is also very efficient with respect to computer operation time, meeting the future requirements of large-scale screenings to understand the diversity and dynamics of microbial communities. PMID:17337554

  19. Multiscale permutation entropy analysis of laser beam wandering in isotropic turbulence.

    PubMed

    Olivares, Felipe; Zunino, Luciano; Gulich, Damián; Pérez, Darío G; Rosso, Osvaldo A

    2017-10-01

    We have experimentally quantified the temporal structural diversity from the coordinate fluctuations of a laser beam propagating through isotropic optical turbulence. The main focus here is on the characterization of the long-range correlations in the wandering of a thin Gaussian laser beam over a screen after propagating through a turbulent medium. To fulfill this goal, a laboratory-controlled experiment was conducted in which coordinate fluctuations of the laser beam were recorded at a sufficiently high sampling rate for a wide range of turbulent conditions. Horizontal and vertical displacements of the laser beam centroid were subsequently analyzed by implementing the symbolic technique based on ordinal patterns to estimate the well-known permutation entropy. We show that the permutation entropy estimations at multiple time scales evidence an interplay between different dynamical behaviors. More specifically, a crossover between two different scaling regimes is observed. We confirm a transition from an integrated stochastic process contaminated with electronic noise to a fractional Brownian motion with a Hurst exponent H=5/6 as the sampling time increases. Besides, we are able to quantify, from the estimated entropy, the amount of electronic noise as a function of the turbulence strength. We have also demonstrated that these experimental observations are in very good agreement with numerical simulations of noisy fractional Brownian motions with a well-defined crossover between two different scaling regimes.

  20. A secure transmission scheme of streaming media based on the encrypted control message

    NASA Astrophysics Data System (ADS)

    Li, Bing; Jin, Zhigang; Shu, Yantai; Yu, Li

    2007-09-01

    As the use of streaming media applications has increased dramatically in recent years, streaming media security has become an important concern for protecting privacy. This paper proposes a new encryption scheme that takes into account the characteristics of streaming media and the shortcomings of existing methods: encrypt the control message of the streaming media at a high security level, and permute and confuse the non-control data according to the corresponding control message. Here the control message refers to the key data of the streaming media, including the streaming media header, the header of each video frame, and the seed key. We encrypt the control message using a public-key encryption algorithm that provides a high security level, such as RSA. At the same time, we use the seed key to generate a key stream, from which the permutation list P corresponding to each GOP (group of pictures) is derived. The plaintext of the non-control message is XORed with the key stream to give an intermediate ciphertext, which is then permuted according to P. The decryption process is the inverse of the above. We have set up a testbed for this scheme and found it to be six to eight times faster than the conventional method. It can be applied not only between PCs but also between handheld devices.
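
    A schematic of the permute-and-XOR step applied to the non-control data is sketched below in Python. The keystream generator (SHA-256 in counter mode), the block size, and the payload are stand-ins chosen for illustration; the RSA protection of the control message, the real codec structure, and per-GOP handling are omitted, so this is not the authors' implementation.

```python
import hashlib
import random

def keystream(seed_key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode (illustrative stand-in, not the paper's generator)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(seed_key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def permutation_list(seed_key: bytes, n_blocks: int):
    """Key-dependent permutation list P; the receiver re-derives it from the seed key."""
    perm = list(range(n_blocks))
    random.Random(hashlib.sha256(seed_key).hexdigest()).shuffle(perm)
    return perm

def encrypt_payload(payload: bytes, seed_key: bytes, block: int = 16) -> bytes:
    """XOR the non-control payload with the keystream, then permute fixed-size blocks.
    Assumes the payload length is a multiple of the block size."""
    ks = keystream(seed_key, len(payload))
    xored = bytes(a ^ b for a, b in zip(payload, ks))
    blocks = [xored[i:i + block] for i in range(0, len(xored), block)]
    perm = permutation_list(seed_key, len(blocks))
    return b"".join(blocks[p] for p in perm)

def decrypt_payload(cipher: bytes, seed_key: bytes, block: int = 16) -> bytes:
    blocks = [cipher[i:i + block] for i in range(0, len(cipher), block)]
    perm = permutation_list(seed_key, len(blocks))
    restored = [b""] * len(blocks)
    for new_pos, old_pos in enumerate(perm):
        restored[old_pos] = blocks[new_pos]       # undo the block permutation
    xored = b"".join(restored)
    ks = keystream(seed_key, len(xored))
    return bytes(a ^ b for a, b in zip(xored, ks))

seed_key = b"seed-key-carried-in-the-encrypted-control-message"   # hypothetical key
frame = bytes(range(64))                                          # stand-in non-control payload
assert decrypt_payload(encrypt_payload(frame, seed_key), seed_key) == frame
print("round trip OK")
```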

  1. Hotspot detection using space-time scan statistics on children under five years of age in Depok

    NASA Astrophysics Data System (ADS)

    Verdiana, Miranti; Widyaningsih, Yekti

    2017-03-01

    Among the problems affecting health levels in Depok are persistently high malnutrition rates and the spread of infectious and non-communicable diseases in some areas. Children under five years old are a vulnerable part of the population with respect to malnutrition and disease. For this reason, it is important to identify where and when malnutrition in Depok occurred with high intensity. To obtain the locations and times of hotspots of malnutrition and of diseases attacking children under five, the space-time scan statistic can be used. The space-time scan statistic is a hotspot detection method in which spatial and temporal information are taken into account simultaneously. It detects hotspots with a cylindrical scanning window: the base of the cylinder describes the area and its height describes the time. Each cylinder formed is a candidate hotspot and requires hypothesis testing to decide whether it can be declared a hotspot. Hotspot detection in this study was carried out for combinations of several variables. Some combinations of variables gave detection results that tend to be the same, and so form groups (clusters). For the health of children under five in Depok, the Beji health care center region was a hotspot in 2011-2012. Across the variable combinations used for hotspot detection, the Beji health care center appeared most frequently as a hotspot. These results may help the local government adopt appropriate policies to improve the health of children under five in Depok.
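
    The cylinder idea described above can be sketched compactly: score every (area, time-window) cylinder by comparing its observed case count with the count expected if space and time were independent, and judge the maximum score by Monte Carlo permutation of the case dates, in the spirit of the space-time permutation scan statistic. The Python toy below uses made-up case records, single-area cylinder bases, and a small replicate count; real analyses are usually run with SaTScan.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy case records: each case has an area label (e.g., a health-center region)
# and an onset day within a 60-day study period.
locations = rng.integers(0, 5, size=200)
days = rng.integers(0, 60, size=200)

def llr(obs, exp, total):
    """Poisson log-likelihood ratio for a candidate cylinder (Kulldorff-style score)."""
    if obs <= exp or obs == total:
        return 0.0
    return obs * np.log(obs / exp) + (total - obs) * np.log((total - obs) / (total - exp))

def scan(locations, days, max_window=7):
    """Scan cylinders whose base is one area and whose height is a window of up to
    max_window days; the expected count assumes space and time are independent."""
    total = len(days)
    best = 0.0
    for loc in np.unique(locations):
        n_loc = np.sum(locations == loc)
        for start in range(int(days.max()) + 1):
            for width in range(1, max_window + 1):
                in_time = (days >= start) & (days < start + width)
                obs = int(np.sum((locations == loc) & in_time))
                exp = n_loc * in_time.sum() / total
                best = max(best, llr(obs, exp, total))
    return best

observed = scan(locations, days)

# Monte Carlo inference: permuting onset days over cases breaks any space-time
# interaction while preserving the purely spatial and purely temporal margins.
replicates = [scan(locations, rng.permutation(days)) for _ in range(99)]
p_value = (1 + sum(r >= observed for r in replicates)) / (1 + len(replicates))
print(f"max log-likelihood ratio = {observed:.2f}, permutation p = {p_value:.3f}")
```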

  2. Determining distinct circuit in complete graphs using permutation

    NASA Astrophysics Data System (ADS)

    Karim, Sharmila; Ibrahim, Haslinda; Darus, Maizon Mohd

    2017-11-01

    The Half Butterfly Method (HBM) is a method introduced to construct the distinct circuits in complete graphs using the concept of isomorphism. The Half Butterfly Method has been applied in the field of combinatorics, for example in listing permutations of n elements. However, determining distinct circuits using HBM becomes tedious for n > 4. Thus, in this paper, we present a method for generating distinct circuits using permutations.
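
    For small n, the distinct circuits can also be obtained by brute force directly from permutations, which makes the (n-1)!/2 count explicit. The Python sketch below does exactly that; it is a naive enumeration for illustration and is not the Half Butterfly Method.

```python
from itertools import permutations
from math import factorial

def distinct_circuits(n):
    """Enumerate the distinct Hamiltonian circuits of the complete graph K_n:
    fix vertex 0 as the starting point and keep one traversal per reversal pair,
    so every undirected circuit is listed exactly once."""
    circuits = []
    for middle in permutations(range(1, n)):
        if middle[0] < middle[-1]:        # skip the reversed copy of the same circuit
            circuits.append((0,) + middle + (0,))
    return circuits

for n in range(3, 7):
    found = distinct_circuits(n)
    print(f"K_{n}: {len(found)} distinct circuits, (n-1)!/2 = {factorial(n - 1) // 2}")
```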

  3. A Versatile Platform for Nanotechnology Based on Circular Permutation of a Chaperonin Protein

    NASA Technical Reports Server (NTRS)

    Paavola, Chad; McMillan, Andrew; Trent, Jonathan; Chan, Suzanne; Mazzarella, Kellen; Li, Yi-Fen

    2004-01-01

    A number of protein complexes have been developed as nanoscale templates. These templates can be functionalized using the peptide sequences that bind inorganic materials. However, it is difficult to integrate peptides into a specific position within a protein template. Integrating intact proteins with desirable binding or catalytic activities is an even greater challenge. We present a general method for modifying protein templates using circular permutation so that additional peptide sequence can be added in a wide variety of specific locations. Circular permutation is a reordering of the polypeptide chain such that the original termini are joined and new termini are created elsewhere in the protein. New sequence can be joined to the protein termini without perturbing the protein structure and with minimal limitation on the size and conformation of the added sequence. We have used this approach to modify a chaperonin protein template, placing termini at five different locations distributed across the surface of the protein complex. These permutants are competent to form the double-ring structures typical of chaperonin proteins. The permuted double-rings also form the same assemblies as the unmodified protein. We fused a fluorescent protein to two representative permutants and demonstrated that it assumes its active structure and does not interfere with assembly of chaperonin double-rings.

  4. Spatiotemporal Permutation Entropy as a Measure for Complexity of Cardiac Arrhythmia

    NASA Astrophysics Data System (ADS)

    Schlemmer, Alexander; Berg, Sebastian; Lilienkamp, Thomas; Luther, Stefan; Parlitz, Ulrich

    2018-05-01

    Permutation entropy (PE) is a robust quantity for measuring the complexity of time series. In the cardiac community it is predominantly used in the context of electrocardiogram (ECG) signal analysis for diagnoses and predictions, with a major application found in heart rate variability parameters. In this article we combine spatial and temporal PE to form a spatiotemporal PE that captures both the complexity of spatial structures and temporal complexity at the same time. We demonstrate that the spatiotemporal PE (STPE) quantifies complexity using two datasets from simulated cardiac arrhythmia and compare it to phase singularity analysis and spatial PE (SPE). These datasets simulate ventricular fibrillation (VF) on a two-dimensional and a three-dimensional medium using the Fenton-Karma model. We show that SPE and STPE are robust against noise and demonstrate their usefulness for extracting complexity features at different spatial scales.

  5. Consultation sequencing of a hospital with multiple service points using genetic programming

    NASA Astrophysics Data System (ADS)

    Morikawa, Katsumi; Takahashi, Katsuhiko; Nagasawa, Keisuke

    2018-07-01

    A hospital with one consultation room operated by a physician and several examination rooms is investigated. Scheduled patients and walk-ins arrive at the hospital, each patient goes to the consultation room first, and some of them visit other service points before consulting the physician again. The objective function consists of the sum of three weighted average waiting times. We focus on the problem of sequencing patients for consultation. To alleviate the stress of waiting, the consultation sequence is displayed. A dispatching rule is used to decide the sequence, and the best rules are explored by genetic programming (GP). The simulation experiments indicate that the rules produced by GP can be reduced to simple permutations of queues, and that the best permutation depends on the weights used in the objective function. This implies that a balanced allocation of waiting times can be achieved by ordering the priority among the three queues.

  6. Retrospective space-time cluster analysis of whooping cough, re-emergence in Barcelona, Spain, 2000-2011.

    PubMed

    Solano, Rubén; Gómez-Barroso, Diana; Simón, Fernando; Lafuente, Sarah; Simón, Pere; Rius, Cristina; Gorrindo, Pilar; Toledo, Diana; Caylà, Joan A

    2014-05-01

    A retrospective, space-time study of whooping cough cases reported to the Public Health Agency of Barcelona, Spain between the years 2000 and 2011 is presented. It is based on 633 individual whooping cough cases and the 2006 population census from the Spanish National Statistics Institute, stratified by age and sex at the census tract level. Cluster identification was attempted using the space-time scan statistic assuming a Poisson distribution and restricting temporal extent to 7 days and spatial distance to 500 m. Statistical calculations were performed with Stata 11 and SaTScan, and mapping was performed with ArcGIS 10.0. Only clusters showing statistical significance (P <0.05) were mapped. The most likely cluster identified included five census tracts located in three neighbourhoods in central Barcelona during the week from 17 to 23 August 2011. This cluster included five cases compared with the expected level of 0.0021 (relative risk = 2436, P <0.001). In addition, 11 secondary significant space-time clusters were detected, with secondary clusters occurring at different times and localizations. Spatial statistics are felt to be useful for complementing epidemiological surveillance systems by visualizing excesses in the number of cases in space and time, thus increasing the possibility of identifying outbreaks not reported by the surveillance system.

  7. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background: In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods: We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results: We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions: We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815

  8. Rank score and permutation testing alternatives for regression quantile estimates

    USGS Publications Warehouse

    Cade, B.S.; Richards, J.D.; Mielke, P.W.

    2006-01-01

    Performance of quantile rank score tests used for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1) was evaluated by simulation for models with p = 2 and 6 predictors, moderate collinearity among predictors, homogeneous and heterogeneous errors, small to moderate samples (n = 20–300), and central to upper quantiles (0.50–0.99). Test statistics evaluated were the conventional quantile rank score T statistic distributed as a χ2 random variable with q degrees of freedom (where q parameters are constrained by H0) and an F statistic with its sampling distribution approximated by permutation. The permutation F-test maintained better Type I errors than the T-test for homogeneous error models with smaller n and more extreme quantiles τ. An F distributional approximation of the F statistic provided some improvements in Type I errors over the T-test for models with > 2 parameters, smaller n, and more extreme quantiles, but not as much improvement as the permutation approximation. Both rank score tests required weighting to maintain correct Type I errors when heterogeneity under the alternative model increased to 5 standard deviations across the domain of X. A double permutation procedure was developed to provide valid Type I errors for the permutation F-test when null models were forced through the origin. Power was similar for conditions where both T- and F-tests maintained correct Type I errors, but the F-test provided some power at smaller n and extreme quantiles when the T-test had no power because of excessively conservative Type I errors. When the double permutation scheme was required for the permutation F-test to maintain valid Type I errors, power was less than for the T-test with decreasing sample size and increasing quantiles. Confidence intervals on parameters and tolerance intervals for future predictions were constructed based on test inversion for an example application relating trout densities to stream channel width:depth.

  9. Identification of IL-7 as a candidate disease mediator in osteoarthritis in Chinese Han population: a case-control study.

    PubMed

    Zhang, Hong-Xin; Wang, Yan-Gui; Lu, Shun-Yuan; Lu, Xiong-Xiong; Liu, Jie

    2016-09-01

    Little is known about biochemical mediators such as IL-7 that correlate with the initiation and progression of OA. We performed this study to assess the role of variants of IL-7 in OA susceptibility in the Chinese Han population. We performed a retrospective, case-control study in the Chinese Han population from 2013 to 2015. Four single nucleotide polymorphisms were genotyped (using a ligase detection reaction) in 602 patients and 454 controls. Differences between groups were analysed, and association was assessed by the odds ratio (OR) and 95% CI. Among these polymorphisms, rs2583764, rs2583760 and rs6993386 showed no significant association with OA in the Chinese Han population {rs2583764 [P-allele = 0.98651, P-genotype = 0.40392, OR (95% CI): 1.00162 (0.83066, 1.20775)]; rs2583760 [P-allele = 0.384500, P-genotype = 0.58752, OR (95% CI): 0.69859 (0.30996, 1.57449)]; rs6993386 [P-allele = 0.69525, P-genotype = 0.50712, OR (95% CI): 0.96432 (0.80406, 1.15653)]}. However, the results showed that the rs2583759 polymorphism was significantly associated with OA [P-allele = 0.00, P-genotype = 3.86 × 10^-30, OR (95% CI): 0.27794 (0.22407, 0.34476)], even after 10 000 permutations were performed (P-allele-permutation < 0.00010, P-genotype-permutation = 0.00010). Haplotype analyses showed that A-G-A-C, A-G-A-T and G-G-G-C of rs2583764-rs2583760-rs6993386-rs2583759 were risk factors for OA, both before and after the 10 000 permutations, indicating IL-7 to be associated with OA. There was a significant association between IL-7, especially rs2583759, and OA in the Chinese Han population. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved.

  10. Time needed to board an airplane: a power law and the structure behind it.

    PubMed

    Frette, Vidar; Hemmer, Per C

    2012-01-01

    A simple model for the boarding of an airplane is studied. Passengers have reserved seats but enter the airplane in arbitrary order. Queues are formed along the aisle, as some passengers have to wait to reach the seats for which they have reservations. We label a passenger by the number of his or her reserved seat. In most cases the boarding process is much slower than for the optimal situation, where passenger and seat orders are identical. We study this dynamical system by calculating the average boarding time when all permutations of N passengers are given equal weight. To first order, the boarding time for a given permutation (ordering) of the passengers is given by the number s of sequences of monotonically increasing values in the permutation. We show that the distribution of s is symmetric on [1,N], which leads to an average boarding time of (N+1)/2. We have found an exact expression for s and have shown that the full distribution of s approaches a normal distribution as N increases. However, there are significant corrections to the first-order results, due to certain correlations between passenger ordering and the substrate (seat ordering). This occurs for some cases in which the sequence of the seats is partially mirrored in the passenger ordering. These cases with correlations have a boarding time that is lower than predicted by the first-order results. The many cases with reduced boarding times have been classified. We also give some indicative results on the geometry of the correlations, with sorting into geometry groups. With increasing N, both the number of correlation types and the number of cases belonging to each type increase rapidly. Using enumeration we find that as a result of these correlations the average boarding time behaves like N^α, with α≃0.69, as compared with α=1.0 for the first-order approximation. © 2012 American Physical Society
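
    The first-order quantity in this model, the number s of maximal monotonically increasing runs in a random permutation, is easy to simulate, and its average should approach (N+1)/2. The short Python check below uses arbitrary passenger counts and trial numbers.

```python
import random

def increasing_runs(perm):
    """Number of maximal runs of monotonically increasing values in a permutation."""
    runs = 1
    for prev, cur in zip(perm, perm[1:]):
        if cur < prev:
            runs += 1                # a descent starts a new run
    return runs

def average_runs(n_passengers, n_trials=20_000, seed=6):
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        perm = list(range(n_passengers))
        rng.shuffle(perm)
        total += increasing_runs(perm)
    return total / n_trials

for n in (10, 50, 200):
    print(f"N = {n:3d}: simulated mean of s = {average_runs(n):6.2f},  (N+1)/2 = {(n + 1) / 2}")
```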

  11. Focal Gray Matter Plasticity as a Function of Long Duration Head Down Tilted Bed Rest: Preliminary Results

    NASA Technical Reports Server (NTRS)

    Koppelmans, V.; Erdeniz, B.; DeDios, Y. E.; Wood, S. J.; Reuter-Lorenz, P. A.; Kofman, I.; Bloomberg, J. J.; Mulavara, A. P.; Seidler, R. D.

    2014-01-01

    Long duration spaceflight (i.e., 22 days or longer) has been associated with changes in sensorimotor systems, resulting in difficulties that astronauts experience with posture control, locomotion, and manual control. The microgravity environment is an important causal factor for spaceflight induced sensorimotor changes. Whether these sensorimotor changes are solely related to peripheral changes from reduced vestibular stimulation, body unloading, and body fluid shifts, or whether they may also be related to structural and functional brain changes, is as yet unknown. However, a recent study reported associations between microgravity and flattening of the posterior eye globe and protrusion of the optic nerve [1], possibly as the result of increased intracranial pressure due to microgravity induced bodily fluid shifts [3]. Moreover, elevated intracranial pressure has been related to white matter microstructural damage [2]. Thus, it is possible that spaceflight may affect brain structure and thereby cognitive functioning. Long duration head down tilt bed rest has been suggested as an exclusionary analog to study microgravity effects on the sensorimotor system [4]. Bed rest mimics microgravity in body unloading and bodily fluid shifts. In consideration of the health and performance of crewmembers both in- and post-flight, we are conducting a prospective longitudinal 70-day bed rest study as an analog to investigate the effects of microgravity on brain structure [5]. Here we present results for the first six subjects. Six subjects were assessed at 12 and 7 days before-, at 7, 30, and 70 days in-, and at 8 and 12 days post 70 days of bed rest at the NASA bed rest facility in UTMB, Galveston, TX, USA. At each time point structural MRI scans (i.e., high resolution T1-weighted imaging and Diffusion Tensor Imaging (DTI)) were obtained using a 3T Siemens scanner. Focal changes over time in gray matter density were assessed using the voxel based morphometry 8 (VBM8) toolbox under SPM. Longitudinal processing in VBM8 includes linear registration of each scan to the mean of the subject and subsequently transforming all scans into MNI space by applying the warp from the mean subject to MNI to the individual gray matter segmentations. Modulation was applied so that all images represented the volume of the original structure in native space. Voxel wise analysis was carried out on the gray matter images after smoothing, using a flexible factorial design with family wise error correction. Focal changes in white matter microstructural integrity were assessed using tract based spatial statistics (TBSS) as part of the FMRIB Software Library (FSL). TBSS registers all DTI scans to standard space. It subsequently creates a study specific white matter skeleton of the major white matter tracts. For each subject, for each DTI metric (i.e. fractional anisotropy (FA), mean diffusivity (MD), axial diffusivity (AD), and radial diffusivity (RD)), the maximum value in a line perpendicular to the skeleton tract is projected to the skeleton. Non-parametric permutation based t-tests and ANOVAs were used for voxel-wise comparison of the skeletons. For both VBM and TBSS, comparison of pre bed rest measurements did not show significant differences. VBM analysis revealed decreased gray matter density in bilateral areas including the frontal medial cortex, the insular cortex and the caudate (see Figure) from 'pre to in bed rest'.
Over the same time period, there was an increase in gray matter density in the cerebellum and the occipital and parietal cortices, including the precuneus (see Figure). The majority of these changes did not recover from 'during to post bed rest'. TBSS analysis did not reveal significant changes in white matter microstructural integrity after correction for multiple comparisons. Uncorrected analyses (p<.015) revealed an increase in RD in the cerebellum and brainstem from pre bed rest to the first week in bed rest that did not recover post bed rest. Extended bed rest, which is an analog for microgravity, can result in gray matter changes and potentially in microstructural white matter changes in areas that are important for neuromotor behavior and cognition. These changes did not recover at two weeks post bed rest. Whether the effects of bed rest wear off at longer times post bed rest, and whether they are associated with behavior, are important questions that warrant further research.

  12. Inter-satellite laser link acquisition with dual-way scanning for Space Advanced Gravity Measurements mission

    NASA Astrophysics Data System (ADS)

    Zhang, Jing-Yi; Ming, Min; Jiang, Yuan-Ze; Duan, Hui-Zong; Yeh, Hsien-Chi

    2018-06-01

    Laser link acquisition is a key technology for inter-satellite laser ranging and laser communication. In this paper, we present an acquisition scheme based on the differential power sensing method with dual-way scanning, which will be used in the next-generation gravity measurement mission proposed in China, called Space Advanced Gravity Measurements (SAGM). In this scheme, the laser beams emitted from two satellites are power-modulated at different frequencies to enable the signals of the two beams to be measured distinguishably, and their corresponding pointing angles are determined by using the differential power sensing method. As the master laser beam and the slave laser beam are decoupled, the dual-way scanning method, in which the laser beams of both the master and the slave satellites scan uncertainty cones simultaneously and independently, can be used, instead of the commonly used single-way scanning method, in which the laser beam of one satellite scans and that of the other one stares. Therefore, the acquisition time is reduced significantly. Numerical simulation and experiments of the acquisition process are performed using the design parameters of the SAGM mission. The results show that the average acquisition time is less than 10 s for a scanning range of 1-mrad radius with a success rate of more than 99%.

  13. A 1.375-approximation algorithm for sorting by transpositions.

    PubMed

    Elias, Isaac; Hartman, Tzvika

    2006-01-01

    Sorting permutations by transpositions is an important problem in genome rearrangements. A transposition is a rearrangement operation in which a segment is cut out of the permutation and pasted in a different location. The complexity of this problem is still open and it has been a 10-year-old open problem to improve the best known 1.5-approximation algorithm. In this paper, we provide a 1.375-approximation algorithm for sorting by transpositions. The algorithm is based on a new upper bound on the diameter of 3-permutations. In addition, we present some new results regarding the transposition diameter: we improve the lower bound for the transposition diameter of the symmetric group and determine the exact transposition diameter of simple permutations.
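
    The transposition operation itself is simple to state in code: two adjacent blocks of the permutation exchange places, which is the same as cutting out a segment and pasting it elsewhere. The small Python helper below only illustrates the operation on an example; it is not the 1.375-approximation algorithm of the paper.

```python
def transposition(perm, i, j, k):
    """Transposition in the genome-rearrangement sense: the adjacent blocks
    perm[i:j] and perm[j:k] exchange places (a segment is cut out and pasted)."""
    assert 0 <= i < j < k <= len(perm)
    return perm[:i] + perm[j:k] + perm[i:j] + perm[k:]

p = [3, 1, 2, 5, 4]
p = transposition(p, 0, 1, 3)    # move block [3] past [1, 2] -> [1, 2, 3, 5, 4]
p = transposition(p, 3, 4, 5)    # swap blocks [5] and [4]    -> [1, 2, 3, 4, 5]
print(p)
```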

  14. Permutational distribution of the log-rank statistic under random censorship with applications to carcinogenicity assays.

    PubMed

    Heimann, G; Neuhaus, G

    1998-03-01

    In the random censorship model, the log-rank test is often used for comparing a control group with different dose groups. If the number of tumors is small, so-called exact methods are often applied for computing critical values from a permutational distribution. Two of these exact methods are discussed and shown to be incorrect. The correct permutational distribution is derived and studied with respect to its behavior under unequal censoring in the light of recent results proving that the permutational version and the unconditional version of the log-rank test are asymptotically equivalent even under unequal censoring. The log-rank test is studied by simulations of a realistic scenario from a bioassay with small numbers of tumors.
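
    As a rough illustration of how a permutational distribution of the log-rank statistic can be generated, the sketch below shuffles group labels over a small, made-up bioassay data set. It implements only the naive label-permutation scheme whose behavior under unequal censoring is the subject of the paper, not the corrected permutational distribution derived there.

```python
import numpy as np

rng = np.random.default_rng(4)

# Made-up small bioassay: time on study, tumor indicator (1 = event), group (1 = dose).
time  = np.array([ 4,  6,  8,  9, 10, 12, 13, 15, 16, 18, 20, 22, 24, 24, 25, 26])
event = np.array([ 1,  0,  1,  1,  0,  1,  0,  1,  1,  0,  1,  0,  1,  1,  0,  0])
group = np.array([ 1,  1,  1,  1,  1,  1,  1,  1,  0,  0,  0,  0,  0,  0,  0,  0])

def logrank_z(time, event, group):
    """Standardised log-rank statistic: observed minus expected events in group 1,
    accumulated over distinct event times, divided by the square root of the
    hypergeometric variance."""
    o_minus_e, var = 0.0, 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n, n1 = at_risk.sum(), (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        o_minus_e += d1 - d * n1 / n
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e / np.sqrt(var)

z_obs = logrank_z(time, event, group)
n_perm = 10000
null = np.array([logrank_z(time, event, rng.permutation(group)) for _ in range(n_perm)])
p_perm = (1 + np.sum(np.abs(null) >= abs(z_obs))) / (1 + n_perm)
print(f"log-rank Z = {z_obs:.3f}, two-sided permutation p = {p_perm:.4f}")
```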

  15. Optimal wavelength-space crossbar switches for supercomputer optical interconnects.

    PubMed

    Roudas, Ioannis; Hemenway, B Roe; Grzybowski, Richard R; Karinou, Fotini

    2012-08-27

    We propose a most economical design of the Optical Shared MemOry Supercomputer Interconnect System (OSMOSIS) all-optical, wavelength-space crossbar switch fabric. It is shown, by analysis and simulation, that the total number of on-off gates required for the proposed N × N switch fabric can scale asymptotically as N ln N if the number of input/output ports N can be factored into a product of small primes. This is of the same order of magnitude as Shannon's lower bound for switch complexity, according to which the minimum number of two-state switches required for the construction of an N × N permutation switch is log2(N!).
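
    The scaling claim can be checked numerically: the sketch below (with arbitrary example port counts, not OSMOSIS design values) compares Shannon's lower bound log2(N!) with the N ln N gate count quoted for the proposed fabric.

```python
import math

# Compare Shannon's lower bound log2(N!) for an N x N permutation switch with the
# N*ln(N) asymptotic gate count quoted for the proposed fabric (example port counts only).
for N in (16, 64, 256, 1024):
    shannon = math.lgamma(N + 1) / math.log(2)   # log2(N!) computed via the log-gamma function
    print(f"N = {N:5d}   log2(N!) = {shannon:10.1f}   N*ln(N) = {N * math.log(N):10.1f}")
```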

  16. Permutation on hybrid natural inflation

    NASA Astrophysics Data System (ADS)

    Carone, Christopher D.; Erlich, Joshua; Ramos, Raymundo; Sher, Marc

    2014-09-01

    We analyze a model of hybrid natural inflation based on the smallest non-Abelian discrete group S3. Leading invariant terms in the scalar potential have an accidental global symmetry that is spontaneously broken, providing a pseudo-Goldstone boson that is identified as the inflaton. The S3 symmetry restricts both the form of the inflaton potential and the couplings of the inflaton field to the waterfall fields responsible for the end of inflation. We identify viable points in the model parameter space. Although the power in tensor modes is small in most of the parameter space of the model, we identify parameter choices that yield potentially observable values of r without super-Planckian initial values of the inflaton field.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orlenko, E. V., E-mail: eorlenko@mail.ru; Evstafev, A. V.; Orlenko, F. E.

    A formalism of exchange perturbation theory (EPT) is developed for the case of interactions that explicitly depend on time. Corrections to the wave function obtained in any order of perturbation theory and represented in an invariant form include exchange contributions due to intercenter electron permutations in complex multicenter systems. For collisions of atomic systems with an arbitrary type of interaction, general expressions are obtained for the transfer (T) and scattering (S) matrices in which intercenter electron permutations between overlapping nonorthogonal states belonging to different centers (atoms) are consistently taken into account. The problem of collision of alpha particles with lithium atoms accompanied by the redistribution of electrons between centers is considered. The differential and total charge-exchange cross sections of lithium are calculated.

  18. Optimal recombination in genetic algorithms for flowshop scheduling problems

    NASA Astrophysics Data System (ADS)

    Kovalenko, Julia

    2016-10-01

    The optimal recombination problem consists in finding the best possible offspring as a result of a recombination operator in a genetic algorithm, given two parent solutions. We prove NP-hardness of the optimal recombination for various variants of the flowshop scheduling problem with makespan criterion and criterion of maximum lateness. An algorithm for solving the optimal recombination problem for permutation flowshop problems is built, using enumeration of perfect matchings in a special bipartite graph. The algorithm is adapted for the classical flowshop scheduling problem and for the no-wait flowshop problem. It is shown that the optimal recombination problem for the permutation flowshop scheduling problem is solvable in polynomial time for almost all pairs of parent solutions as the number of jobs tends to infinity.

  19. Estrogen pathway polymorphisms in relation to primary open angle glaucoma: An analysis accounting for gender from the United States

    PubMed Central

    Loomis, Stephanie J.; Weinreb, Robert N.; Kang, Jae H.; Yaspan, Brian L.; Bailey, Jessica Cooke; Gaasterland, Douglas; Gaasterland, Terry; Lee, Richard K.; Scott, William K.; Lichter, Paul R.; Budenz, Donald L.; Liu, Yutao; Realini, Tony; Friedman, David S.; McCarty, Catherine A.; Moroi, Sayoko E.; Olson, Lana; Schuman, Joel S.; Singh, Kuldev; Vollrath, Douglas; Wollstein, Gadi; Zack, Donald J.; Brilliant, Murray; Sit, Arthur J.; Christen, William G.; Fingert, John; Kraft, Peter; Zhang, Kang; Allingham, R. Rand; Pericak-Vance, Margaret A.; Richards, Julia E.; Hauser, Michael A.; Haines, Jonathan L.; Wiggs, Janey L.

    2013-01-01

    Purpose: Circulating estrogen levels are relevant in glaucoma phenotypic traits. We assessed the association between an estrogen metabolism single nucleotide polymorphism (SNP) panel and primary open angle glaucoma (POAG), accounting for gender. Methods: We included 3,108 POAG cases and 3,430 controls of both genders from the Glaucoma Genes and Environment (GLAUGEN) study and the National Eye Institute Glaucoma Human Genetics Collaboration (NEIGHBOR) consortium genotyped on the Illumina 660W-Quad platform. We assessed the relation between the SNP panels representative of estrogen metabolism and POAG using pathway- and gene-based approaches with the Pathway Analysis by Randomization Incorporating Structure (PARIS) software. PARIS executes a permutation algorithm to assess statistical significance relative to the pathways and genes of comparable genetic architecture. These analyses were performed using the meta-analyzed results from the GLAUGEN and NEIGHBOR data sets. We evaluated POAG overall as well as two subtypes of POAG defined as intraocular pressure (IOP) ≥22 mmHg (high-pressure glaucoma [HPG]) or IOP <22 mmHg (normal pressure glaucoma [NPG]) at diagnosis. We conducted these analyses for each gender separately and then jointly in men and women. Results: Among women, the estrogen SNP pathway was associated with POAG overall (permuted p=0.006) and HPG (permuted p<0.001) but not NPG (permuted p=0.09). Interestingly, there was no relation between the estrogen SNP pathway and POAG when men were considered alone (permuted p>0.99). Among women, gene-based analyses revealed that the catechol-O-methyltransferase gene showed strong associations with HPG (permuted gene p≤0.001) and NPG (permuted gene p=0.01). Conclusions: The estrogen SNP pathway was associated with POAG among women. PMID:23869166

  20. Error-free holographic frames encryption with CA pixel-permutation encoding algorithm

    NASA Astrophysics Data System (ADS)

    Li, Xiaowei; Xiao, Dan; Wang, Qiong-Hua

    2018-01-01

    The security of video data is essential in network transmission; hence, cryptography is a technique for making video data secure and unreadable to unauthorized users. In this paper, we propose a holographic frames encryption technique based on the cellular automata (CA) pixel-permutation encoding algorithm. The concise pixel-permutation algorithm is used to address the drawbacks of the traditional CA encoding methods. The effectiveness of the proposed video encoding method is demonstrated by simulation examples.

  1. Learning to Predict Combinatorial Structures

    NASA Astrophysics Data System (ADS)

    Vembu, Shankar

    2009-12-01

    The major challenge in designing a discriminative learning algorithm for predicting structured data is to address the computational issues arising from the exponential size of the output space. Existing algorithms make different assumptions to ensure efficient, polynomial time estimation of model parameters. For several combinatorial structures, including cycles, partially ordered sets, permutations and other graph classes, these assumptions do not hold. In this thesis, we address the problem of designing learning algorithms for predicting combinatorial structures by introducing two new assumptions: (i) The first assumption is that a particular counting problem can be solved efficiently. The consequence is a generalisation of the classical ridge regression for structured prediction. (ii) The second assumption is that a particular sampling problem can be solved efficiently. The consequence is a new technique for designing and analysing probabilistic structured prediction models. These results can be applied to solve several complex learning problems including but not limited to multi-label classification, multi-category hierarchical classification, and label ranking.

  2. Exploiting Identical Generators in Unit Commitment

    DOE PAGES

    Knueven, Ben; Ostrowski, Jim; Watson, Jean -Paul

    2017-12-14

    Here, we present sufficient conditions under which thermal generators can be aggregated in mixed-integer linear programming (MILP) formulations of the unit commitment (UC) problem, while maintaining feasibility and optimality for the original disaggregated problem. Aggregating thermal generators with identical characteristics (e.g., minimum/maximum power output, minimum up/down-time, and cost curves) into a single unit reduces redundancy in the search space induced by both exact symmetry (permutations of generator schedules) and certain classes of mutually non-dominated solutions. We study the impact of aggregation on two large-scale UC instances, one from the academic literature and another based on real-world operator data. Our computational tests demonstrate that when present, identical generators can negatively affect the performance of modern MILP solvers on UC formulations. Further, we show that our reformulation of the UC MILP through aggregation is an effective method for mitigating this source of computational difficulty.

  4. Photographs and Committees: Activities That Help Students Discover Permutations and Combinations.

    ERIC Educational Resources Information Center

    Szydlik, Jennifer Earles

    2000-01-01

    Presents problem situations that support students when discovering the multiplication principle, permutations, combinations, Pascal's triangle, and relationships among those objects in a concrete context. (ASK)

  5. Cone structure imaged with adaptive optics scanning laser ophthalmoscopy in eyes with nonneovascular age-related macular degeneration.

    PubMed

    Zayit-Soudry, Shiri; Duncan, Jacque L; Syed, Reema; Menghini, Moreno; Roorda, Austin J

    2013-11-15

    To evaluate cone spacing using adaptive optics scanning laser ophthalmoscopy (AOSLO) in eyes with nonneovascular AMD, and to correlate progression of AOSLO-derived cone measures with standard measures of macular structure. Adaptive optics scanning laser ophthalmoscopy images were obtained over 12 to 21 months from seven patients with AMD including four eyes with geographic atrophy (GA) and four eyes with drusen. Adaptive optics scanning laser ophthalmoscopy images were overlaid with color, infrared, and autofluorescence fundus photographs and spectral domain optical coherence tomography (SD-OCT) images to allow direct correlation of cone parameters with macular structure. Cone spacing was measured for each visit in selected regions including areas over drusen (n = 29), at GA margins (n = 14), and regions without drusen or GA (n = 13) and compared with normal, age-similar values. Adaptive optics scanning laser ophthalmoscopy imaging revealed continuous cone mosaics up to the GA edge and overlying drusen, although reduced cone reflectivity often resulted in hyporeflective AOSLO signals at these locations. Baseline cone spacing measures were normal in 13/13 unaffected regions, 26/28 drusen regions, and 12/14 GA margin regions. Although standard clinical measures showed progression of GA in all study eyes, cone spacing remained within normal ranges in most drusen regions and all GA margin regions. Adaptive optics scanning laser ophthalmoscopy provides adequate resolution for quantitative measurement of cone spacing at the margin of GA and over drusen in eyes with AMD. Although cone spacing was often normal at baseline and remained normal over time, these regions showed focal areas of decreased cone reflectivity. These findings may provide insight into the pathophysiology of AMD progression. (ClinicalTrials.gov number, NCT00254605).

  6. A permutation characterization of Sturm global attractors of Hamiltonian type

    NASA Astrophysics Data System (ADS)

    Fiedler, Bernold; Rocha, Carlos; Wolfrum, Matthias

    We consider Neumann boundary value problems of the form u_t = u_xx + f on the interval 0 ⩽ x ⩽ π for dissipative nonlinearities f = f(u). A permutation characterization for the global attractors of the semiflows generated by these equations is well known, even in the much more general case f = f(x, u, u_x). We present a permutation characterization for the global attractors in the restrictive class of nonlinearities f = f(u). In this class the stationary solutions of the parabolic equation satisfy the second order ODE v_xx + f(v) = 0 and we obtain the permutation characterization from a characterization of the set of 2π-periodic orbits of this planar Hamiltonian system. Our results are based on a diligent discussion of this mere pendulum equation.

  7. An advanced scanning method for space-borne hyper-spectral imaging system

    NASA Astrophysics Data System (ADS)

    Wang, Yue-ming; Lang, Jun-Wei; Wang, Jian-Yu; Jiang, Zi-Qing

    2011-08-01

    Space-borne hyper-spectral imagery is an important means for the studies and applications of earth science. High cost efficiency can be achieved through optimized system design. In this paper, an advanced scanning method is proposed, which helps implement an imaging system with both high temporal and high spatial resolution. Revisit frequency and effective working time of space-borne hyper-spectral imagers could be greatly improved by adopting a two-axis scanning system if spatial resolution and radiometric accuracy are not harshly demanded. In order to avoid the quality degradation caused by image rotation, an idea of two-axis rotation has been presented based on the analysis and simulation of two-dimensional scanning motion path and features. Further improvement of the imagers' detection ability under the conditions of small solar altitude angle and low surface reflectance can be realized by Ground Motion Compensation on the pitch axis. The structure and control performance are also described. An intelligent integration technology of two-dimensional scanning and image motion compensation is elaborated in this paper. With this technology, sun-synchronous hyper-spectral imagers are able to pay quick visits to hot spots, acquiring both high spatial and temporal resolution hyper-spectral images, which enables rapid response to emergencies. The result has reference value for developing operational space-borne hyper-spectral imagers.

  8. PBOOST: a GPU-based tool for parallel permutation tests in genome-wide association studies.

    PubMed

    Yang, Guangyuan; Jiang, Wei; Yang, Qiang; Yu, Weichuan

    2015-05-01

    The importance of testing associations allowing for interactions has been demonstrated by Marchini et al. (2005). A fast method detecting associations allowing for interactions has been proposed by Wan et al. (2010a). The method is based on a likelihood ratio test with the assumption that the statistic follows the χ² distribution. Many single nucleotide polymorphism (SNP) pairs with significant associations allowing for interactions have been detected using their method. However, the assumption of the χ² test requires the expected values in each cell of the contingency table to be at least five. This assumption is violated in some identified SNP pairs. In this case, the likelihood ratio test may not be applicable any more. A permutation test is an ideal approach to checking the P-values calculated in the likelihood ratio test because of its non-parametric nature. The P-values of SNP pairs having significant associations with disease are always extremely small. Thus, we need a huge number of permutations to achieve correspondingly high resolution for the P-values. In order to investigate whether the P-values from likelihood ratio tests are reliable, a fast permutation tool to accomplish a large number of permutations is desirable. We developed a permutation tool named PBOOST. It is based on GPU with highly reliable P-value estimation. By using simulation data, we found that the P-values from likelihood ratio tests will have relative error of >100% when 50% of cells in the contingency table have expected count less than five or when there is zero expected count in any of the contingency table cells. In terms of speed, PBOOST completed 10⁷ permutations for a single SNP pair from the Wellcome Trust Case Control Consortium (WTCCC) genome data (Wellcome Trust Case Control Consortium, 2007) within 1 min on a single Nvidia Tesla M2090 device, while it took 60 min on a single Intel Xeon E5-2650 CPU to finish the same task. More importantly, when simultaneously testing 256 SNP pairs for 10⁷ permutations, our tool took only 5 min, while the CPU program took 10 h. By permuting on a GPU cluster consisting of 40 nodes, we completed 10¹² permutations for all 280 SNP pairs reported with P-values smaller than 1.6 × 10⁻¹² in the WTCCC datasets in 1 week. The source code and sample data are available at http://bioinformatics.ust.hk/PBOOST.zip. gyang@ust.hk; eeyu@ust.hk Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
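
    For intuition about what such a permutation check involves (on a CPU and at toy scale, nothing like PBOOST's GPU implementation), the sketch below permutes phenotype labels and recomputes a generic likelihood-ratio (G) statistic on the 9×2 genotype-pair by phenotype contingency table. The data and the statistic are illustrative, not Wan et al.'s exact interaction model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: genotypes (0/1/2) at two SNPs and a binary phenotype.
n = 2000
snp1 = rng.integers(0, 3, n)
snp2 = rng.integers(0, 3, n)
pheno = rng.integers(0, 2, n)

def g_statistic(snp1, snp2, pheno):
    """Likelihood-ratio (G) statistic for association between the 9 genotype
    pairs and the binary phenotype (a generic test of the 9x2 table)."""
    cell = snp1 * 3 + snp2                      # 9 genotype-pair categories
    table = np.zeros((9, 2))
    np.add.at(table, (cell, pheno), 1)
    expected = table.sum(1, keepdims=True) * table.sum(0, keepdims=True) / table.sum()
    mask = table > 0
    return 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))

observed = g_statistic(snp1, snp2, pheno)
n_perm = 10_000                                 # PBOOST-scale runs use vastly more
null = np.array([g_statistic(snp1, snp2, rng.permutation(pheno))
                 for _ in range(n_perm)])
p_value = (1 + np.sum(null >= observed)) / (1 + n_perm)
print(f"G = {observed:.2f}, permutation p = {p_value:.4f}")
```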

  9. The complexity of gene expression dynamics revealed by permutation entropy

    PubMed Central

    2010-01-01

    Background High complexity is considered a hallmark of living systems. Here we investigate the complexity of temporal gene expression patterns using the concept of Permutation Entropy (PE) first introduced in dynamical systems theory. The analysis of gene expression data has so far focused primarily on the identification of differentially expressed genes, or on the elucidation of pathway and regulatory relationships. We aim to study gene expression time series data from the viewpoint of complexity. Results Applying the PE complexity metric to abiotic stress response time series data in Arabidopsis thaliana, genes involved in stress response and signaling were found to be associated with the highest complexity not only under stress, but surprisingly, also under reference, non-stress conditions. Genes with house-keeping functions exhibited lower PE complexity. Compared to reference conditions, the PE of temporal gene expression patterns generally increased upon stress exposure. High-complexity genes were found to have longer upstream intergenic regions and more cis-regulatory motifs in their promoter regions indicative of a more complex regulatory apparatus needed to orchestrate their expression, and to be associated with higher correlation network connectivity degree. Arabidopsis genes also present in other plant species were observed to exhibit decreased PE complexity compared to Arabidopsis specific genes. Conclusions We show that Permutation Entropy is a simple yet robust and powerful approach to identify temporal gene expression profiles of varying complexity that is equally applicable to other types of molecular profile data. PMID:21176199
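
    A minimal sketch of the permutation entropy measure itself (ordinal patterns of short embedded windows, Bandt-Pompe style) is given below with synthetic signals; it illustrates the metric, not the gene-expression pipeline of the paper.

```python
import numpy as np
from math import factorial
from itertools import permutations

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D series: Shannon entropy of the distribution of
    ordinal patterns of length `order`, sampled every `delay` steps."""
    x = np.asarray(x, dtype=float)
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - (order - 1) * delay):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1
    probs = np.array([c for c in counts.values() if c > 0], dtype=float)
    probs /= probs.sum()
    h = -np.sum(probs * np.log2(probs))
    return h / np.log2(factorial(order)) if normalize else h

rng = np.random.default_rng(1)
print(permutation_entropy(np.sin(np.linspace(0, 20 * np.pi, 2000))))  # regular signal: low PE
print(permutation_entropy(rng.standard_normal(2000)))                 # white noise: close to 1
```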

  10. Engineering calculations for solving the orbital allotment problem

    NASA Technical Reports Server (NTRS)

    Reilly, C.; Walton, E. K.; Mount-Campbell, C.; Caldecott, R.; Aebker, E.; Mata, F.

    1988-01-01

    Four approaches for calculating downlink interferences for shaped-beam antennas are described. An investigation of alternative mixed-integer programming models for satellite synthesis is summarized. Plans for coordinating the various programs developed under this grant are outlined. Two procedures for ordering satellites to initialize the k-permutation algorithm are proposed. Results are presented for the k-permutation algorithms. Feasible solutions are found for 5 of the 6 problems considered. Finally, it is demonstrated that the k-permutation algorithm can be used to solve arc allotment problems.

  11. A practical comparison of algorithms for the measurement of multiscale entropy in neural time series data.

    PubMed

    Kuntzelman, Karl; Jack Rhodes, L; Harrington, Lillian N; Miskovic, Vladimir

    2018-06-01

    There is a broad family of statistical methods for capturing time series regularity, with increasingly widespread adoption by the neuroscientific community. A common feature of these methods is that they permit investigators to quantify the entropy of brain signals - an index of unpredictability/complexity. Despite the proliferation of algorithms for computing entropy from neural time series data there is scant evidence concerning their relative stability and efficiency. Here we evaluated several different algorithmic implementations (sample, fuzzy, dispersion and permutation) of multiscale entropy in terms of their stability across sessions, internal consistency and computational speed, accuracy and precision using a combination of electroencephalogram (EEG) and synthetic 1/ƒ noise signals. Overall, we report fair to excellent internal consistency and longitudinal stability over a one-week period for the majority of entropy estimates, with several caveats. Computational timing estimates suggest distinct advantages for dispersion and permutation entropy over other entropy estimates. Considered alongside the psychometric evidence, we suggest several ways in which researchers can maximize computational resources (without sacrificing reliability), especially when working with high-density M/EEG data or multivoxel BOLD time series signals. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Permutation invariant polynomial neural network approach to fitting potential energy surfaces. II. Four-atom systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jun; Jiang, Bin; Guo, Hua, E-mail: hguo@unm.edu

    2013-11-28

    A rigorous, general, and simple method to fit global and permutation invariant potential energy surfaces (PESs) using neural networks (NNs) is discussed. This so-called permutation invariant polynomial neural network (PIP-NN) method imposes permutation symmetry by using in its input a set of symmetry functions based on PIPs. For systems with more than three atoms, it is shown that the number of symmetry functions in the input vector needs to be larger than the number of internal coordinates in order to include both the primary and secondary invariant polynomials. This PIP-NN method is successfully demonstrated in three atom-triatomic reactive systems, resulting in full-dimensional global PESs with average errors on the order of meV. These PESs are used in full-dimensional quantum dynamical calculations.
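
    The key idea, permutation-invariant inputs for identical atoms, can be illustrated with a toy A2B system: the sketch below builds symmetrized functions of interatomic distances and verifies that they are unchanged when the two identical atoms are swapped. The feature choices are illustrative, not the paper's actual PIP basis.

```python
import numpy as np

def invariant_features(coords):
    """Permutation-invariant input vector for a toy A2B system (atoms 0 and 1 identical):
    Morse-like variables of the three distances, symmetrised over exchange of atoms 0 and 1."""
    r01 = np.linalg.norm(coords[0] - coords[1])   # A-A distance
    r02 = np.linalg.norm(coords[0] - coords[2])   # A1-B distance
    r12 = np.linalg.norm(coords[1] - coords[2])   # A2-B distance
    y01, y02, y12 = np.exp(-r01), np.exp(-r02), np.exp(-r12)
    return np.array([y01, y02 + y12, y02 * y12])  # each entry is symmetric in (y02, y12)

geom = np.array([[0.0, 0.0, 0.0],
                 [1.1, 0.0, 0.0],
                 [0.3, 0.9, 0.0]])
swapped = geom[[1, 0, 2]]                         # exchange the two identical atoms
print(np.allclose(invariant_features(geom), invariant_features(swapped)))  # True
```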

  13. Hemodynamic Response to Interictal Epileptiform Discharges Addressed by Personalized EEG-fNIRS Recordings

    PubMed Central

    Pellegrino, Giovanni; Machado, Alexis; von Ellenrieder, Nicolas; Watanabe, Satsuki; Hall, Jeffery A.; Lina, Jean-Marc; Kobayashi, Eliane; Grova, Christophe

    2016-01-01

    Objective: We aimed at studying the hemodynamic response (HR) to Interictal Epileptic Discharges (IEDs) using patient-specific and prolonged simultaneous ElectroEncephaloGraphy (EEG) and functional Near InfraRed Spectroscopy (fNIRS) recordings. Methods: The epileptic generator was localized using Magnetoencephalography source imaging. fNIRS montage was tailored for each patient, using an algorithm to optimize the sensitivity to the epileptic generator. Optodes were glued using collodion to achieve prolonged acquisition with high quality signal. fNIRS data analysis was handled with no a priori constraint on HR time course, averaging fNIRS signals to similar IEDs. Cluster-permutation analysis was performed on 3D reconstructed fNIRS data to identify significant spatio-temporal HR clusters. Standard (GLM with fixed HRF) and cluster-permutation EEG-fMRI analyses were performed for comparison purposes. Results: fNIRS detected HR to IEDs for 8/9 patients. It mainly consisted of oxy-hemoglobin increases (seven patients), followed by oxy-hemoglobin decreases (six patients). HR was lateralized in six patients and lasted from 8.5 to 30 s. Standard EEG-fMRI analysis detected an HR in 4/9 patients (4/9 without enough IEDs, 1/9 unreliable result). The cluster-permutation EEG-fMRI analysis restricted to the region investigated by fNIRS showed additional strong and non-canonical BOLD responses starting earlier than the IEDs and lasting up to 30 s. Conclusions: (i) EEG-fNIRS is suitable to detect the HR to IEDs and can outperform EEG-fMRI because of prolonged recordings and greater chance to detect IEDs; (ii) cluster-permutation analysis unveils additional HR features underestimated when imposing a canonical HR function; (iii) the HR is often bilateral and lasts up to 30 s. PMID:27047325

  14. Permutation entropy analysis of financial time series based on Hill's diversity number

    NASA Astrophysics Data System (ADS)

    Zhang, Yali; Shang, Pengjian

    2017-12-01

    In this paper the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as the stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and describes the trends of complex systems clearly. In addition, we study stock closing price series from six indices (three US stock indices and three Chinese stock indices) over different periods; Nn,r can quantify the changes in complexity of stock market data. Moreover, we obtain richer information from Nn,r, including some properties of the differences between the US and Chinese stock indices.

  15. A chaotic cryptosystem for images based on Henon and Arnold cat map.

    PubMed

    Soleymani, Ali; Nordin, Md Jan; Sundararajan, Elankovan

    2014-01-01

    The rapid evolution of imaging and communication technologies has transformed images into a widespread data type. Different types of data, such as personal medical information, official correspondence, or governmental and military documents, are saved and transmitted in the form of images over public networks. Hence, a fast and secure cryptosystem is needed for high-resolution images. In this paper, a novel encryption scheme is presented for securing images based on Arnold cat and Henon chaotic maps. The scheme uses Arnold cat map for bit- and pixel-level permutations on plain and secret images, while Henon map creates secret images and specific parameters for the permutations. Both the encryption and decryption processes are explained, formulated, and graphically presented. The results of security analysis of five different images demonstrate the strength of the proposed cryptosystem against statistical, brute force and differential attacks. The evaluated running time for both encryption and decryption processes guarantee that the cryptosystem can work effectively in real-time applications.
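
    The pixel-permutation ingredient can be sketched on its own: the code below applies the Arnold cat map to a small square array and checks that it only rearranges pixel values (the full cryptosystem additionally uses Henon-map key material, which is not reproduced here).

```python
import numpy as np

def arnold_cat(img, iterations=1):
    """Scramble a square N x N image with the Arnold cat map, the invertible
    pixel permutation induced by (x, y) -> (x + y, x + 2y) mod N."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "the cat map needs a square image"
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img.copy()
    for _ in range(iterations):
        out = out[(x + y) % n, (x + 2 * y) % n]
    return out

# Toy 4x4 "image": positions are scrambled, but the multiset of values is kept.
img = np.arange(16).reshape(4, 4)
scrambled = arnold_cat(img, iterations=3)
print(scrambled)
print(np.array_equal(np.sort(img.ravel()), np.sort(scrambled.ravel())))  # True
```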

  17. [Optimal scan parameters for a method of k-space trajectory (radial scan method) in evaluation of carotid plaque characteristics].

    PubMed

    Nakamura, Manami; Makabe, Takeshi; Tezuka, Hideomi; Miura, Takahiro; Umemura, Takuma; Sugimori, Hiroyuki; Sakata, Motomichi

    2013-04-01

    The purpose of this study was to optimize scan parameters for evaluation of carotid plaque characteristics by k-space trajectory (radial scan method), using a custom-made carotid plaque phantom. The phantom was composed of simulated sternocleidomastoid muscle and four types of carotid plaque. The effect of chemical shift artifact was compared using T1 weighted images (T1WI) of the phantom obtained with and without fat suppression, and using two types of k-space trajectory (the radial scan method and the Cartesian method). The ratio of signal intensity of simulated sternocleidomastoid muscle to the signal intensity of hematoma, blood (including heparin), lard, and mayonnaise was compared among various repetition times (TR) using T1WI and T2 weighted imaging (T2WI). In terms of chemical shift artifacts, image quality was improved using fat suppression for both the radial scan and Cartesian methods. In terms of signal ratio, the highest values were obtained for the radial scan method with TR of 500 ms for T1WI, and TR of 3000 ms for T2WI. For evaluation of carotid plaque characteristics using the radial scan method, chemical shift artifacts were reduced with fat suppression. Signal ratio was improved by optimizing the TR settings for T1WI and T2WI. These results suggest the potential for using magnetic resonance imaging for detailed evaluation of carotid plaque.

  18. Multiple comparisons permutation test for image based data mining in radiotherapy.

    PubMed

    Chen, Chun; Witte, Marnix; Heemsbergen, Wilma; van Herk, Marcel

    2013-12-23

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy.
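
    A minimal sketch of the Tmax idea, on synthetic data with hypothetical dimensions, is shown below: voxel-wise t statistics are summarized by their maximum, outcome labels are permuted to build the null distribution of that maximum, and each voxel receives a familywise-corrected p-value.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative data: dose "images" of 40 patients x 500 voxels, binary outcome.
doses = rng.normal(50, 5, size=(40, 500))
outcome = np.array([1] * 15 + [0] * 25)
doses[outcome == 1, :50] += 3.0          # planted effect in the first 50 voxels

def voxelwise_t(doses, labels):
    """Two-sample pooled-variance t statistic computed independently per voxel."""
    a, b = doses[labels == 1], doses[labels == 0]
    na, nb = len(a), len(b)
    pooled = ((na - 1) * a.var(0, ddof=1) + (nb - 1) * b.var(0, ddof=1)) / (na + nb - 2)
    return (a.mean(0) - b.mean(0)) / np.sqrt(pooled * (1 / na + 1 / nb))

t_obs = voxelwise_t(doses, outcome)

n_perm = 2000
null_max = np.array([voxelwise_t(doses, rng.permutation(outcome)).max()
                     for _ in range(n_perm)])

# Familywise-corrected p-value per voxel: fraction of permutation maxima >= its t value.
p_corrected = (1 + (null_max[None, :] >= t_obs[:, None]).sum(1)) / (1 + n_perm)
print("voxels significant at 0.05 (corrected):", int((p_corrected < 0.05).sum()))
```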

  19. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in a neuron that has presented in another one. The performance of this method is evaluated with the spike trains generated by an Izhikevich’s neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons without influence of data length. Considering both the precision of time delay estimated and the robustness of information flow estimated against neuronal firing rate, the NPTE method is superior to other information theoretic method including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on analyzing simulated biophysically realistic synapses, an Izhikevich’s cortical network that based on the neuronal model is employed. It is found that the NPTE method is able to characterize mutual interactions and identify spurious causality in a network of three neurons exactly. We conclude that the proposed method can obtain more reliable comparison of interactions between different pairs of neurons and is a promising tool to uncover more details on the neural coding. PMID:23940662

  20. Permutation auto-mutual information of electroencephalogram in anesthesia

    NASA Astrophysics Data System (ADS)

    Liang, Zhenhu; Wang, Yinghua; Ouyang, Gaoxiang; Voss, Logan J.; Sleigh, Jamie W.; Li, Xiaoli

    2013-04-01

    Objective. The dynamic change of brain activity in anesthesia is an interesting topic for clinical doctors and drug designers. To explore the dynamical features of brain activity in anesthesia, a permutation auto-mutual information (PAMI) method is proposed to measure the information coupling of electroencephalogram (EEG) time series obtained in anesthesia. Approach. The PAMI is developed and applied on EEG data collected from 19 patients under sevoflurane anesthesia. The results are compared with the traditional auto-mutual information (AMI), SynchFastSlow (SFS, derived from the BIS index), permutation entropy (PE), composite PE (CPE), response entropy (RE) and state entropy (SE). Performance of all indices is assessed by pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability. Main results. The PK/PD modeling and prediction probability analysis show that the PAMI index correlates closely with the anesthetic effect. The coefficient of determination R2 between PAMI values and the sevoflurane effect site concentrations, and the prediction probability Pk are higher in comparison with other indices. The information coupling in EEG series can be applied to indicate the effect of the anesthetic drug sevoflurane on the brain activity as well as other indices. The PAMI of the EEG signals is suggested as a new index to track drug concentration change. Significance. The PAMI is a useful index for analyzing the EEG dynamics during general anesthesia.

  1. Concept for facilitating analyst-mediated interpretation of qualitative chromatographic-mass spectral data: an alternative to manual examination of extracted ion chromatograms.

    PubMed

    Borges, Chad R

    2007-07-01

    A chemometrics-based data analysis concept has been developed as a substitute for manual inspection of extracted ion chromatograms (XICs), which facilitates rapid, analyst-mediated interpretation of GC- and LC/MS(n) data sets from samples undergoing qualitative batchwise screening for prespecified sets of analytes. Automatic preparation of data into two-dimensional row space-derived scatter plots (row space plots) eliminates the need to manually interpret hundreds to thousands of XICs per batch of samples while keeping all interpretation of raw data directly in the hands of the analyst-saving great quantities of human time without loss of integrity in the data analysis process. For a given analyte, two analyte-specific variables are automatically collected by a computer algorithm and placed into a data matrix (i.e., placed into row space): the first variable is the ion abundance corresponding to scan number x and analyte-specific m/z value y, and the second variable is the ion abundance corresponding to scan number x and analyte-specific m/z value z (a second ion). These two variables serve as the two axes of the aforementioned row space plots. In order to collect appropriate scan number (retention time) information, it is necessary to analyze, as part of every batch, a sample containing a mixture of all analytes to be tested. When pure standard materials of tested analytes are unavailable, but representative ion m/z values are known and retention time can be approximated, data are evaluated based on two-dimensional scores plots from principal component analysis of small time range(s) of mass spectral data. The time-saving efficiency of this concept is directly proportional to the percentage of negative samples and to the total number of samples processed simultaneously.

  2. Molecular quantum control landscapes in von Neumann time-frequency phase space

    NASA Astrophysics Data System (ADS)

    Ruetzel, Stefan; Stolzenberger, Christoph; Fechner, Susanne; Dimler, Frank; Brixner, Tobias; Tannor, David J.

    2010-10-01

    Recently we introduced the von Neumann representation as a joint time-frequency description for femtosecond laser pulses and suggested its use as a basis for pulse shaping experiments. Here we use the von Neumann basis to represent multidimensional molecular control landscapes, providing insight into the molecular dynamics. We present three kinds of time-frequency phase space scanning procedures based on the von Neumann formalism: variation of intensity, time-frequency phase space position, and/or the relative phase of single subpulses. The shaped pulses produced are characterized via Fourier-transform spectral interferometry. Quantum control is demonstrated on the laser dye IR140 elucidating a time-frequency pump-dump mechanism.

  4. Lorenz, Gödel and Penrose: new perspectives on determinism and causality in fundamental physics

    NASA Astrophysics Data System (ADS)

    Palmer, T. N.

    2014-07-01

    Despite being known for his pioneering work on chaotic unpredictability, the key discovery at the core of meteorologist Ed Lorenz's work is the link between space-time calculus and state-space fractal geometry. Indeed, properties of Lorenz's fractal invariant set relate space-time calculus to deep areas of mathematics such as Gödel's Incompleteness Theorem. Could such properties also provide new perspectives on deep unsolved issues in fundamental physics? Recent developments in cosmology motivate what is referred to as the 'cosmological invariant set postulate': that the universe U can be considered a deterministic dynamical system evolving on a causal measure-zero fractal invariant set I_U in its state space. Symbolic representations of I_U are constructed explicitly based on permutation representations of quaternions. The resulting 'invariant set theory' provides some new perspectives on determinism and causality in fundamental physics. For example, while the cosmological invariant set appears to have a rich enough structure to allow a description of (quantum) probability, its measure-zero character ensures it is sparse enough to prevent invariant set theory being constrained by the Bell inequality (consistent with a partial violation of the so-called measurement independence postulate). The primacy of geometry as embodied in the proposed theory extends the principles underpinning general relativity. As a result, the physical basis for contemporary programmes which apply standard field quantisation to some putative gravitational lagrangian is questioned. Consistent with Penrose's suggestion of a deterministic but non-computable theory of fundamental physics, an alternative 'gravitational theory of the quantum' is proposed based on the geometry of I_U, with new perspectives on the problem of black-hole information loss and potential observational consequences for the dark universe.

  5. Non-parametric combination and related permutation tests for neuroimaging.

    PubMed

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
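
    A stripped-down sketch of the synchronized-permutation idea with Tippett's (min-p) combining function is given below on made-up data for two "modalities"; it illustrates the single-phase flavor of the approach rather than reproducing the full NPC methodology.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two "modalities" on the same 30 subjects; binary group; one planted effect.
group = np.array([1] * 15 + [0] * 15)
mod_a = rng.normal(size=30) + 0.8 * group
mod_b = rng.normal(size=30)

def mean_diff(x, g):
    return x[g == 1].mean() - x[g == 0].mean()

n_perm = 2000
perms = [rng.permutation(group) for _ in range(n_perm)]   # shared (synchronized) shuffles

def stats(x):
    obs = mean_diff(x, group)
    null = np.array([mean_diff(x, p) for p in perms])
    return obs, null

obs_a, null_a = stats(mod_a)
obs_b, null_b = stats(mod_b)

def pvals(obs, null):
    """Permutation p-value of the observed statistic and of every permuted statistic."""
    p_obs = (1 + np.sum(null >= obs)) / (1 + n_perm)
    p_perm = (1 + (null[None, :] >= null[:, None]).sum(axis=1)) / (1 + n_perm)
    return p_obs, p_perm

p_a, p_a_perm = pvals(obs_a, null_a)
p_b, p_b_perm = pvals(obs_b, null_b)

# Tippett (min-p) combining function, referred to its own synchronized permutation null.
t_obs = min(p_a, p_b)
t_null = np.minimum(p_a_perm, p_b_perm)
p_joint = (1 + np.sum(t_null <= t_obs)) / (1 + n_perm)
print(f"p_a={p_a:.4f}  p_b={p_b:.4f}  joint (Tippett-combined) p={p_joint:.4f}")
```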

  6. Constrained Metric Learning by Permutation Inducing Isometries.

    PubMed

    Bosveld, Joel; Mahmood, Arif; Huynh, Du Q; Noakes, Lyle

    2016-01-01

    The choice of metric critically affects the performance of classification and clustering algorithms. Metric learning algorithms attempt to improve performance, by learning a more appropriate metric. Unfortunately, most of the current algorithms learn a distance function which is not invariant to rigid transformations of images. Therefore, the distances between two images and their rigidly transformed pair may differ, leading to inconsistent classification or clustering results. We propose to constrain the learned metric to be invariant to the geometry preserving transformations of images that induce permutations in the feature space. The constraint that these transformations are isometries of the metric ensures consistent results and improves accuracy. Our second contribution is a dimension reduction technique that is consistent with the isometry constraints. Our third contribution is the formulation of the isometry constrained logistic discriminant metric learning (IC-LDML) algorithm, by incorporating the isometry constraints within the objective function of the LDML algorithm. The proposed algorithm is compared with the existing techniques on the publicly available labeled faces in the wild, viewpoint-invariant pedestrian recognition, and Toy Cars data sets. The IC-LDML algorithm has outperformed existing techniques for the tasks of face recognition, person identification, and object classification by a significant margin.

  7. Scanning ultrafast electron microscopy.

    PubMed

    Yang, Ding-Shyue; Mohammed, Omar F; Zewail, Ahmed H

    2010-08-24

    Progress has been made in the development of four-dimensional ultrafast electron microscopy, which enables space-time imaging of structural dynamics in the condensed phase. In ultrafast electron microscopy, the electrons are accelerated, typically to 200 keV, and the microscope operates in the transmission mode. Here, we report the development of scanning ultrafast electron microscopy using a field-emission-source configuration. Scanning of pulses is made in the single-electron mode, for which the pulse contains at most one or a few electrons, thus achieving imaging without the space-charge effect between electrons, and still in ten(s) of seconds. For imaging, the secondary electrons from surface structures are detected, as demonstrated here for material surfaces and biological specimens. By recording backscattered electrons, diffraction patterns from single crystals were also obtained. Scanning pulsed-electron microscopy with the acquired spatiotemporal resolutions, and its efficient heat-dissipation feature, is now poised to provide in situ 4D imaging and with environmental capability.

  8. Recent Successes and Future Plans for NASA's Space Communications and Navigation Testbed on the International Space Station

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Sankovic, John M.; Johnson, Sandra K.; Lux, James P.; Chelmins, David T.

    2014-01-01

    Flexible and extensible space communications architectures and technology are essential to enable future space exploration and science activities. NASA has championed the development of the Space Telecommunications Radio System (STRS) software defined radio (SDR) standard and the application of SDR technology to reduce the costs and risks of using SDRs for space missions, and has developed an on-orbit testbed to validate these capabilities. The Space Communications and Navigation (SCaN) Testbed (previously known as the Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT)) is advancing SDR, on-board networking, and navigation technologies by conducting space experiments aboard the International Space Station. During its first year(s) on-orbit, the SCaN Testbed has achieved considerable accomplishments to better understand SDRs and their applications. The SDR platforms and software waveforms on each SDR have over 1500 hours of operation and are performing as designed. The Ka-band SDR on the SCaN Testbed is NASA's first space Ka-band transceiver and is NASA's first Ka-band mission using the Space Network. This has provided exciting opportunities to operate at Ka-band and assist with on-orbit tests of NASA's newest Tracking and Data Relay Satellites (TDRS). During its first year, SCaN Testbed completed its first on-orbit SDR reconfigurations. SDR reconfigurations occur when implementing new waveforms on an SDR. SDR reconfigurations allow a radio to change minor parameters, such as data rate, or complete functionality. New waveforms which provide new capability and are reusable across different missions provide long term value for reconfigurable platforms such as SDRs. The STRS Standard provides guidelines for new waveform development by third parties. Waveform development by organizations other than the platform provider offers NASA the ability to develop waveforms itself and reduce its dependence on, and costs associated with, the platform developer. Each of these new waveforms requires a waveform build environment for the particular SDR, helps assess the usefulness of the platform provider documentation, and exercises the objectives of the STRS Standard and the SCaN Testbed. There is considerable interest in conducting experiments using the SCaN Testbed from NASA, academia, commercial companies, and other space agencies. There are approximately 25 experiments or activities supported by the project underway or in development, with more proposals ready, as time and funding allow, and new experiment solicitations available. NASA continues development of new waveforms and applications in communications, networking, and navigation; the first university experimenters are beginning waveform development, which will support the next generation of communications engineers; and international interest is beginning with space agency partners from the European Space Agency (ESA) and the Centre National d'Etudes Spatiales (CNES). This paper will provide an overview of the SCaN Testbed and discuss its recent accomplishments and experiment activities. Its recent successes in Ka-band operations, reception of the newest GPS signals, SDR reconfigurations, and STRS demonstration in space, when combined with the future experiment portfolio, have positioned the SCaN Testbed to enable future space communications and navigation capabilities for exploration and science.

  9. Sylow p-groups of polynomial permutations on the integers mod p^n

    PubMed Central

    Frisch, Sophie; Krenn, Daniel

    2013-01-01

    We enumerate and describe the Sylow p-groups of the groups of polynomial permutations of the integers mod p^n for n ⩾ 1 and of the pro-finite group which is the projective limit of these groups. PMID:26869732

  10. Note on new KLT relations

    NASA Astrophysics Data System (ADS)

    Feng, Bo; He, Song; Huang, Rijun; Jia, Yin

    2010-10-01

    In this short note, we present two results about KLT relations discussed in several recent papers. Our first result is the re-derivation of the Mason-Skinner MHV amplitude by applying the S_{n-3} permutation symmetric KLT relations directly to the MHV amplitude. Our second result is the equivalence proof of the newly discovered S_{n-2} permutation symmetric KLT relations and the well-known S_{n-3} permutation symmetric KLT relations. Although both formulas have been shown to be correct by BCFW recursion relations, our result is the first direct check using the regularized definition of the new formula.

  11. Combating HER2-overexpressing breast cancer through induction of calreticulin exposure by Tras-Permut CrossMab

    PubMed Central

    Zhang, Fan; Zhang, Jie; Liu, Moyan; Zhao, Lichao; LingHu, RuiXia; Feng, Fan; Gao, Xudong; Jiao, Shunchang; Zhao, Lei; Hu, Yi; Yang, Junlan

    2015-01-01

    Although trastuzumab has succeeded in breast cancer treatment, acquired resistance is one of the prime obstacles for breast cancer therapies. There is an urgent need to develop novel HER2 antibodies against trastuzumab resistance. Here, we first rationally designed avidity-improved trastuzumab and pertuzumab variants, and explored the correlation between the binding avidity improvement and their antitumor activities. After characterization of a pertuzumab variant L56TY with potent antitumor activities, a bispecific immunoglobulin G-like CrossMab (Tras-Permut CrossMab) was generated from trastuzumab and the binding avidity-improved pertuzumab variant L56TY. Although the antitumor efficacy of trastuzumab was not enhanced by improving its binding avidity, avidity improvement could significantly increase the anti-proliferative and antibody-dependent cellular cytotoxicity (ADCC) activities of pertuzumab. Further studies showed that Tras-Permut CrossMab exhibited exceptionally high efficiency in inhibiting the progression of trastuzumab-resistant breast cancer. Notably, we found that calreticulin (CRT) exposure induced by Tras-Permut CrossMab was essential for induction of tumor-specific T cell immunity against tumor recurrence. These data indicated that simultaneous blockade of HER2 protein by Tras-Permut CrossMab could trigger CRT exposure and subsequently induce potent tumor-specific T cell immunity, suggesting it could be a promising therapeutic strategy against trastuzumab resistance. PMID:25949918

  12. Parallel approach on sorting of genes in search of optimal solution.

    PubMed

    Kumar, Pranav; Sahoo, G

    2018-05-01

    An important tool for comparative genome analysis is the rearrangement event that can transform one given genome into another. For finding a minimum sequence of fissions and fusions, we propose an algorithm and show a transformation example for converting the source genome into the target genome. The proposed algorithm uses a circular sequence, i.e. a "cycle graph", in place of a mapping. The main concept of the algorithm is based on the optimal result of a permutation. These sorting processes are performed in constant running time by representing the permutation in the form of cycles. In biological instances it has been observed that transposition occurs at half the frequency of reversal. In this paper we do not deal with reversal; instead we consider the rearrangements of fission, fusion, and transposition. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Repelling Point Bosons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGuire, J. B.

    2011-12-01

    There is a body of conventional wisdom that holds that a solvable quantum problem, by virtue of its solvability, is pathological and thus irrelevant. It has been difficult to refute this view owing to the paucity of theoretical constructs and experimental results. Recent experiments involving equivalent ions trapped in a spatial conformation of extreme anisotropic confinement (longitudinal extension tens, hundreds or even thousands of times transverse extension) have modified the view of relevancy, and it is now possible to consider systems previously thought pathological, in particular point Bosons that repel in one dimension. It has been difficult for the experimentalists to utilize existing theory, mainly due to long-standing theoretical misunderstanding of the relevance of the permutation group, in particular the non-commutativity of translations (periodicity) and transpositions (permutation). This misunderstanding is most easily rectified in the case of repelling Bosons.

  14. Permutation Entropy Applied to Movement Behaviors of Drosophila Melanogaster

    NASA Astrophysics Data System (ADS)

    Liu, Yuedan; Chon, Tae-Soo; Baek, Hunki; Do, Younghae; Choi, Jin Hee; Chung, Yun Doo

    Movement of different strains in Drosophila melanogaster was continuously observed by using computer interfacing techniques and was analyzed by permutation entropy (PE) after exposure to toxic chemicals, toluene (0.1 mg/m3) and formaldehyde (0.01 mg/m3). The PE values based on one-dimensional time series position (vertical) data were variable according to internal constraint (i.e. strains) and accordingly increased in response to external constraint (i.e. chemicals) by reflecting diversity in movement patterns from both normal and intoxicated states. Cross-correlation function revealed temporal associations between the PE values and between the component movement patterns in different chemicals and strains through the period of intoxication. The entropy based on the order of position data could be a useful means for complexity measure in behavioral changes and for monitoring the impact of stressors in environment.
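
    The ordinal-pattern entropy used above is straightforward to reproduce. The sketch below (Python/NumPy) computes Bandt-Pompe permutation entropy from a one-dimensional series; the embedding dimension, delay and toy series are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, D=3, tau=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D series.

    D is the embedding dimension (ordinal pattern length), tau the delay."""
    x = np.asarray(x, dtype=float)
    n_windows = len(x) - (D - 1) * tau
    if n_windows <= 0:
        raise ValueError("series too short for the chosen D and tau")

    # Count how often each ordinal (rank-order) pattern occurs.
    counts = {}
    for i in range(n_windows):
        pattern = tuple(np.argsort(x[i:i + D * tau:tau], kind="stable"))
        counts[pattern] = counts.get(pattern, 0) + 1

    p = np.array(list(counts.values()), dtype=float) / n_windows
    H = -np.sum(p * np.log(p))              # Shannon entropy of the patterns
    return H / np.log(factorial(D)) if normalize else H

# Illustration: white noise gives PE near 1, a monotone ramp gives PE of 0.
rng = np.random.default_rng(0)
print(permutation_entropy(rng.normal(size=5000), D=4))   # close to 1
print(permutation_entropy(np.arange(5000.0), D=4))       # 0.0
```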

  15. Communication: Fitting potential energy surfaces with fundamental invariant neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shao, Kejie; Chen, Jun; Zhao, Zhiqiang

    A more flexible neural network (NN) method using the fundamental invariants (FIs) as the input vector is proposed in the construction of potential energy surfaces for molecular systems involving identical atoms. Mathematically, FIs finitely generate the permutation invariant polynomial (PIP) ring. In combination with NN, fundamental invariant neural network (FI-NN) can approximate any function to arbitrary accuracy. Because FI-NN minimizes the size of input permutation invariant polynomials, it can efficiently reduce the evaluation time of potential energy, in particular for polyatomic systems. In this work, we provide the FIs for all possible molecular systems up to five atoms. Potential energy surfaces for OH₃ and CH₄ were constructed with FI-NN, with the accuracy confirmed by full-dimensional quantum dynamic scattering and bound state calculations.

  16. Passive Thermal Design Approach for the Space Communications and Navigation (SCaN) Testbed Experiment on the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Siamidis, John; Yuko, Jim

    2014-01-01

    The Space Communications and Navigation (SCaN) Program Office at NASA Headquarters oversees all of NASA's space communications activities. SCaN manages and directs the ground-based facilities and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and the Space Network (SN). Through the SCaN Program Office, NASA GRC developed a Software Defined Radio (SDR) testbed experiment (SCaN testbed experiment) for use on the International Space Station (ISS). It comprises three different SDR radios: the Jet Propulsion Laboratory (JPL) radio, the Harris Corporation radio, and the General Dynamics Corporation radio. The SCaN testbed experiment provides an on-orbit, adaptable, SDR Space Telecommunications Radio System (STRS)-based facility to conduct a suite of experiments to advance the Software Defined Radio and Space Telecommunications Radio Systems (STRS) standards, reduce risk (Technology Readiness Level (TRL) advancement) for candidate Constellation future space flight hardware/software, and demonstrate space communication links critical to future NASA exploration missions. The SCaN testbed project provides NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, software defined radio platforms and the STRS Architecture. The SCaN testbed is resident on the P3 Express Logistics Carrier (ELC) on the exterior truss of the International Space Station (ISS). The SCaN testbed payload launched on the Japanese Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV) and was installed on the ISS P3 ELC located on the inboard RAM P3 site. The daily operations and testing are managed out of NASA GRC in the Telescience Support Center (TSC).

  17. Space-based infrared scanning sensor LOS determination and calibration using star observation

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Xu, Zhan; An, Wei; Deng, Xin-Pu; Yang, Jun-Gang

    2015-10-01

    This paper provides a novel methodology for removing sensor bias from a space-based infrared (IR) system (SBIRS) through the use of stars detected in the background field of the sensor. A space-based IR system uses the target line of sight (LOS) for target location. LOS determination and calibration is a key precondition for accurate location and tracking of targets in a space-based IR system, and LOS calibration of a scanning sensor is one of the main difficulties. Subsequent changes in sensor bias are not taken into account in the conventional LOS determination and calibration process. Based on an analysis of the imaging process of the scanning sensor, a theoretical model that estimates the bias angles from star observations is proposed. A process model of the bias angles and an observation model of the stars are established, an extended Kalman filter (EKF) is used to estimate the bias angles, and the sensor LOS is then calibrated. Time-domain simulation results indicate that the proposed method has high precision and smooth performance for sensor LOS determination and calibration. The timeliness and precision requirements of the target tracking process in a space-based IR tracking system can be met with the proposed algorithm.

  18. Automatic NEPHIS Coding of Descriptive Titles for Permuted Index Generation.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    1982-01-01

    Describes a system for the automatic coding of most descriptive titles which generates Nested Phrase Indexing System (NEPHIS) input strings of sufficient quality for permuted index production. A series of examples and an 11-item reference list accompany the text. (JL)

  19. Satellite-based observations of tsunami-induced mesosphere airglow perturbations

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Ming; Verkhoglyadova, Olga; Mlynczak, Martin G.; Mannucci, Anthony J.; Meng, Xing; Langley, Richard B.; Hunt, Linda A.

    2017-01-01

    Tsunami-induced airglow emission perturbations were retrieved by using space-based measurements made by the Sounding of the Atmosphere using Broad-band Emission Radiometry (SABER) instrument on board the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics spacecraft. At and after the times of the Tohoku-Oki earthquake on 11 March 2011 and the Chile earthquake on 16 September 2015, the spacecraft was performing scans over the Pacific Ocean. Significant (~10% relative to the ambient emission profiles) and coherent nighttime airglow perturbations were observed in the mesosphere following SABER limb scans intercepting tsunami-induced atmospheric gravity waves. Simulations of emission variations are consistent with the physical characteristics of the disturbances at the locations of the corresponding SABER scans. Airglow observations and model simulations suggest that atmospheric neutral density and temperature perturbations can lead to the observed amplitude variations and multipeak structures in the emission profiles. This is the first time that airglow emission rate perturbations associated with tsunamis have been detected with space-based measurements.

  20. Multipinhole SPECT helical scan parameters and imaging volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Rutao, E-mail: rutaoyao@buffalo.edu; Deng, Xiao; Wei, Qingyang

    Purpose: The authors developed SPECT imaging capability on an animal PET scanner using a multiple-pinhole collimator and step-and-shoot helical data acquisition protocols. The objective of this work was to determine the preferred helical scan parameters, i.e., the angular and axial step sizes, and the imaging volume, that provide optimal imaging performance. Methods: The authors studied nine helical scan protocols formed by permuting three rotational and three axial step sizes. These step sizes were chosen around the reference values analytically calculated from the estimated spatial resolution of the SPECT system and the Nyquist sampling theorem. The nine helical protocols were evaluated by two figures of merit: the sampling completeness percentage (SCP) and the root-mean-square (RMS) resolution. SCP was an analytically calculated numerical index based on projection sampling. RMS resolution was derived from the reconstructed images of a sphere-grid phantom. Results: The RMS resolution results show that (1) the start and end pinhole planes of the helical scheme determine the axial extent of the effective field of view (EFOV), and (2) the diameter of the transverse EFOV is adequately calculated from the geometry of the pinhole opening, since the peripheral region beyond the EFOV would introduce projection multiplexing and its consequent effects. The RMS resolution results of the nine helical scan schemes show that optimal resolution is achieved when the axial step size is about half, and the angular step size about twice, the corresponding values derived from the Nyquist theorem. The SCP results agree in general with those of the RMS resolution but are less critical in assessing the effects of the helical parameters and EFOV. Conclusions: The authors quantitatively validated the effective FOV of multiple-pinhole helical scan protocols and proposed a simple method to calculate optimal helical scan parameters.
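
    As a rough illustration of how such Nyquist-based reference step sizes can be obtained, and of the reported adjustment (axial step halved, angular step roughly doubled), a hedged sketch follows; the function name, the example numbers and the simple arc-length argument are assumptions for illustration, not the authors' derivation.

```python
import math

def helical_step_sizes(resolution_mm, fov_radius_mm):
    """Reference helical-scan step sizes from the Nyquist criterion, plus the
    adjusted values the study reports as near-optimal (axial step halved,
    angular step roughly doubled).  The arc-length argument and the example
    numbers are illustrative assumptions."""
    # Nyquist: sample at least twice per resolvable distance.
    axial_nyquist_mm = resolution_mm / 2.0
    # Angular step whose arc length at the FOV edge equals half the resolvable distance.
    angular_nyquist_deg = math.degrees((resolution_mm / 2.0) / fov_radius_mm)
    return {
        "axial_nyquist_mm": axial_nyquist_mm,
        "angular_nyquist_deg": angular_nyquist_deg,
        "axial_suggested_mm": axial_nyquist_mm / 2.0,        # half the Nyquist value
        "angular_suggested_deg": angular_nyquist_deg * 2.0,  # about twice the Nyquist value
    }

# e.g. ~1.5 mm estimated system resolution and a 15 mm transverse FOV radius
print(helical_step_sizes(1.5, 15.0))
```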

  1. Non‐parametric combination and related permutation tests for neuroimaging

    PubMed Central

    Webster, Matthew A.; Brooks, Jonathan C.; Tracey, Irene; Smith, Stephen M.; Nichols, Thomas E.

    2016-01-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well‐known definition of union‐intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume‐based representations of the brain, including non‐imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non‐parametric combination (NPC) methodology, such that instead of a two‐phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one‐way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. Hum Brain Mapp 37:1486‐1511, 2016. © 2016 Wiley Periodicals, Inc. PMID:26848101
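
    A minimal sketch of the core idea, synchronized permutations plus Tippett's minimum-p combining for two modalities, is given below; the one-sample sign-flip tests, the toy data and all names are illustrative assumptions, not the toolbox implementation described in the paper.

```python
import numpy as np

def tippett_npc(Y1, Y2, n_perm=1999, seed=0):
    """Joint inference on two modalities with Tippett's combining function
    (minimum partial p-value) and synchronized sign-flip permutations.
    One-sample mean tests are used here purely for illustration."""
    rng = np.random.default_rng(seed)
    n = Y1.shape[0]

    def tstat(y):
        return np.mean(y) / (np.std(y, ddof=1) / np.sqrt(len(y)))

    # Row 0 holds the unpermuted statistics; every other row flips the SAME
    # signs in both modalities, preserving their dependence.
    T = np.empty((n_perm + 1, 2))
    T[0] = [tstat(Y1), tstat(Y2)]
    for b in range(1, n_perm + 1):
        s = rng.choice([-1.0, 1.0], size=n)
        T[b] = [tstat(s * Y1), tstat(s * Y2)]

    # Partial p-value of every row within its modality's permutation distribution.
    P = np.mean(T[None, :, :] >= T[:, None, :], axis=1)   # shape (n_perm + 1, 2)
    combined = P.min(axis=1)                               # Tippett statistic per row
    p_joint = np.mean(combined <= combined[0])             # joint (NPC) p-value
    return P[0], p_joint

# Toy data: modality 1 carries a real effect, modality 2 is pure noise.
rng = np.random.default_rng(1)
print(tippett_npc(rng.normal(0.5, 1, 40), rng.normal(0.0, 1, 40)))
```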

  2. Effects of space allowance on the behaviour of long-term housed shelter dogs.

    PubMed

    Normando, Simona; Contiero, Barbara; Marchesini, Giorgio; Ricci, Rebecca

    2014-03-01

    The aim of this study was to assess the effects of space allowance (4.5 m²/head vs. 9 m²/head) on the behaviour of shelter dogs (Canis familiaris) at different times of the day (from 10:30 to 13:30 vs. from 14:30 to 17:30), and the dogs' preference between two types of beds (fabric bed vs. plastic basket). Twelve neutered dogs (seven males and five females aged 3-8 years) housed in pairs were observed using a scan sampling recording method every 20 s for a total of 14,592 scans/treatment. An increase in space allowance increased general level of activity (risk ratio (RR)=1.34), standing (RR=1.37), positive social interactions (RR=2.14), visual exploration of the environment (RR=1.21), and vocalisations (RR=2.35). Dogs spent more time in the sitting (RR=1.39) or standing (RR=1.88) posture, in positive interactions (RR=1.85), and active visual exploration (RR=1.99) during the morning than in the afternoon. The dogs were more often observed in the fabric bed than in the plastic basket (53% vs. 15% of total scans, p<0.001). Results suggest that a 9.0 m²/head space allowance could be more beneficial to dogs than one of 4.5 m². Copyright © 2014 Elsevier B.V. All rights reserved.

  3. FPGA-based real-time swept-source OCT systems for B-scan live-streaming or volumetric imaging

    NASA Astrophysics Data System (ADS)

    Bandi, Vinzenz; Goette, Josef; Jacomet, Marcel; von Niederhäusern, Tim; Bachmann, Adrian H.; Duelk, Marcus

    2013-03-01

    We have developed a Swept-Source Optical Coherence Tomography (Ss-OCT) system with high-speed, real-time signal processing on a commercially available Data-Acquisition (DAQ) board with a Field-Programmable Gate Array (FPGA). The Ss-OCT system simultaneously acquires OCT and k-clock reference signals at 500MS/s. From the k-clock signal of each A-scan we extract a remap vector for the k-space linearization of the OCT signal. The linear but oversampled interpolation is followed by a 2048-point FFT, additional auxiliary computations, and a data transfer to a host computer for real-time, live-streaming of B-scan or volumetric C-scan OCT visualization. We achieve a 100 kHz A-scan rate by parallelization of our hardware algorithms, which run on standard and affordable, commercially available DAQ boards. Our main development tool for signal analysis as well as for hardware synthesis is MATLAB® with add-on toolboxes and 3rd-party tools.

  4. Creation of a Ligand-Dependent Enzyme by Fusing Circularly Permuted Antibody Variable Region Domains.

    PubMed

    Iwai, Hiroto; Kojima-Misaizu, Miki; Dong, Jinhua; Ueda, Hiroshi

    2016-04-20

    Allosteric control of enzyme activity with exogenous substances has been hard to achieve, especially using antibody domains that potentially allow control by any antigens of choice. Here, in order to attain this goal, we developed a novel antibody variable region format introduced with circular permutations, called Clampbody. The two variable-region domains of the anti-bone Gla protein (BGP) antibody were each circularly permuted to have novel termini at the loops near their domain interface. Through their attachment to the N- and C-termini of a circularly permuted TEM-1 β-lactamase (cpBLA), we created a molecular switch that responds to the antigen peptide. The fusion protein specifically recognized the antigen, and in the presence of some detergent or denaturant, its catalytic activity was enhanced up to 4.7-fold in an antigen-dependent manner, due to increased resistance to these reagents. Hence, Clampbody will be a powerful tool for the allosteric regulation of enzyme and other protein activities and especially useful to design robust biosensors.

  5. Multiple comparisons permutation test for image based data mining in radiotherapy

    PubMed Central

    2013-01-01

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy. PMID:24365155
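
    A compact sketch of a max-statistic (Tmax-style) permutation test on voxelized images follows; the choice of a two-sample t statistic and the synthetic data are assumptions made for illustration, since the abstract does not specify the exact statistic.

```python
import numpy as np

def tmax_permutation_test(dose_a, dose_b, n_perm=1000, seed=0):
    """Voxel-wise comparison of dose images from two outcome groups using a
    max-statistic permutation test (single-step control of the family-wise
    error rate).  dose_a: (n_a, n_voxels); dose_b: (n_b, n_voxels)."""
    rng = np.random.default_rng(seed)
    X = np.vstack([dose_a, dose_b])
    labels = np.array([0] * len(dose_a) + [1] * len(dose_b))

    def voxelwise_t(lab):
        a, b = X[lab == 0], X[lab == 1]
        se = np.sqrt(a.var(ddof=1, axis=0) / len(a) + b.var(ddof=1, axis=0) / len(b))
        return (a.mean(axis=0) - b.mean(axis=0)) / se

    t_obs = voxelwise_t(labels)
    tmax_obs = np.max(np.abs(t_obs))

    tmax_null = np.empty(n_perm)
    for i in range(n_perm):
        tmax_null[i] = np.max(np.abs(voxelwise_t(rng.permutation(labels))))

    p_global = (1 + np.sum(tmax_null >= tmax_obs)) / (n_perm + 1)
    # Adjusted per-voxel p-values: each voxel's |t| is referred to the Tmax null.
    p_adj = (1 + (tmax_null[:, None] >= np.abs(t_obs)[None, :]).sum(axis=0)) / (n_perm + 1)
    return p_global, p_adj

# Toy data: 20 vs. 20 patients, 500 voxels, a mean shift in the first 50 voxels.
rng = np.random.default_rng(2)
A = rng.normal(size=(20, 500)); A[:, :50] += 1.0
B = rng.normal(size=(20, 500))
p_global, p_adj = tmax_permutation_test(A, B)
print(p_global, int((p_adj < 0.05).sum()))
```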

  6. Techniques used for the analysis of oculometer eye-scanning data obtained from an air traffic control display

    NASA Technical Reports Server (NTRS)

    Crawford, Daniel J.; Burdette, Daniel W.; Capron, William R.

    1993-01-01

    The methodology and techniques used to collect and analyze look-point position data from a real-time ATC display-format comparison experiment are documented. That study compared the delivery precision and controller workload of three final approach spacing aid display formats. Using an oculometer, controller lookpoint position data were collected, associated with gaze objects (e.g., moving aircraft) on the ATC display, and analyzed to determine eye-scan behavior. The equipment involved and algorithms for saving, synchronizing with the ATC simulation output, and filtering the data are described. Target (gaze object) and cross-check scanning identification algorithms are also presented. Data tables are provided of total dwell times, average dwell times, and cross-check scans. Flow charts, block diagrams, file record descriptors, and source code are included. The techniques and data presented are intended to benefit researchers in other studies that incorporate non-stationary gaze objects and oculometer equipment.

  7. Discrete Bat Algorithm for Optimal Problem of Permutation Flow Shop Scheduling

    PubMed Central

    Luo, Qifang; Zhou, Yongquan; Xie, Jian; Ma, Mingzhi; Li, Liangliang

    2014-01-01

    A discrete bat algorithm (DBA) is proposed for the optimal permutation flow shop scheduling problem (PFSP). Firstly, the discrete bat algorithm is constructed based on the idea of the basic bat algorithm; it divides the whole scheduling problem into many sub-scheduling problems, and the NEH heuristic is then introduced to solve each sub-scheduling problem. Secondly, some subsequences are operated on with a certain probability in the pulse emission and loudness phases. An intensive virtual population neighborhood search is integrated into the discrete bat algorithm to further improve the performance. Finally, the experimental results show the suitability and efficiency of the present discrete bat algorithm for the optimal permutation flow shop scheduling problem. PMID:25243220

  8. Discrete bat algorithm for optimal problem of permutation flow shop scheduling.

    PubMed

    Luo, Qifang; Zhou, Yongquan; Xie, Jian; Ma, Mingzhi; Li, Liangliang

    2014-01-01

    A discrete bat algorithm (DBA) is proposed for the optimal permutation flow shop scheduling problem (PFSP). Firstly, the discrete bat algorithm is constructed based on the idea of the basic bat algorithm; it divides the whole scheduling problem into many sub-scheduling problems, and the NEH heuristic is then introduced to solve each sub-scheduling problem. Secondly, some subsequences are operated on with a certain probability in the pulse emission and loudness phases. An intensive virtual population neighborhood search is integrated into the discrete bat algorithm to further improve the performance. Finally, the experimental results show the suitability and efficiency of the present discrete bat algorithm for the optimal permutation flow shop scheduling problem.

  9. Levels of Conceptual Development in Melodic Permutation Concepts Based on Piaget's Theory

    ERIC Educational Resources Information Center

    Larn, Ronald L.

    1973-01-01

    Article considered different ways in which subjects at different age levels solved a musical task involving melodic permutation. The differences in responses to the musical task between age groups were judged to be compatible with Piaget's theory of cognitive development. (Author/RK)

  10. In Response to Rowland on "Realism and Debateability in Policy Advocacy."

    ERIC Educational Resources Information Center

    Herbeck, Dale A.; Katsulas, John P.

    1986-01-01

    Argues that Robert Rowland has overstated the case against the permutation process for assessing counterplan competitiveness. Claims that the permutation standard is a viable method for ascertaining counterplan competitiveness. Examines Rowland's alternative and argues that it is an unsatisfactory method for determining counterplan…

  11. Cardiac imaging with multi-sector data acquisition in volumetric CT: variation of effective temporal resolution and its potential clinical consequences

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Hsieh, Jiang; Taha, Basel H.; Vass, Melissa L.; Seamans, John L.; Okerlund, Darin R.

    2009-02-01

    With increasing longitudinal detector dimension available in diagnostic volumetric CT, the step-and-shoot scan is becoming popular for cardiac imaging. In comparison to the helical scan, the step-and-shoot scan decouples patient table movement from cardiac gating/triggering, which facilitates cardiac imaging via multi-sector data acquisition, as well as the management of inter-cycle heart beat variation (arrhythmia) and radiation dose efficiency. Ideally, a multi-sector data acquisition can improve temporal resolution by a factor equal to the number of sectors (best scenario). In reality, however, the effective temporal resolution is jointly determined by the gantry rotation speed and the patient's heart rate, and may be significantly lower than the ideal, or show no improvement at all (worst scenario). Hence, it is clinically relevant to investigate the behavior of the effective temporal resolution in cardiac imaging with multi-sector data acquisition. In this study, a 5-second cine scan of a porcine heart, which cascades 6 porcine cardiac cycles, is acquired. In addition to theoretical analysis and a motion phantom study, the clinical consequences due to the effective temporal resolution variation are evaluated qualitatively or quantitatively. By employing a 2-sector image reconstruction strategy, a total of 15 cases (the permutations P(6, 2)) between the best and worst scenarios are studied, providing informative guidance for the design and optimization of CT cardiac imaging in volumetric CT with multi-sector data acquisition.

  12. EPEPT: A web service for enhanced P-value estimation in permutation tests

    PubMed Central

    2011-01-01

    Background In computational biology, permutation tests have become a widely used tool to assess the statistical significance of an event under investigation. However, the common way of computing the P-value, which expresses the statistical significance, requires a very large number of permutations when small (and thus interesting) P-values are to be accurately estimated. This is computationally expensive and often infeasible. Recently, we proposed an alternative estimator, which requires far fewer permutations compared to the standard empirical approach while still reliably estimating small P-values [1]. Results The proposed P-value estimator has been enriched with additional functionalities and is made available to the general community through a public website and web service, called EPEPT. This means that the EPEPT routines can be accessed not only via a website, but also programmatically using any programming language that can interact with the web. Examples of web service clients in multiple programming languages can be downloaded. Additionally, EPEPT accepts data of various common experiment types used in computational biology. For these experiment types EPEPT first computes the permutation values and then performs the P-value estimation. Finally, the source code of EPEPT can be downloaded. Conclusions Different types of users, such as biologists, bioinformaticians and software engineers, can use the method in an appropriate and simple way. Availability http://informatics.systemsbiology.net/EPEPT/ PMID:22024252

  13. Permutation entropy of finite-length white-noise time series.

    PubMed

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ^{2} distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.

  14. High-resolution radiography by means of a hodoscope

    DOEpatents

    De Volpi, Alexander

    1978-01-01

    The fast neutron hodoscope, a device that produces neutron radiographs with coarse space resolution in a short time, is modified to produce neutron or gamma radiographs of relatively thick samples and with high space resolution. The modification comprises motorizing a neutron and gamma collimator to permit a controlled scanning pattern, simultaneous collection of data in a number of hodoscope channels over a period of time, and computerized image reconstruction of the data thus gathered.

  15. Hippocampal structure and human cognition: key role of spatial processing and evidence supporting the efficiency hypothesis in females

    PubMed Central

    Colom, Roberto; Stein, Jason L.; Rajagopalan, Priya; Martínez, Kenia; Hermel, David; Wang, Yalin; Álvarez-Linera, Juan; Burgaleta, Miguel; Quiroga, MªÁngeles; Shih, Pei Chun; Thompson, Paul M.

    2014-01-01

    Here we apply a method for automated segmentation of the hippocampus in 3D high-resolution structural brain MRI scans. One hundred and four healthy young adults completed twenty one tasks measuring abstract, verbal, and spatial intelligence, along with working memory, executive control, attention, and processing speed. After permutation tests corrected for multiple comparisons across vertices (p < .05) significant relationships were found for spatial intelligence, spatial working memory, and spatial executive control. Interactions with sex revealed significant relationships with the general factor of intelligence (g), along with abstract and spatial intelligence. These correlations were mainly positive for males but negative for females, which might support the efficiency hypothesis in women. Verbal intelligence, attention, and processing speed were not related to hippocampal structural differences. PMID:25632167

  16. A Scheme to Smooth Aggregated Traffic from Sensors with Periodic Reports

    PubMed Central

    Oh, Sungmin; Jang, Ju Wook

    2017-01-01

    The possibility of smoothing aggregated traffic from sensors with varying reporting periods and frame sizes to be carried on an access link is investigated. A straightforward optimization would take O(p^n) time, whereas our heuristic scheme takes O(np) time, where n and p denote the number of sensors and the size of the periods, respectively. Our heuristic scheme performs local optimization sensor by sensor, starting from the smallest period to the largest. This is based on the observation that sensors with larger periods have more choices of offsets to avoid traffic peaks than sensors with smaller periods. A MATLAB simulation shows that our scheme outperforms the known scheme by M. Grenier et al. in a similar situation (aggregating periodic traffic in a controller area network) for almost all possible permutations. The performance of our scheme is very close to that of the straightforward optimization, which compares all possible permutations. We expect that our scheme would greatly contribute to smoothing the traffic from an ever-increasing number of IoT sensors to the gateway, reducing the burden on the access link to the Internet. PMID:28273831
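
    A small sketch of the greedy, smallest-period-first offset assignment described above follows; the per-slot peak-load objective and all names are assumptions made for illustration rather than the paper's exact cost model.

```python
import numpy as np
from math import gcd
from functools import reduce

def assign_offsets(periods, frame_sizes):
    """Greedy offset assignment: visit sensors from the smallest to the largest
    period and give each one the offset that minimizes the current peak of the
    aggregated per-slot traffic over the hyperperiod."""
    hyperperiod = reduce(lambda a, b: a * b // gcd(a, b), periods)
    load = np.zeros(hyperperiod)              # aggregated traffic per time slot
    offsets = {}
    for i in sorted(range(len(periods)), key=lambda k: periods[k]):
        p, s = periods[i], frame_sizes[i]
        best_offset, best_peak = 0, None
        for o in range(p):                    # try every feasible offset
            trial = load.copy()
            trial[o::p] += s                  # this sensor sends one frame per period
            if best_peak is None or trial.max() < best_peak:
                best_offset, best_peak = o, trial.max()
        offsets[i] = best_offset
        load[best_offset::p] += s
    return offsets, load.max()

# Four sensors with periods of 4, 4, 8 and 8 slots and unit frame sizes.
print(assign_offsets([4, 4, 8, 8], [1, 1, 1, 1]))   # staggered offsets, peak load 1.0
```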

  17. MCMC genome rearrangement.

    PubMed

    Miklós, István

    2003-10-01

    As more and more genomes have been sequenced, genomic data are rapidly accumulating. Genome-wide mutations are believed to be more neutral than local mutations such as substitutions, insertions and deletions; therefore, phylogenetic investigations based on inversions, transpositions and inverted transpositions are less biased by the assumption of neutral evolution. Although efficient algorithms exist for obtaining the inversion distance of two signed permutations, there is no reliable algorithm when both inversions and transpositions are considered. Moreover, different types of mutations happen at different rates, and it is not clear how to weight them in a distance-based approach. We introduce a Markov chain Monte Carlo method for genome rearrangement based on a stochastic model of evolution, which can estimate the number of different evolutionary events needed to sort a signed permutation. The performance of the method was tested on simulated data, and the estimated numbers of different types of mutations were reliable. Human and Drosophila mitochondrial data were also analysed with the new method. The mixing time of the Markov chain is short both in terms of CPU time and number of proposals. The source code in C is available on request from the author.

  18. A Scheme to Smooth Aggregated Traffic from Sensors with Periodic Reports.

    PubMed

    Oh, Sungmin; Jang, Ju Wook

    2017-03-03

    The possibility of smoothing aggregated traffic from sensors with varying reporting periods and frame sizes to be carried on an access link is investigated. A straightforward optimization would take O(p^n) time, whereas our heuristic scheme takes O(np) time, where n and p denote the number of sensors and the size of the periods, respectively. Our heuristic scheme performs local optimization sensor by sensor, starting from the smallest period to the largest. This is based on the observation that sensors with larger periods have more choices of offsets to avoid traffic peaks than sensors with smaller periods. A MATLAB simulation shows that our scheme outperforms the known scheme by M. Grenier et al. in a similar situation (aggregating periodic traffic in a controller area network) for almost all possible permutations. The performance of our scheme is very close to that of the straightforward optimization, which compares all possible permutations. We expect that our scheme would greatly contribute to smoothing the traffic from an ever-increasing number of IoT sensors to the gateway, reducing the burden on the access link to the Internet.

  19. Trajectory Design Considerations for Exploration Mission 1

    NASA Technical Reports Server (NTRS)

    Dawn, Timothy F.; Gutkowski, Jeffrey P.; Batcha, Amelia L.; Williams, Jacob; Pedrotty, Samuel M.

    2018-01-01

    Exploration Mission 1 (EM-1) will be the first mission to send an uncrewed Orion Multi-Purpose Crew Vehicle (MPCV) to cislunar space in the fall of 2019. EM-1 was originally conceived as a lunar free-return mission, but was later changed to a Distant Retrograde Orbit (DRO) mission as a precursor to the Asteroid Redirect Mission. To understand the required mission performance (i.e., propellant requirement), a series of trajectory optimization runs was conducted using JSC's Copernicus spacecraft trajectory optimization tool. In order for the runs to be done in a timely manner, it was necessary to employ a parallelization approach on a computing cluster using a new trajectory scan tool written in Python. Details of the scan tool are provided and how it is used to perform the scans and post-process the results. Initially, a scan of daily due east launched EM-1 DRO missions in 2018 was made. Valid mission opportunities are ones that do not exceed the useable propellant available to perform the required burns. The initial scan data showed the propellant and delta-V performance patterns for each launch period. As questions were raised from different subsystems (e.g., power, thermal, communications, flight operations, etc.), the mission parameters or data that were of interest to them were added to the scan output data file. The additional data includes: (1) local launch and landing times in relation to sunrise and sunset, (2) length of eclipse periods during the in-space portion of the mission, (3) Earth line of sight from cislunar space, (4) Deep Space Network field of view looking towards cislunar space, and (5) variation of the downrange distance from Earth entry interface to splashdown. Mission design trades can also be performed based on the information that the additional data shows. For example, if the landing is in darkness, but the recovery operations team desires a landing in daylight, then an analysis is performed to determine how to change the mission design to meet this request. Also, subsystems request feasibility of alternate or contingency mission designs, such as adding an Orion main engine checkout burn or Orion completing all of its burns using only its auxiliary thrusters. This paper examines and presents the evolving trade studies that incorporate subsystem feedback and demonstrate the feasibility of these constrained mission trajectory designs and contingencies.

  20. SAR processing on the MPP

    NASA Technical Reports Server (NTRS)

    Batcher, K. E.; Eddey, E. E.; Faiss, R. O.; Gilmore, P. A.

    1981-01-01

    The processing of synthetic aperture radar (SAR) signals using the massively parallel processor (MPP) is discussed. The fast Fourier transform convolution procedures employed in the algorithms are described. The MPP architecture comprises an array unit (ARU) which processes arrays of data; an array control unit which controls the operation of the ARU and performs scalar arithmetic; a program and data management unit which controls the flow of data; and a unique staging memory (SM) which buffers and permutes data. The ARU contains a 128 by 128 array of bit-serial processing elements (PEs). Two-by-four subarrays of PEs are packaged in a custom VLSI HCMOS chip. The staging memory is a large multidimensional-access memory which buffers and permutes data flowing within the system. Efficient SAR processing is achieved via ARU communication paths and SM data manipulation. Real-time processing capability can be realized via a multiple-ARU, multiple-SM configuration.

  1. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.

    PubMed

    Groppe, David M; Urbach, Thomas P; Kutas, Marta

    2011-12-01

    Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.

  2. Signal processing applications of massively parallel charge domain computing devices

    NASA Technical Reports Server (NTRS)

    Fijany, Amir (Inventor); Barhen, Jacob (Inventor); Toomarian, Nikzad (Inventor)

    1999-01-01

    The present invention is embodied in a charge coupled device (CCD)/charge injection device (CID) architecture capable of performing a Fourier transform by simultaneous matrix vector multiplication (MVM) operations in respective plural CCD/CID arrays in parallel in O(1) steps. For example, in one embodiment, a first CCD/CID array stores charge packets representing a first matrix operator based upon permutations of a Hartley transform and computes the Fourier transform of an incoming vector. A second CCD/CID array stores charge packets representing a second matrix operator based upon different permutations of a Hartley transform and computes the Fourier transform of an incoming vector. The incoming vector is applied to the inputs of the two CCD/CID arrays simultaneously, and the real and imaginary parts of the Fourier transform are produced simultaneously in the time required to perform a single MVM operation in a CCD/CID array.
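
    The underlying identity, that the real and imaginary parts of the DFT can be produced by two fixed matrix-vector multiplications built from a Hartley (cas) kernel and its index-reversed (permuted) copy, can be checked numerically. The NumPy sketch below illustrates that mathematics only; it is not a model of the charge-domain device itself.

```python
import numpy as np

N = 8
n = np.arange(N)
theta = 2 * np.pi * np.outer(n, n) / N
cas = np.cos(theta) + np.sin(theta)       # Hartley (cas) kernel as a matrix
cas_perm = cas[(-n) % N, :]               # row-permuted kernel: row k -> row (N - k) mod N

# Two fixed operators; in the device each would be stored in one analog MVM array.
M_re = (cas + cas_perm) / 2               # equals the cosine kernel
M_im = (cas_perm - cas) / 2               # equals minus the sine kernel

x = np.random.default_rng(3).normal(size=N)
F_mvm = M_re @ x + 1j * (M_im @ x)        # real and imaginary parts from two MVMs
print(np.allclose(F_mvm, np.fft.fft(x)))  # True
```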

  3. Credit market Jitters in the course of the financial crisis: A permutation entropy approach in measuring informational efficiency in financial assets

    NASA Astrophysics Data System (ADS)

    Siokis, Fotios M.

    2018-06-01

    We explore the evolution of the informational efficiency for specific instruments of the U.S. money, bond and stock exchange markets, prior and after the outbreak of the Great Recession. We utilize the permutation entropy and the complexity-entropy causality plane to rank the time series and measure the degree of informational efficiency. We find that after the credit crunch and the collapse of Lehman Brothers the efficiency level of specific money market instruments' yield falls considerably. This is an evidence of less uncertainty included in predicting the related yields throughout the financial disarray. Similar trend is depicted in the indices of the stock exchange markets but efficiency remains in much higher levels. On the other hand, bond market instruments maintained their efficiency levels even after the outbreak of the crisis, which could be interpreted into greater randomness and less predictability of their yields.

  4. Introduction to Permutation and Resampling-Based Hypothesis Tests

    ERIC Educational Resources Information Center

    LaFleur, Bonnie J.; Greevy, Robert A.

    2009-01-01

    A resampling-based method of inference--permutation tests--is often used when distributional assumptions are questionable or unmet. Not only are these methods useful for obvious departures from parametric assumptions (e.g., normality) and small sample sizes, but they are also more robust than their parametric counterparts in the presence of…
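
    For readers who want a concrete instance of the resampling logic described here, the short sketch below runs a two-sample permutation test of a difference in means on synthetic data; the statistic and the toy data are illustrative choices.

```python
import numpy as np

def perm_test_mean_diff(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test of a difference in means.  Under the null
    the group labels are exchangeable, so the labels are re-randomized and the
    statistic recomputed to build the reference distribution."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    obs = x.mean() - y.mean()
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)
        null[i] = perm[:len(x)].mean() - perm[len(x):].mean()
    # Two-sided p-value, counting the observed statistic as one of the permutations.
    return (1 + np.sum(np.abs(null) >= abs(obs))) / (n_perm + 1)

rng = np.random.default_rng(4)
print(perm_test_mean_diff(rng.normal(0.8, 1, 12), rng.normal(0.0, 1, 12)))
```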

  5. Explorations in Statistics: Permutation Methods

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2012-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eighth installment of "Explorations in Statistics" explores permutation methods, empiric procedures we can use to assess an experimental result--to test a null hypothesis--when we are reluctant to trust statistical…

  6. Towards a novel look on low-frequency climate reconstructions

    NASA Astrophysics Data System (ADS)

    Kamenik, Christian; Goslar, Tomasz; Hicks, Sheila; Barnekow, Lena; Huusko, Antti

    2010-05-01

    Information on low-frequency (millennial to sub-centennial) climate change is often derived from sedimentary archives, such as peat profiles or lake sediments. Usually, these archives have non-annual and varying time resolution. Their dating is mainly based on radionuclides, which provide probabilistic age-depth relationships with complex error structures. Dating uncertainties impede the interpretation of sediment-based climate reconstructions. They complicate the calculation of time-dependent rates. In most cases, they make any calibration in time impossible. Sediment-based climate proxies are therefore often presented as a single, best-guess time series without proper calibration and error estimation. Errors along time and dating errors that propagate into the calculation of time-dependent rates are neglected. Our objective is to overcome the aforementioned limitations by using a 'swarm' or 'ensemble' of reconstructions instead of a single best-guess. The novelty of our approach is to take into account age-depth uncertainties by permuting through a large number of potential age-depth relationships of the archive of interest. For each individual permutation we can then calculate rates, calibrate proxies in time, and reconstruct the climate-state variable of interest. From the resulting swarm of reconstructions, we can derive realistic estimates of even complex error structures. The likelihood of reconstructions is visualized by a grid of two-dimensional kernels that take into account probabilities along time and the climate-state variable of interest simultaneously. For comparison and regional synthesis, likelihoods can be scored against other independent climate time series.

  7. Response Versus Scan-Angle Corrections for MODIS Reflective Solar Bands Using Deep Convective Clouds

    NASA Technical Reports Server (NTRS)

    Bhatt, Rajendra; Angal, Amit; Doelling, David R.; Xiong, Xiaoxiong; Wu, Aisheng; Haney, Conor O.; Scarino, Benjamin R.; Gopalan, Arun

    2016-01-01

    The absolute radiometric calibration of the reflective solar bands (RSBs) of Aqua- and Terra-MODIS is performed using on-board calibrators. A solar diffuser (SD) panel along with a solar diffuser stability monitor (SDSM) system, which tracks the performance of the SD over time, provides the absolute reference for calibrating the MODIS sensors. MODIS also views the moon and deep space through its space view (SV) port for lunar-based calibration and computing the zero input radiance, respectively. The MODIS instrument views the Earth's surface through a two-sided scan mirror, whose reflectance is a function of angle of incidence (AOI) and is described by response versus scan-angle (RVS). The RVS for both MODIS instruments was characterized prior to launch. MODIS also views the SD and the moon at two different assigned RVS positions. There is sufficient evidence that the RVS is changing on orbit over time and as a function of wavelength. The SD and lunar observation scans can only track the RVS variation at two RVS positions. Consequently, the MODIS Characterization Support Team (MCST) developed enhanced approaches that supplement the onboard calibrator measurements with responses from pseudo-invariant desert sites. This approach has been implemented in Level 1B (L1B) Collection 6 (C6) for selected short-wavelength bands. This paper presents an alternative approach of characterizing the mirror RVS to derive the time-dependent RVS correction factors for MODIS RSBs using tropical deep convective cloud (DCC) targets. An initial assessment of the DCC response from Aqua-MODIS band 1 C6 data indicates evidence of RVS artifacts, which are not uniform across the scans and are more prevalent in the left side Earth-view scans.

  8. Response Versus Scan-Angle Corrections for MODIS Reflective Solar Bands Using Deep Convective Clouds

    NASA Technical Reports Server (NTRS)

    Bhatt, Rajendra; Angal, Amit; Doelling, David R.; Xiong, Xiaoxiong; Wu, Aisheng; Haney, Conor O.; Scarino, Benjamin R.; Gopalan, Arun

    2016-01-01

    The absolute radiometric calibration of the reflective solar bands (RSBs) of Aqua- and Terra-MODIS is performed using on-board calibrators. A solar diffuser (SD) panel along with a solar diffuser stability monitor (SDSM) system, which tracks the performance of the SD over time, provides the absolute reference for calibrating the MODIS sensors. MODIS also views the moon and deep space through its space view (SV) port for lunar-based calibration and computing the zero input radiance, respectively. The MODIS instrument views the Earth's surface through a two-sided scan mirror, whose reflectance is a function of angle of incidence (AOI) and is described by response versus scan-angle (RVS). The RVS for both MODIS instruments was characterized prior to launch. MODIS also views the SD and the moon at two different assigned RVS positions. There is sufficient evidence that the RVS is changing on orbit over time and as a function of wavelength. The SD and lunar observation scans can only track the RVS variation at two RVS positions. Consequently, the MODIS Characterization Support Team (MCST) developed enhanced approaches that supplement the onboard calibrator measurements with responses from pseudo-invariant desert sites. This approach has been implemented in Level 1B (L1B) Collection 6 (C6) for selected short-wavelength bands. This paper presents an alternative approach of characterizing the mirror RVS to derive the time-dependent RVS correction factors for MODIS RSBs using tropical deep convective cloud (DCC) targets. An initial assessment of the DCC response from Aqua-MODIS band 1 C6 data indicates evidence of RVS artifacts, which are not uniform across the scans and are more prevalent in the left side Earth-view scans.

  9. Response versus scan-angle corrections for MODIS reflective solar bands using deep convective clouds

    NASA Astrophysics Data System (ADS)

    Bhatt, Rajendra; Angal, Amit; Doelling, David R.; Xiong, Xiaoxiong; Wu, Aisheng; Haney, Conor O.; Scarino, Benjamin R.; Gopalan, Arun

    2016-05-01

    The absolute radiometric calibration of the reflective solar bands (RSBs) of Aqua- and Terra-MODIS is performed using on-board calibrators. A solar diffuser (SD) panel along with a solar diffuser stability monitor (SDSM) system, which tracks the degradation of the SD over time, provides the baseline for calibrating the MODIS sensors. MODIS also views the moon and deep space through its space view (SV) port for lunar-based calibration and computing the background, respectively. The MODIS instrument views the Earth's surface using a two-sided scan mirror, whose reflectance is a function of the angle of incidence (AOI) and is described by response versus scan-angle (RVS). The RVS for both MODIS instruments was characterized prior to launch. MODIS also views the SD and the moon at two different AOIs. There is sufficient evidence that the RVS is changing on orbit over time and as a function of wavelength. The SD and lunar observation scans can only track the RVS variation at two AOIs. Consequently, the MODIS Characterization Support Team (MCST) developed enhanced approaches that supplement the onboard calibrator measurements with responses from the pseudo-invariant desert sites. This approach has been implemented in Level 1B (L1B) Collection 6 (C6) for select short-wavelength bands. This paper presents an alternative approach of characterizing the mirror RVS to derive the time-dependent RVS correction factors for MODIS RSBs using tropical deep convective cloud (DCC) targets. An initial assessment of the DCC response from Aqua-MODIS band 1 C6 data indicates evidence of RVS artifacts, which are not uniform across the scans and are more prevalent at the beginning of the earth-view scan.

  10. On the rank-distance median of 3 permutations.

    PubMed

    Chindelevitch, Leonid; Pereira Zanetti, João Paulo; Meidanis, João

    2018-05-08

    Recently, Pereira Zanetti, Biller and Meidanis have proposed a new definition of a rearrangement distance between genomes. In this formulation, each genome is represented as a matrix, and the distance d is the rank distance between these matrices. Although defined in terms of matrices, the rank distance is equal to the minimum total weight of a series of weighted operations that leads from one genome to the other, including inversions, translocations, transpositions, and others. The computational complexity of the median-of-three problem according to this distance is currently unknown. The genome matrices are a special kind of permutation matrices, which we study in this paper. In their paper, the authors provide an [Formula: see text] algorithm for determining three candidate medians, prove the tight approximation ratio [Formula: see text], and provide a sufficient condition for their candidates to be true medians. They also conduct some experiments that suggest that their method is accurate on simulated and real data. In this paper, we extend their results and provide the following: three invariants characterizing the problem of finding the median of 3 matrices; a sufficient condition for uniqueness of medians that can be checked in O(n); a faster, [Formula: see text] algorithm for determining the median under this condition; a new heuristic algorithm for this problem based on compressed sensing; and a [Formula: see text] algorithm that exactly solves the problem when the inputs are orthogonal matrices, a class that includes both permutations and genomes as special cases. Our work provides the first proof that, with respect to the rank distance, the problem of finding the median of 3 genomes, as well as the median of 3 permutations, is exactly solvable in polynomial time, a result which should be contrasted with its NP-hardness for the DCJ (double cut-and-join) distance and most other families of genome rearrangement operations. This result, backed by our experimental tests, indicates that the rank distance is a viable alternative to the DCJ distance widely used in genome comparisons.
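
    The rank distance itself is simple to compute; the small sketch below evaluates d(A, B) = rank(A - B) for permutation matrices, with arbitrary example permutations (a k-cycle contributes k - 1 to the distance).

```python
import numpy as np

def perm_matrix(p):
    """Permutation matrix P with P[i, p[i]] = 1."""
    P = np.zeros((len(p), len(p)), dtype=int)
    P[np.arange(len(p)), p] = 1
    return P

def rank_distance(A, B):
    """Rank distance between two matrices: d(A, B) = rank(A - B)."""
    return np.linalg.matrix_rank(A - B)

identity = perm_matrix([0, 1, 2, 3, 4])
one_swap = perm_matrix([1, 0, 2, 3, 4])   # a single transposition (one 2-cycle)
cycle    = perm_matrix([1, 2, 3, 4, 0])   # one 5-cycle

print(rank_distance(identity, identity))  # 0
print(rank_distance(identity, one_swap))  # 1  (a k-cycle contributes k - 1)
print(rank_distance(identity, cycle))     # 4
```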

  11. Application of dot-matrix illumination of liquid crystal phase space light modulator in 3D imaging of APD array

    NASA Astrophysics Data System (ADS)

    Wang, Shuai; Sun, Huayan; Guo, Huichao

    2018-01-01

    Aiming at the problem of beam scanning with a low-resolution APD array in three-dimensional imaging, a beam-scanning method based on a liquid crystal phase spatial light modulator is proposed to realize high-resolution imaging with a low-resolution APD array. First, a liquid crystal phase spatial light modulator is used to generate a beam array, and the beam array is then scanned. Since the divergence angle of each sub-beam in the array is smaller than the field angle of a single pixel in the APD array, each APD pixel responds only to the three-dimensional information at the beam illumination position. By scanning the beam array, a single pixel collects the target's three-dimensional information multiple times, thereby improving the resolution of the APD detector. Finally, the algorithm is simulated in MATLAB using two-dimensional scalar diffraction theory, realizing beam splitting and scanning with a resolution of 5 x 5, and the feasibility is verified theoretically.

  12. Epidemiology and spatio-temporal analysis of West Nile virus in horses in Spain between 2010 and 2016.

    PubMed

    García-Bocanegra, I; Belkhiria, J; Napp, S; Cano-Terriza, D; Jiménez-Ruiz, S; Martínez-López, B

    2018-04-01

    During the last decade, West Nile virus (WNV) outbreaks have increased sharply in both horses and human in Europe. The aims of this study were to evaluate characteristics and spatio-temporal distribution of WNV outbreaks in horses in Spain between 2010 and 2016 in order to identify the environmental variables most associated with WNV occurrence and to generate high-resolution WNV suitability maps to inform risk-based surveillance strategies in this country. Between August 2010 and November 2016, a total of 403 WNV suspected cases were investigated, of which, 177 (43.9%) were laboratory confirmed. Mean values of morbidity, mortality and case fatality rates were 7.5%, 1.6% and 21.2%, respectively. The most common clinical symptoms were as follows: tiredness/apathy, recumbency, muscular tremor, ataxia, incoordination and hyperaesthesia. The outbreaks confirmed during the last 7 years, with detection of WNV RNA lineage 1 in 2010, 2012, 2013, 2015 and 2016, suggest an endemic circulation of the virus in Spain. The spatio-temporal distribution of WNV outbreaks in Spain was not homogeneous, as most of them (92.7%) were concentrated in western part of Andalusia (southern Spain) and significant clusters were detected in this region in two non-consecutive years. These findings were supported by the results of the space-time scan statistics permutation model. A presence-only MaxEnt ecological niche model was used to generate a suitability map for WNV occurrence in Andalusia. The most important predictors selected by the Ecological Niche Modeling were as follows: mean annual temperature (49.5% contribution), presence of Culex pipiens (19.5% contribution), mean annual precipitation (16.1% contribution) and distance to Ramsar wetlands (14.9% contribution). Our results constitute an important step for understanding WNV emergence and spread in Spain and will provide valuable information for the development of more cost-effective surveillance and control programmes and improve the protection of horse and human populations in WNV-endemic areas. © 2017 Blackwell Verlag GmbH.

  13. Rotating-unbalanced-mass Devices for Scanning Balloon-borne Experiments, Free-flying Spacecraft, and Space Shuttle/space Station Experiments

    NASA Technical Reports Server (NTRS)

    Polites, Michael E.

    1990-01-01

    A new method is presented for scanning balloon-borne experiments, free-flying spacecraft, and gimballed experiments mounted to the space shuttle or the space station. It uses rotating-unbalanced-mass (RUM) devices for generating circular, line, or raster scan patterns and an auxiliary control system for target acquisition, keeping the scan centered on the target, and producing complementary motion for raster scanning. It is ideal for applications where the only possible way to accomplish the required scan is to physically scan the entire experiment or spacecraft as in x ray and gamma ray experiments. In such cases, this new method should have advantages over prior methods in terms of either power, weight, cost, performance, stability, or a combination of these.

  14. Medical data sheet in safe havens - A tri-layer cryptic solution.

    PubMed

    Praveenkumar, Padmapriya; Amirtharajan, Rengarajan; Thenmozhi, K; Balaguru Rayappan, John Bosco

    2015-07-01

    Secured sharing of the diagnostic reports and scan images of patients among doctors with complementary expertise for collaborative treatment will help to provide maximum care through faster and decisive decisions. In this context, a tri-layer cryptic solution has been proposed and implemented on Digital Imaging and Communications in Medicine (DICOM) images to establish a secured communication for effective referrals among peers without compromising the privacy of patients. In this approach, a blend of three cryptic schemes, namely the Latin square image cipher (LSIC), the discrete Gould transform (DGT) and Rubik's encryption, has been adopted. Among them, LSIC provides better substitution, confusion and shuffling of the image blocks; DGT incorporates tamper proofing with authentication; and Rubik's encryption renders a permutation of DICOM image pixels. The developed algorithm has been successfully implemented and tested in both software (MATLAB 7) and hardware Universal Software Radio Peripheral (USRP) environments. Specifically, the encrypted data were tested by transmitting them through an additive white Gaussian noise (AWGN) channel model. Furthermore, the robustness of the implemented algorithm was validated by employing standard metrics such as the unified average changing intensity (UACI), the number of pixels change rate (NPCR), correlation values and histograms. The estimated metrics have also been compared with those of existing methods and are superior in terms of a large key space to defy brute-force attack, resistance to cropping attack, strong key sensitivity and a uniform pixel-value distribution after encryption. Copyright © 2015 Elsevier Ltd. All rights reserved.
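
    The two differential metrics mentioned, NPCR and UACI, follow standard definitions and are easy to reproduce; the sketch below applies them to random stand-in images (ideal 8-bit ciphers give roughly 99.6% and 33.5%, respectively), which are assumptions used only for illustration.

```python
import numpy as np

def npcr_uaci(c1, c2):
    """Standard differential metrics for two cipher images of the same scene.
    NPCR: percentage of pixel positions whose values differ.
    UACI: mean absolute intensity difference as a percentage of 255."""
    c1 = c1.astype(np.int64)
    c2 = c2.astype(np.int64)
    npcr = 100.0 * np.mean(c1 != c2)
    uaci = 100.0 * np.mean(np.abs(c1 - c2) / 255.0)
    return npcr, uaci

# Random stand-in cipher images; ideal 8-bit ciphers give NPCR ~ 99.6% and UACI ~ 33.5%.
rng = np.random.default_rng(5)
c1 = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
c2 = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print(npcr_uaci(c1, c2))
```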

  15. Scanning ultrafast electron microscopy

    PubMed Central

    Yang, Ding-Shyue; Mohammed, Omar F.; Zewail, Ahmed H.

    2010-01-01

    Progress has been made in the development of four-dimensional ultrafast electron microscopy, which enables space-time imaging of structural dynamics in the condensed phase. In ultrafast electron microscopy, the electrons are accelerated, typically to 200 keV, and the microscope operates in the transmission mode. Here, we report the development of scanning ultrafast electron microscopy using a field-emission-source configuration. Scanning with pulses is performed in the single-electron mode, in which each pulse contains at most one or a few electrons, thus achieving imaging without the space-charge effect between electrons while still completing a scan in tens of seconds. For imaging, the secondary electrons from surface structures are detected, as demonstrated here for material surfaces and biological specimens. By recording backscattered electrons, diffraction patterns from single crystals were also obtained. Scanning pulsed-electron microscopy with the acquired spatiotemporal resolutions, and its efficient heat-dissipation feature, is now poised to provide in situ 4D imaging with environmental capability. PMID:20696933

  16. A Real-Time High Performance Data Compression Technique For Space Applications

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu; Venbrux, Jack; Bhatia, Prakash; Miller, Warner H.

    2000-01-01

    A high performance lossy data compression technique is currently being developed for space science applications under the requirement of high-speed push-broom scanning. The technique is also error-resilient in that error propagation is contained within a few scan lines. The algorithm is based on block-transform combined with bit-plane encoding; this combination results in an embedded bit string with exactly the desirable compression rate. The lossy coder is described. The compression scheme performs well on a suite of test images typical of images from spacecraft instruments. Hardware implementations are in development; a functional chip set is expected by the end of 2001.

  17. Permutation modulation for quantization and information reconciliation in CV-QKD systems

    NASA Astrophysics Data System (ADS)

    Daneshgaran, Fred; Mondin, Marina; Olia, Khashayar

    2017-08-01

    This paper is focused on the problem of Information Reconciliation (IR) for continuous variable Quantum Key Distribution (QKD). The main problem is quantization and assignment of labels to the samples of the Gaussian variables observed at Alice and Bob. The difficulty is that most of the samples, assuming that the Gaussian variable is zero mean, which is de facto the case, tend to have small magnitudes and are easily disturbed by noise. Transmission over longer and longer distances increases the losses, corresponding to a lower effective Signal to Noise Ratio (SNR) and exacerbating the problem. Here we propose to use Permutation Modulation (PM) as a means of quantization of Gaussian vectors at Alice and Bob over a d-dimensional space with d ≫ 1. The goal is to achieve the necessary coding efficiency to extend the achievable range of continuous variable QKD by quantizing over larger and larger dimensions. A fractional bit rate per sample is easily achieved using PM at very reasonable computational cost. Order statistics are used extensively throughout the development, from generation of the seed vector in PM to analysis of error rates associated with the signs of the Gaussian samples at Alice and Bob as a function of the magnitude of the observed samples at Bob.
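
    A minimal sketch of the permutation-modulation quantization step described above, under the assumption of a fixed seed vector shared by Alice and Bob: each d-dimensional Gaussian block is mapped to the permutation of the seed that matches its rank ordering, and it is the index of that permutation that is subsequently reconciled. All names, the block size and the seed values are illustrative.

      # Hypothetical sketch of PM quantization of a Gaussian block.
      import numpy as np

      def pm_quantize(x, seed):
          """Return the permutation of the (sorted) seed that follows x's rank order."""
          seed = np.sort(seed)                  # canonical ascending seed vector
          ranks = np.argsort(np.argsort(x))     # rank of each component of x
          return seed[ranks]                    # PM codeword closest to x

      rng = np.random.default_rng(0)
      d = 8                                     # dimension of the PM block (assumption)
      x = rng.normal(size=d)                    # Gaussian samples observed at Alice
      seed = np.linspace(-1.5, 1.5, d)          # example seed vector (assumption)
      codeword = pm_quantize(x, seed)           # a permutation of the seed values

    Matching sorted orders gives the Euclidean-nearest codeword among all permutations of the seed, which is why ordering (rank) information is the natural quantity to reconcile.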

  18. NASA Thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The NASA Thesaurus -- Volume 2, Access Vocabulary -- contains an alphabetical listing of all Thesaurus terms (postable and nonpostable) and permutations of all multiword and pseudo-multiword terms. Also included are Other Words (non-Thesaurus terms) consisting of abbreviations, chemical symbols, etc. The permutations and Other Words provide 'access' to the appropriate postable entries in the Thesaurus.
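
    A minimal, purely illustrative sketch of the idea behind such a permuted access vocabulary, assuming nothing about NASA's actual tooling: every word of a multiword term becomes an entry that points back to the full postable term.

      # Toy permuted ("access") index; the terms are examples, not Thesaurus content.
      terms = ["SPACE SHUTTLE ORBITERS", "X RAY ASTRONOMY", "REACTION WHEELS"]
      access_vocabulary = {}
      for term in terms:
          for word in term.split():
              access_vocabulary.setdefault(word, []).append(term)

      print(access_vocabulary["RAY"])   # ['X RAY ASTRONOMY']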

  19. A Permutation Test for Correlated Errors in Adjacent Questionnaire Items

    ERIC Educational Resources Information Center

    Hildreth, Laura A.; Genschel, Ulrike; Lorenz, Frederick O.; Lesser, Virginia M.

    2013-01-01

    Response patterns are of importance to survey researchers because of the insight they provide into the thought processes respondents use to answer survey questions. In this article we propose the use of structural equation modeling to examine response patterns and develop a permutation test to quantify the likelihood of observing a specific…

  20. The Parity Theorem Shuffle

    ERIC Educational Resources Information Center

    Smith, Michael D.

    2016-01-01

    The Parity Theorem states that any permutation can be written as a product of transpositions, but no permutation can be written as a product of both an even number and an odd number of transpositions. Most proofs of the Parity Theorem take several pages of mathematical formalism to complete. This article presents an alternative but equivalent…

  1. Heuristic Implementation of Dynamic Programming for Matrix Permutation Problems in Combinatorial Data Analysis

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie

    2008-01-01

    Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…

  2. Permutation Entropy and Signal Energy Increase the Accuracy of Neuropathic Change Detection in Needle EMG

    PubMed Central

    2018-01-01

    Background and Objective. Needle electromyography can be used to detect changes in the number and morphology of motor unit potentials in patients with axonal neuropathy. General mathematical methods of pattern recognition and signal analysis were applied to recognize neuropathic changes. This study validates the possibility of extending and refining turns-amplitude analysis using permutation entropy and signal energy. Methods. In this study, we examined needle electromyography in 40 neuropathic individuals and 40 controls. The number of turns, the amplitude between turns, signal energy, and permutation entropy were used as features for support vector machine classification. Results. The results demonstrated superior classification performance for the combination of all of the above-mentioned features compared with combinations of fewer features. Of the tested feature combinations, peak-ratio analysis had the lowest accuracy. Conclusion. Combining permutation entropy with signal energy, the number of turns, and the mean amplitude in SVM classification can refine the diagnosis of polyneuropathies examined by needle electromyography. PMID:29606959
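
    A minimal sketch of the two added features named above, permutation entropy and signal energy, computed for a single EMG channel. The embedding order, delay and all names are illustrative assumptions; the turns/amplitude features and the SVM step are omitted.

      # Hypothetical sketch of permutation entropy (Bandt-Pompe style) and signal energy.
      import numpy as np
      from itertools import permutations

      def permutation_entropy(x, order=3, delay=1):
          """Normalized permutation entropy of a 1-D signal."""
          patterns = list(permutations(range(order)))
          counts = np.zeros(len(patterns))
          n = len(x) - (order - 1) * delay
          for i in range(n):
              window = x[i:i + order * delay:delay]
              counts[patterns.index(tuple(np.argsort(window)))] += 1
          p = counts[counts > 0] / n
          return float(-np.sum(p * np.log2(p)) / np.log2(len(patterns)))

      def signal_energy(x):
          return float(np.sum(np.square(x)))

      emg = np.random.default_rng(1).normal(size=2000)            # stand-in for an EMG trace
      features = [permutation_entropy(emg), signal_energy(emg)]   # e.g. part of an SVM feature vector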

  3. Multi-scale symbolic transfer entropy analysis of EEG

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-10-01

    From both global and local perspectives, we symbolize two kinds of EEG and analyze their dynamic and asymmetrical information using multi-scale transfer entropy. A multi-scale process with scale factors from 1 to 199 in steps of 2 is applied to EEG of healthy people and epileptic patients, and then permutation symbolization with an embedding dimension of 3 and a global symbolization approach are used to symbolize the sequences. The forward and reverse symbol sequences are taken as the inputs of transfer entropy. The scale factor intervals in which the two kinds of EEG show satisfactory entropy distinctions are (37, 57) for permutation symbolization and (65, 85) for the global approach. At a scale factor of 67, the transfer entropies of the healthy and epileptic subjects under permutation symbolization, 0.1137 and 0.1028, show the largest difference; the corresponding values for global symbolization are 0.0641 and 0.0601, at a scale factor of 165. The results show that permutation symbolization, which takes the contribution of local information into account, discriminates better and is more effectively applied in our multi-scale transfer entropy analysis of EEG.
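
    A minimal sketch of the two preprocessing steps described above, coarse-graining at a given scale factor and permutation symbolization with embedding dimension 3; the transfer-entropy estimation itself is omitted, and the names and the example scale factor are illustrative assumptions.

      # Hypothetical sketch of multi-scale coarse-graining followed by ordinal symbolization.
      import numpy as np

      def coarse_grain(x, scale):
          """Average non-overlapping windows of length `scale` (multi-scale step)."""
          n = len(x) // scale
          return x[:n * scale].reshape(n, scale).mean(axis=1)

      def permutation_symbols(x, dim=3):
          """Map each length-`dim` window to a code identifying its ordinal pattern."""
          windows = np.lib.stride_tricks.sliding_window_view(x, dim)
          order = np.argsort(windows, axis=1)
          return order[:, 0] * dim + order[:, 1]   # codes are unique for dim == 3

      eeg = np.random.default_rng(2).normal(size=4000)             # stand-in EEG channel
      symbols = permutation_symbols(coarse_grain(eeg, scale=67))   # forward symbol sequence
      # The forward and time-reversed symbol sequences would feed a transfer-entropy estimator.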

  4. Symmetric encryption algorithms using chaotic and non-chaotic generators: A review

    PubMed Central

    Radwan, Ahmed G.; AbdElHaleem, Sherif H.; Abd-El-Hafiz, Salwa K.

    2015-01-01

    This paper summarizes the symmetric image encryption results of 27 different algorithms, which include substitution-only, permutation-only or both phases. The cores of these algorithms are based on several discrete chaotic maps (Arnold's cat map and a combination of three generalized maps), one continuous chaotic system (Lorenz) and two non-chaotic generators (fractals and chess-based algorithms). Each algorithm has been analyzed by the correlation coefficients between pixels (horizontal, vertical and diagonal), differential attack measures, Mean Square Error (MSE), entropy, sensitivity analyses and the 15 standard tests of the National Institute of Standards and Technology (NIST) SP-800-22 statistical suite. The analyzed algorithms include a set of new image encryption algorithms based on non-chaotic generators, using either substitution only (fractals), permutation only (chess-based) or both. Moreover, two different permutation scenarios are presented, in which the permutation phase either does or does not depend on the input image through an ON/OFF switch. Different encryption-key lengths and complexities are provided, from short to long keys, to resist brute-force attacks. In addition, the sensitivities of these different techniques to a one-bit change in the input parameters of the substitution key as well as the permutation key are assessed. Finally, a comparative discussion of this work versus much recent research with respect to the generators used, type of encryption, and analyses is presented to highlight the strengths and added contribution of this paper. PMID:26966561

  5. Development of isothermal-isobaric replica-permutation method for molecular dynamics and Monte Carlo simulations and its application to reveal temperature and pressure dependence of folded, misfolded, and unfolded states of chignolin

    NASA Astrophysics Data System (ADS)

    Yamauchi, Masataka; Okumura, Hisashi

    2017-11-01

    We developed a two-dimensional replica-permutation molecular dynamics method in the isothermal-isobaric ensemble. The replica-permutation method is a better alternative to the replica-exchange method. It was originally developed in the canonical ensemble. This method employs the Suwa-Todo algorithm, instead of the Metropolis algorithm, to perform permutations of temperatures and pressures among more than two replicas so that the rejection ratio can be minimized. We showed that the isothermal-isobaric replica-permutation method achieves better sampling efficiency than the isothermal-isobaric replica-exchange method and the infinite swapping method. We applied this method to a β-hairpin mini protein, chignolin. In this simulation, we observed not only the folded state but also the misfolded state. We calculated the temperature and pressure dependence of the fractions of the folded, misfolded, and unfolded states. Differences in partial molar enthalpy, internal energy, entropy, partial molar volume, and heat capacity were also determined and agreed well with experimental data. We observed a new phenomenon in which misfolded chignolin becomes more stable under high-pressure conditions. We also revealed the mechanism of this stability as follows: the TYR2 and TRP9 side chains cover the hydrogen bonds that form the β-hairpin structure, and these hydrogen bonds are protected from the water molecules that approach the protein as the pressure increases.

  6. A studentized permutation test for three-arm trials in the 'gold standard' design.

    PubMed

    Mütze, Tobias; Konietschke, Frank; Munk, Axel; Friede, Tim

    2017-03-15

    The 'gold standard' design for three-arm trials refers to trials with an active control and a placebo control in addition to the experimental treatment group. This trial design is recommended when ethically justifiable, as it allows the simultaneous comparison of experimental treatment, active control, and placebo. Parametric testing methods have been studied extensively in recent years. However, these methods often tend to be liberal or conservative when distributional assumptions are not met, particularly with small sample sizes. In this article, we introduce a studentized permutation test for testing non-inferiority and superiority of the experimental treatment compared with the active control in three-arm trials in the 'gold standard' design. The performance of the studentized permutation test for finite sample sizes is assessed in a Monte Carlo simulation study under various parameter constellations. Emphasis is put on whether the studentized permutation test meets the target significance level. For comparison purposes, commonly used Wald-type tests, which do not make any distributional assumptions, are included in the simulation study. The simulation study shows that the presented studentized permutation test for assessing non-inferiority in three-arm trials in the 'gold standard' design outperforms its competitors, for instance the test based on a quasi-Poisson model, for count data. The methods discussed in this paper are implemented in the R package ThreeArmedTrials, which is available on the Comprehensive R Archive Network (CRAN). Copyright © 2016 John Wiley & Sons, Ltd.
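
    The paper's test addresses non-inferiority and superiority contrasts in a three-arm design and is available in the ThreeArmedTrials R package; the simplified two-sample sketch below only illustrates the underlying principle of permuting group labels while using a studentized (Welch-type) statistic. The function names and the Poisson example data are assumptions.

      # Hypothetical sketch of a studentized two-sample permutation test.
      import numpy as np

      def welch_t(a, b):
          """Studentized statistic for a difference in means."""
          return (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))

      def studentized_permutation_test(a, b, n_perm=10000, seed=0):
          rng = np.random.default_rng(seed)
          t_obs = welch_t(a, b)
          pooled = np.concatenate([a, b])
          hits = 0
          for _ in range(n_perm):
              rng.shuffle(pooled)
              hits += abs(welch_t(pooled[:len(a)], pooled[len(a):])) >= abs(t_obs)
          return (hits + 1) / (n_perm + 1)     # permutation p-value

      rng = np.random.default_rng(1)
      p = studentized_permutation_test(rng.poisson(3.0, 40).astype(float),
                                       rng.poisson(2.2, 40).astype(float))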

  7. Memory as Perception of the Past: Compressed Time in Mind and Brain.

    PubMed

    Howard, Marc W

    2018-02-01

    In the visual system retinal space is compressed such that acuity decreases further from the fovea. Different forms of memory may rely on a compressed representation of time, manifested as decreased accuracy for events that happened further in the past. Neurophysiologically, "time cells" show receptive fields in time. Analogous to the compression of visual space, time cells show less acuity for events further in the past. Behavioral evidence suggests memory can be accessed by scanning a compressed temporal representation, analogous to visual search. This suggests a common computational language for visual attention and memory retrieval. In this view, time functions like a scaffolding that organizes memories in much the same way that retinal space functions like a scaffolding for visual perception. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Around Marshall

    NASA Image and Video Library

    1995-07-08

    Marshall researchers, in the Astrionics lab, study rotating unbalanced mass devices. These require less power, and are lighter than current devices used for scanning images, a slice at a time. They have a wide range of space-based applications.

  9. Space-multiplexed optical scanner.

    PubMed

    Riza, Nabeel A; Yaqoob, Zahid

    2004-05-01

    A low-loss two-dimensional optical beam scanner that is capable of delivering large (e.g., > 10 degrees) angular scans along the elevation as well as the azimuthal direction is presented. The proposed scanner is based on a space-switched parallel-serial architecture that employs a coarse-scanner module and a fine-scanner module that produce an ultrahigh scan space-fill factor, e.g., 900 x 900 distinguishable beams in a 10 degrees (elevation) x 10 degrees (azimuth) scan space. The experimentally demonstrated one-dimensional version of the proposed scanner has a supercontinuous scan, 100 distinguishable beam spots in a 2.29 degrees total scan range, and 1.5-dB optical insertion loss.

  10. Focal Gray Matter Plasticity as a Function of Long Duration Bedrest: Preliminary Results

    NASA Technical Reports Server (NTRS)

    Koppelmans, V.; Erdeniz, B.; De Dios, Y. E.; Wood, S. J.; Reuter-Lorenz, P. A.; Kofman, I.; Bloomberg, J. J.; Mulavara, A. P.; Seidler, R. D.

    2014-01-01

    Long duration spaceflight (i.e., 22 days or longer) has been associated with changes in sensorimotor systems, resulting in difficulties that astronauts experience with posture control, locomotion, and manual control. It is unknown whether and how spaceflight impacts sensorimotor brain structure and function, and whether such changes may potentially underlie behavioral effects. Long duration head down tilt bed rest has been used repeatedly as an exclusionary analog to study microgravity effects on the sensorimotor system [1]. Bed rest mimics microgravity in body unloading and bodily fluid shifts. We are currently testing sensorimotor function, brain structure, and brain function pre and post a 70-day bed rest period. We will acquire the same measures on NASA crewmembers starting in 2014. Here we present the results of the first eight bed rest subjects. Subjects were assessed at 12 and 7 days before, at 7, 30, and 70 days into, and at 8 and 12 days after 70 days of bed rest at the NASA bed rest facility, UTMB, Galveston, TX, USA. At each time point structural MRI scans (i.e., high resolution T1-weighted imaging and Diffusion Tensor Imaging (DTI)) were obtained using a 3T Siemens scanner. Focal changes over time in gray matter density were assessed using the voxel based morphometry 8 (VBM8) toolbox under SPM. Focal changes in white matter microstructural integrity were assessed using tract based spatial statistics (TBSS) as part of the FMRIB software library (FSL). TBSS registers all DTI scans to standard space. It subsequently creates a study-specific white matter skeleton of the major white matter tracts. Non-parametric permutation-based t-tests and ANOVAs were used for voxel-wise comparison of the skeletons. For both VBM and TBSS, comparison of the two pre-bed rest measurements did not show significant differences. VBM analysis revealed decreased gray matter density in bilateral areas including the frontal medial cortex, the insular cortex and the caudate nucleus from pre- to in-bed rest. Over the same time period, there was an increase in gray matter density in the cerebellum, occipital, and parietal cortices. The majority of these changes had not recovered by the post-bed rest assessments. TBSS analyses will also be presented. Extended bed rest, which is an analog for microgravity, can result in gray matter changes and potentially in microstructural white matter changes in areas that are important for neuromotor behavior and cognition. These changes had not recovered at two weeks post bed rest. These results have significant public health implications, and will also aid in interpretation of our future data obtained pre and post spaceflight. Whether the effects of bed rest wear off at longer times post bed rest, and if they are associated with behavior, are important questions that warrant further research.

  11. Blackfolds, plane waves and minimal surfaces

    NASA Astrophysics Data System (ADS)

    Armas, Jay; Blau, Matthias

    2015-07-01

    Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.

  12. Clever imaging with SmartScan

    NASA Astrophysics Data System (ADS)

    Tchernykh, Valerij; Dyblenko, Sergej; Janschek, Klaus; Seifart, Klaus; Harnisch, Bernd

    2005-08-01

    The cameras commonly used for Earth observation from satellites require high attitude stability during the image acquisition. For some types of cameras (high-resolution "pushbroom" scanners in particular), instantaneous attitude changes of even less than one arcsecond result in significant image distortion and blurring. Especially problematic are the effects of high-frequency attitude variations originating from micro-shocks and vibrations produced by the momentum and reaction wheels, mechanically activated coolers, and steering and deployment mechanisms on board. The resulting high attitude-stability requirements for Earth-observation satellites are one of the main reasons for their complexity and high cost. The novel SmartScan imaging concept, based on an opto-electronic system with no moving parts, offers the promise of high-quality imaging with only moderate satellite attitude stability. SmartScan uses real-time recording of the actual image motion in the focal plane of the camera during frame acquisition to correct the distortions in the image. Exceptional real-time performances with subpixel-accuracy image-motion measurement are provided by an innovative high-speed onboard opto-electronic correlation processor. SmartScan will therefore allow pushbroom scanners to be used for hyper-spectral imaging from satellites and other space platforms not primarily intended for imaging missions, such as micro- and nano-satellites with simplified attitude control, low-orbiting communications satellites, and manned space stations.

  13. Real-time and encryption efficiency improvements of simultaneous fusion, compression and encryption method based on chaotic generators

    NASA Astrophysics Data System (ADS)

    Jridi, Maher; Alfalou, Ayman

    2018-03-01

    In this paper, enhancement of an existing optical simultaneous fusion, compression and encryption (SFCE) scheme in terms of real-time requirements, bandwidth occupation and encryption robustness is proposed. We have used an approximate form of the DCT to decrease the computational requirements. Then, a novel chaos-based encryption algorithm is introduced in order to achieve the confusion and diffusion effects. In the confusion phase, the Henon map is used for row and column permutations, where the initial condition is related to the original image. Furthermore, the Skew Tent map is employed to generate another random matrix in order to carry out pixel scrambling. Finally, an adaptation of a classical diffusion process scheme is employed to strengthen the security of the cryptosystem against statistical, differential, and chosen plaintext attacks. Analyses of key space, histogram, adjacent pixel correlation, sensitivity, and encryption speed of the encryption scheme are provided, and favorably compared to those of the existing crypto-compression system. The proposed method has been found to be digital/optical implementation-friendly, which facilitates the integration of the crypto-compression system in a very broad range of scenarios.
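
    A minimal sketch of the Henon-map-driven confusion step described above: the chaotic sequence is ranked to obtain row and column permutations, with the initial condition tied to the plain image. The map parameters, the key derivation and all names are illustrative assumptions rather than the paper's exact construction.

      # Hypothetical sketch of chaotic row/column permutation via the Henon map.
      import numpy as np

      def henon_sequence(n, x0, y0, a=1.4, b=0.3):
          xs = np.empty(n)
          x, y = x0, y0
          for i in range(n):
              x, y = 1.0 - a * x * x + y, b * x   # one Henon-map iteration
              xs[i] = x
          return xs

      def chaotic_permutation(n, x0, y0):
          return np.argsort(henon_sequence(n, x0, y0))   # ranks of the sequence define a permutation

      img = np.arange(64, dtype=np.uint8).reshape(8, 8)   # stand-in image block
      x0 = 0.1 + float(img.mean()) / 255.0 * 1e-3         # initial condition tied to the image (assumption)
      rows = chaotic_permutation(8, x0, 0.2)
      cols = chaotic_permutation(8, x0 + 1e-4, 0.2)
      scrambled = img[rows][:, cols]                      # row permutation followed by column permutation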

  14. Classification of Partial Discharge Signals by Combining Adaptive Local Iterative Filtering and Entropy Features

    PubMed Central

    Morison, Gordon; Boreham, Philip

    2018-01-01

    Electromagnetic Interference (EMI) is a technique for capturing Partial Discharge (PD) signals in High-Voltage (HV) power plant apparatus. EMI signals can be non-stationary, which makes their analysis difficult, particularly for pattern recognition applications. This paper elaborates upon a previously developed software condition-monitoring model for improved EMI event classification based on time-frequency signal decomposition and entropy features. The idea of the proposed method is to map multiple discharge source signals captured by EMI and labelled by experts, including PD, from the time domain to a feature space, which aids in the interpretation of subsequent fault information. Here, instead of using only one permutation entropy measure, a more robust measure, called Dispersion Entropy (DE), is added to the feature vector. Multi-Class Support Vector Machine (MCSVM) methods are utilized for classification of the different discharge sources. Results show an improved classification accuracy compared to previously proposed methods. This leads to the successful development of an expert-knowledge-based intelligent system. Since this method is demonstrated to be successful with real field data, it brings the benefit of possible real-world application for EMI condition monitoring. PMID:29385030

  15. Accelerated whole brain intracranial vessel wall imaging using black blood fast spin echo with compressed sensing (CS-SPACE).

    PubMed

    Zhu, Chengcheng; Tian, Bing; Chen, Luguang; Eisenmenger, Laura; Raithel, Esther; Forman, Christoph; Ahn, Sinyeob; Laub, Gerhard; Liu, Qi; Lu, Jianping; Liu, Jing; Hess, Christopher; Saloner, David

    2018-06-01

    Develop and optimize an accelerated, high-resolution (0.5 mm isotropic) 3D black blood MRI technique to reduce scan time for whole-brain intracranial vessel wall imaging. A 3D accelerated T1-weighted fast-spin-echo prototype sequence using compressed sensing (CS-SPACE) was developed at 3T. Both the acquisition [echo train length (ETL), under-sampling factor] and reconstruction parameters (regularization parameter, number of iterations) were first optimized in 5 healthy volunteers. Ten patients with a variety of intracranial vascular disease presentations (aneurysm, atherosclerosis, dissection, vasculitis) were imaged with SPACE and optimized CS-SPACE, pre and post Gd contrast. Lumen/wall area, wall-to-lumen contrast ratio (CR), enhancement ratio (ER), sharpness, and qualitative scores (1-4) by two radiologists were recorded. The optimized CS-SPACE protocol has ETL 60, 20% k-space under-sampling, 0.002 regularization factor with 20 iterations. In patient studies, CS-SPACE and conventional SPACE had comparable image scores both pre- (3.35 ± 0.85 vs. 3.54 ± 0.65, p = 0.13) and post-contrast (3.72 ± 0.58 vs. 3.53 ± 0.57, p = 0.15), but the CS-SPACE acquisition was 37% faster (6:48 vs. 10:50). CS-SPACE agreed with SPACE for lumen/wall area, ER measurements and sharpness, but marginally reduced the CR. In the evaluation of intracranial vascular disease, CS-SPACE provides a substantial reduction in scan time compared to conventional T1-weighted SPACE while maintaining good image quality.

  16. Quantum image encryption based on restricted geometric and color transformations

    NASA Astrophysics Data System (ADS)

    Song, Xian-Hua; Wang, Shen; Abd El-Latif, Ahmed A.; Niu, Xia-Mu

    2014-08-01

    A novel encryption scheme for quantum images based on restricted geometric and color transformations is proposed. The new strategy comprises efficient permutation and diffusion properties for quantum image encryption. The core idea of the permutation stage is to scramble the codes of the pixel positions through restricted geometric transformations. Then, a new quantum diffusion operation is implemented on the permutated quantum image based on restricted color transformations. The encryption keys of the two stages are generated by two sensitive chaotic maps, which ensure the security of the scheme. The final step, measurement, is based on the probabilistic model. Statistical analyses demonstrate significant improvements in favor of the proposed approach.

  17. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
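
    A minimal sketch of the permutation-testing idea for judging whether a fitted model performs better than chance: the outcome labels are repeatedly shuffled and the model refit to obtain a null distribution of the performance metric. The L1-penalized logistic regression, the AUC metric and the synthetic data are assumptions standing in for the NTCP modelling details; scikit-learn's permutation_test_score is used here only for brevity, is not the authors' implementation, and the double cross-validation step is not shown.

      # Hypothetical sketch of a label-permutation test of model performance.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import permutation_test_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 10))                               # stand-in dosimetric/clinical features
      y = (X[:, 0] + 0.5 * rng.normal(size=120) > 0).astype(int)   # stand-in complication outcome

      model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)   # LASSO-like classifier
      auc, null_aucs, p_value = permutation_test_score(
          model, X, y, scoring="roc_auc", cv=5, n_permutations=200, random_state=0)
      print(f"AUC = {auc:.2f}, permutation p-value = {p_value:.3f}")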

  18. Multiplexed phase-space imaging for 3D fluorescence microscopy.

    PubMed

    Liu, Hsiou-Yuan; Zhong, Jingshan; Waller, Laura

    2017-06-26

    Optical phase-space functions describe spatial and angular information simultaneously; examples of optical phase-space functions include light fields in ray optics and Wigner functions in wave optics. Measurement of phase-space enables digital refocusing, aberration removal and 3D reconstruction. High-resolution capture of 4D phase-space datasets is, however, challenging. Previous scanning approaches are slow, light inefficient and do not achieve diffraction-limited resolution. Here, we propose a multiplexed method that solves these problems. We use a spatial light modulator (SLM) in the pupil plane of a microscope in order to sequentially pattern multiplexed coded apertures while capturing images in real space. Then, we reconstruct the 3D fluorescence distribution of our sample by solving an inverse problem via regularized least squares with a proximal accelerated gradient descent solver. We experimentally reconstruct a 101 Megavoxel 3D volume (1010×510×500µm with NA 0.4), demonstrating improved acquisition time, light throughput and resolution compared to scanning aperture methods. Our flexible patterning scheme further allows sparsity in the sample to be exploited for reduced data capture.

  19. Augmenting the logrank test in the design of clinical trials in which non-proportional hazards of the treatment effect may be anticipated.

    PubMed

    Royston, Patrick; Parmar, Mahesh K B

    2016-02-11

    Most randomized controlled trials with a time-to-event outcome are designed assuming proportional hazards (PH) of the treatment effect. The sample size calculation is based on a logrank test. However, non-proportional hazards are increasingly common. At analysis, the estimated hazards ratio with a confidence interval is usually presented. The estimate is often obtained from a Cox PH model with treatment as a covariate. If non-proportional hazards are present, the logrank and equivalent Cox tests may lose power. To safeguard power, we previously suggested a 'joint test' combining the Cox test with a test of non-proportional hazards. Unfortunately, a larger sample size is needed to preserve power under PH. Here, we describe a novel test that unites the Cox test with a permutation test based on restricted mean survival time. We propose a combined hypothesis test based on a permutation test of the difference in restricted mean survival time across time. The test involves the minimum of the Cox and permutation test P-values. We approximate its null distribution and correct it for correlation between the two P-values. Using extensive simulations, we assess the type 1 error and power of the combined test under several scenarios and compare with other tests. We investigate powering a trial using the combined test. The type 1 error of the combined test is close to nominal. Power under proportional hazards is slightly lower than for the Cox test. Enhanced power is available when the treatment difference shows an 'early effect', an initial separation of survival curves which diminishes over time. The power is reduced under a 'late effect', when little or no difference in survival curves is seen for an initial period and then a late separation occurs. We propose a method of powering a trial using the combined test. The 'insurance premium' offered by the combined test to safeguard power under non-PH represents about a single-digit percentage increase in sample size. The combined test increases trial power under an early treatment effect and protects power under other scenarios. Use of restricted mean survival time facilitates testing and displaying a generalized treatment effect.
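
    The combined test pairs the Cox test with a permutation test based on restricted mean survival time (RMST). The sketch below illustrates only the RMST permutation ingredient at a single truncation time tau, with a simplified Kaplan-Meier helper; the actual proposal examines the RMST difference across time and combines its P-value with the Cox test, neither of which is shown. All names and the example data are assumptions.

      # Hypothetical sketch of a permutation test of the RMST difference between two arms.
      import numpy as np

      def km_rmst(time, event, tau):
          """Restricted mean survival time: area under the Kaplan-Meier curve up to tau."""
          order = np.argsort(time)
          t, d = np.asarray(time)[order], np.asarray(event)[order]
          surv, prev_t, area, at_risk = 1.0, 0.0, 0.0, len(t)
          for ti, di in zip(t, d):
              if ti > tau:
                  break
              area += surv * (ti - prev_t)      # survival is constant between event times
              if di:
                  surv *= 1.0 - 1.0 / at_risk   # Kaplan-Meier step at an event
              at_risk -= 1
              prev_t = ti
          return area + surv * (tau - prev_t)

      def rmst_permutation_test(time, event, group, tau, n_perm=2000, seed=0):
          rng = np.random.default_rng(seed)
          def rmst_diff(g):
              return (km_rmst(time[g == 1], event[g == 1], tau)
                      - km_rmst(time[g == 0], event[g == 0], tau))
          obs = rmst_diff(group)
          hits = sum(abs(rmst_diff(rng.permutation(group))) >= abs(obs) for _ in range(n_perm))
          return (hits + 1) / (n_perm + 1)

      rng = np.random.default_rng(1)
      time = rng.exponential(scale=12.0, size=100)   # stand-in survival times
      event = rng.integers(0, 2, size=100)           # 1 = event observed, 0 = censored
      group = np.repeat([0, 1], 50)                  # treatment arm indicator
      p = rmst_permutation_test(time, event, group, tau=10.0)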

  20. Diagnostic index of 3D osteoarthritic changes in TMJ condylar morphology

    NASA Astrophysics Data System (ADS)

    Gomes, Liliane R.; Gomes, Marcelo; Jung, Bryan; Paniagua, Beatriz; Ruellas, Antonio C.; Gonçalves, João. Roberto; Styner, Martin A.; Wolford, Larry; Cevidanes, Lucia

    2015-03-01

    The aim of this study was to investigate imaging statistical approaches for classifying 3D osteoarthritic morphological variations among 169 Temporomandibular Joint (TMJ) condyles. Cone beam Computed Tomography (CBCT) scans were acquired from 69 patients with long-term TMJ Osteoarthritis (OA) (39.1 ± 15.7 years), 15 patients at initial diagnosis of OA (44.9 ± 14.8 years) and 7 healthy controls (43 ± 12.4 years). 3D surface models of the condyles were constructed and Shape Correspondence was used to establish correspondent points on each model. The statistical framework included a multivariate analysis of covariance (MANCOVA) and Direction-Projection-Permutation (DiProPerm) for testing statistical significance of the differences between the healthy control and OA groups determined by clinical and radiographic diagnoses. Unsupervised classification using hierarchical agglomerative clustering (HAC) was then conducted. Condylar morphology in OA and healthy subjects varied widely. Compared with healthy controls, the OA average condyle was statistically significantly smaller in all dimensions except its anterior surface. Significant flattening of the lateral pole was noticed at initial diagnosis (p < 0.05). Areas of 3.88 mm of bone resorption were observed at the superior surface, and of 3.10 mm of bone apposition at the anterior aspect, of the long-term OA average model. DiProPerm statistics with 1,000 permutations supported a significant difference between the healthy control group and the OA group (t = 6.7, empirical p-value = 0.001). Clinically meaningful unsupervised classification of TMJ condylar morphology determined a preliminary diagnostic index of 3D osteoarthritic changes, which may be the first step towards a more targeted diagnosis of this condition.

  1. Nicotine deprivation elevates neural representation of smoking-related cues in object-sensitive visual cortex: a proof of concept study.

    PubMed

    Havermans, Anne; van Schayck, Onno C P; Vuurman, Eric F P M; Riedel, Wim J; van den Hurk, Job

    2017-08-01

    In the current study, we use functional magnetic resonance imaging (fMRI) and multi-voxel pattern analysis (MVPA) to investigate whether tobacco addiction biases basic visual processing in favour of smoking-related images. We hypothesize that the neural representation of smoking-related stimuli in the lateral occipital complex (LOC) is elevated after a period of nicotine deprivation compared to a satiated state, but that this is not the case for object categories unrelated to smoking. Current smokers (≥10 cigarettes a day) underwent two fMRI scanning sessions: one after 10 h of nicotine abstinence and the other one after smoking ad libitum. Regional blood oxygenated level-dependent (BOLD) response was measured while participants were presented with 24 blocks of 8 colour-matched pictures of cigarettes, pencils or chairs. The functional data of 10 participants were analysed through a pattern classification approach. In bilateral LOC clusters, the classifier was able to discriminate between patterns of activity elicited by visually similar smoking-related (cigarettes) and neutral objects (pencils) above empirically estimated chance levels only during deprivation (mean = 61.0%, chance (permutations) = 50.0%, p = .01) but not during satiation (mean = 53.5%, chance (permutations) = 49.9%, ns.). For all other stimulus contrasts, there was no difference in discriminability between the deprived and satiated conditions. The discriminability between smoking and non-smoking visual objects was elevated in object-selective brain region LOC after a period of nicotine abstinence. This indicates that attention bias likely affects basic visual object processing.

  2. Permutation testing of orthogonal factorial effects in a language-processing experiment using fMRI.

    PubMed

    Suckling, John; Davis, Matthew H; Ooi, Cinly; Wink, Alle Meije; Fadili, Jalal; Salvador, Raymond; Welchew, David; Sendur, Levent; Maxim, Vochita; Bullmore, Edward T

    2006-05-01

    The block-paradigm of the Functional Image Analysis Contest (FIAC) dataset was analysed with the Brain Activation and Morphological Mapping software. Permutation methods in the wavelet domain were used for inference on cluster-based test statistics of orthogonal contrasts relevant to the factorial design of the study, namely: the average response across all active blocks, the main effect of speaker, the main effect of sentence, and the interaction between sentence and speaker. Extensive activation was seen with all these contrasts. In particular, different vs. same-speaker blocks produced elevated activation in bilateral regions of the superior temporal lobe and repetition suppression for linguistic materials (same vs. different-sentence blocks) in left inferior frontal regions. These are regions previously reported in the literature. Additional regions were detected in this study, perhaps due to the enhanced sensitivity of the methodology. Within-block sentence suppression was tested post-hoc by regression of an exponential decay model onto the extracted time series from the left inferior frontal gyrus, but no strong evidence of such an effect was found. The significance levels set for the activation maps are P-values at which we expect <1 false-positive cluster per image. Nominal type I error control was verified by empirical testing of a test statistic corresponding to a randomly ordered design matrix. The small size of the BOLD effect necessitates sensitive methods of detection of brain activation. Permutation methods permit the necessary flexibility to develop novel test statistics to meet this challenge.

  3. Automatic correction of echo-planar imaging (EPI) ghosting artifacts in real-time interactive cardiac MRI using sensitivity encoding.

    PubMed

    Kim, Yoon-Chul; Nielsen, Jon-Fredrik; Nayak, Krishna S

    2008-01-01

    To develop a method that automatically corrects ghosting artifacts due to echo-misalignment in interleaved gradient-echo echo-planar imaging (EPI) in arbitrary oblique or double-oblique scan planes. An automatic ghosting correction technique was developed based on an alternating EPI acquisition and the phased-array ghost elimination (PAGE) reconstruction method. The direction of k-space traversal is alternated at every temporal frame, enabling lower temporal-resolution ghost-free coil sensitivity maps to be dynamically estimated. The proposed method was compared with conventional one-dimensional (1D) phase correction in axial, oblique, and double-oblique scan planes in phantom and cardiac in vivo studies. The proposed method was also used in conjunction with two-fold acceleration. The proposed method with nonaccelerated acquisition provided excellent suppression of ghosting artifacts in all scan planes, and was substantially more effective than conventional 1D phase correction in oblique and double-oblique scan planes. The feasibility of real-time reconstruction using the proposed technique was demonstrated in a scan protocol with 3.1-mm spatial and 60-msec temporal resolution. The proposed technique with nonaccelerated acquisition provides excellent ghost suppression in arbitrary scan orientations without a calibration scan, and can be useful for real-time interactive imaging, in which scan planes are frequently changed with arbitrary oblique orientations.

  4. Point form relativistic quantum mechanics and relativistic SU(6)

    NASA Technical Reports Server (NTRS)

    Klink, W. H.

    1993-01-01

    The point form is used as a framework for formulating a relativistic quantum mechanics, with the mass operator carrying the interactions of underlying constituents. A symplectic Lie algebra of mass operators is introduced from which a relativistic harmonic oscillator mass operator is formed. Mass splittings within the degenerate harmonic oscillator levels arise from relativistically invariant spin-spin, spin-orbit, and tensor mass operators. Internal flavor (and color) symmetries are introduced which make it possible to formulate a relativistic SU(6) model of baryons (and mesons). Careful attention is paid to the permutation symmetry properties of the hadronic wave functions, which are written as polynomials in Bargmann spaces.

  5. Speech Privacy Problems

    DTIC Science & Technology

    1945-08-18

    were interconnected, however, it was found that one of the oscillators had an intermittent defect. This trouble was cleared by removing the...switches, i.e., two pairs are included in the unit; one of a pair of selectors (the "fast selector") steps each time the latch operates, the other (the..."slow selector") steps once each time the fast selector completes 25 steps. Thus, a total of 625 steps, or changes in permutation, is involved be

  6. Extending Differential Fault Analysis to Dynamic S-Box Advanced Encryption Standard Implementations

    DTIC Science & Technology

    2014-09-18

    entropy. At the same time, researchers strive to enhance AES and mitigate these growing threats. This paper researches the extension of existing...the algorithm or use side channels to reduce entropy, such as Differential Fault Analysis (DFA). At the same time, continuing research strives to...the state matrix. The S-box is an 8-bit 16x16 table built from an affine transformation on multiplicative inverses which guarantees full permutation (S

  7. CUSUM method for construction of trainee spinal ultrasound learning curves following standardised teaching.

    PubMed

    Deacon, A J; Melhuishi, N S; Terblanche, N C S

    2014-07-01

    Spinal ultrasonography is a promising aid for epidural insertion. We aimed to determine the learning curve of spinal ultrasonography tasks and the number of training scans required to reach competency after undergoing standardised step-wise teaching. Trainees were required to complete a minimum of 60 assessed scans on selected non-pregnant models following attendance at two training sessions, with feedback from an expert after each scan. Learning curves were plotted using the non-risk cumulative summation technique and an acceptable failure rate of 20%. Five trainees completed between 65 and 75 scans each. All trainees were competent at identifying a randomly assigned intervertebral space after a median of five scans (range one to nine) and at measuring the depth from skin to the posterior complex after a median of 10 scans (range 1 to 42). Two trainees were competent at marking an ideal needle insertion point after 55 scans, while three trainees did not attain competency. All trainees were competent after 60 scans if the tolerance was changed from five to eight millimetres for marking the needle insertion point. The average time taken to complete a scan was 163 seconds. Our study showed that after a standardised educational intervention, anaesthetic trainees are able to identify a lumbar interlaminar space easily and can measure the depth to the posterior complex after a reasonable number of additional practice scans, but experienced difficulty accurately marking the needle insertion point whilst using spinal ultrasonography. We confirmed that it was hard to achieve competency in all aspects of spinal ultrasonography, based on assessment using our predefined competency criteria.

  8. Technology. Part 2

    NASA Technical Reports Server (NTRS)

    1997-01-01

    In this session, Session WP3, the discussion focuses on the following topics: Monitoring Physiological Variables With Membrane Probes; Real Time Confocal Laser Scanning Microscopy, Potential Applications in Space Medicine and Cell Biology; Optimum Versus Universal Planetary and Interplanetary Habitats; Application of Remote Sensing and Geographic Information System Technologies to the Prevention of Diarrheal Diseases in Nigeria; A Small G Loading Human Centrifuge for Space Station ERA; Use of the Bicycle Ergometer on the International Space Station and Its Influence On The Microgravity Environment; Munich Space Chair (MSC) - A Next Generation Body Restraint System for Astronauts; and Thermoelectric Human-Body Cooling Units Used By NASA Space Shuttle Astronauts.

  9. NASA thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Access Vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries, and pseudo-multiword terms that are permutations of words that contain words within words. The Access Vocabulary contains 40,738 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.

  10. NASA Thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Access Vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries, and pseudo-multiword terms that are permutations of words that contain words within words. The Access Vocabulary contains 40,661 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.

  11. Instability of Hierarchical Cluster Analysis Due to Input Order of the Data: The PermuCLUSTER Solution

    ERIC Educational Resources Information Center

    van der Kloot, Willem A.; Spaans, Alexander M. J.; Heiser, Willem J.

    2005-01-01

    Hierarchical agglomerative cluster analysis (HACA) may yield different solutions under permutations of the input order of the data. This instability is caused by ties, either in the initial proximity matrix or arising during agglomeration. The authors recommend to repeat the analysis on a large number of random permutations of the rows and columns…

  12. Optimal control of hybrid qubits: Implementing the quantum permutation algorithm

    NASA Astrophysics Data System (ADS)

    Rivera-Ruiz, C. M.; de Lima, E. F.; Fanchini, F. F.; Lopez-Richard, V.; Castelano, L. K.

    2018-03-01

    Optimal quantum control theory is employed to determine electric pulses capable of producing quantum gates with a fidelity higher than 0.9997 when noise is not taken into account. In particular, these quantum gates were chosen to perform the permutation algorithm in hybrid qubits in double quantum dots (DQDs). The permutation algorithm is an oracle-based quantum algorithm that solves the problem of permutation parity faster than a classical algorithm, without the need for entanglement between particles. The only requirement for achieving the speedup is the use of a one-particle quantum system with at least three levels. The high fidelity found in our results is closely related to the quantum speed limit, which is a measure of how fast a quantum state can be manipulated. Furthermore, we model charge noise by considering an average over the optimal field centered at different values of the reference detuning, which follows a Gaussian distribution. When the Gaussian spread is of the order of 5 μeV (10% of the correct value), the fidelity is still higher than 0.95. Our scheme can also be used for the practical realization of different quantum algorithms in DQDs.

  13. PsiQuaSP-A library for efficient computation of symmetric open quantum systems.

    PubMed

    Gegg, Michael; Richter, Marten

    2017-11-24

    In a recent publication we showed that permutation symmetry reduces the numerical complexity of Lindblad quantum master equations for identical multi-level systems from exponential to polynomial scaling. This is important for open system dynamics including realistic system-bath interactions and dephasing in, for instance, the Dicke model, multi-Λ system setups, etc. Here we present an object-oriented C++ library that allows one to set up and solve arbitrary quantum optical Lindblad master equations, especially those that are permutationally symmetric in the multi-level systems. PsiQuaSP (Permutation symmetry for identical Quantum Systems Package) uses the PETSc package for sparse linear algebra methods and differential equations as a basis. The aim of PsiQuaSP is to provide flexible, storage-efficient and scalable code while being as user friendly as possible. It is easily applied to many quantum optical or quantum information systems with more than one multi-level system. We first review the basics of the permutation symmetry for multi-level systems in quantum master equations. The application of PsiQuaSP to quantum dynamical problems is illustrated with several typical, simple examples of open quantum optical systems.

  14. Efficient and Robust Signal Approximations

    DTIC Science & Technology

    2009-05-01

    otherwise. Remark. Permutation matrices are both orthogonal and doubly-stochastic [62]. We will now show how to further simplify the Robust Coding... Keywords: signal processing, image compression, independent component analysis, sparse

  15. New method for scanning spacecraft and balloon-borne/space-based experiments

    NASA Technical Reports Server (NTRS)

    Polites, Michael E.

    1991-01-01

    A new method is presented for scanning balloon-borne experiments, free-flying spacecraft, and gimballed experiments mounted to the space shuttle or the space station. It uses rotating-unbalanced-mass (RUM) devices for generating circular, line, or raster scan patterns and an auxiliary control system for target acquisition, keeping the scan centered on the target, and producing complementary motion for raster scanning. It is ideal for applications where the only possible way to accomplish the required scan is to physically scan the entire experiment or spacecraft as in X-ray and gamma ray experiments. In such cases, this new method should have advantages over prior methods in terms of either power, weight, cost, performance, stability, or a combination of these.

  16. A vector scanning processing technique for pulsed laser velocimetry

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Edwards, Robert V.

    1989-01-01

    Pulsed laser sheet velocimetry yields nonintrusive measurements of two-dimensional velocity vectors across an extended planar region of a flow. Current processing techniques offer high precision (1 pct) velocity estimates, but can require several hours of processing time on specialized array processors. Under some circumstances, a simple, fast, less accurate (approx. 5 pct), data reduction technique which also gives unambiguous velocity vector information is acceptable. A direct space domain processing technique was examined. The direct space domain processing technique was found to be far superior to any other techniques known, in achieving the objectives listed above. It employs a new data coding and reduction technique, where the particle time history information is used directly. Further, it has no 180 deg directional ambiguity. A complex convection vortex flow was recorded and completely processed in under 2 minutes on an 80386 based PC, producing a 2-D velocity vector map of the flow field. Hence, using this new space domain vector scanning (VS) technique, pulsed laser velocimetry data can be reduced quickly and reasonably accurately, without specialized array processing hardware.

  17. Controllability of symmetric spin networks

    NASA Astrophysics Data System (ADS)

    Albertini, Francesca; D'Alessandro, Domenico

    2018-05-01

    We consider a network of n spin-1/2 systems that interact pairwise via the Ising interaction and are controlled by the same electromagnetic control field. Such a system presents symmetries, since the Hamiltonian is unchanged if we permute two spins. This prevents full (operator) controllability, in that not every unitary evolution can be obtained. We prove, however, that controllability is verified if we restrict ourselves to unitary evolutions which preserve the above permutation invariance. For the low-dimensional cases n = 2 and n = 3, we provide an analysis of the Lie group of available evolutions and give explicit control laws to transfer between two arbitrary permutation-invariant states. This class of states includes highly entangled states such as Greenberger-Horne-Zeilinger (GHZ) states and W states, which are of interest in quantum information.

  18. A permutation information theory tour through different interest rate maturities: the Libor case.

    PubMed

    Bariviera, Aurelio Fernández; Guercio, María Belén; Martinez, Lisana B; Rosso, Osvaldo A

    2015-12-13

    This paper analyses Libor interest rates for seven different maturities and referred to operations in British pounds, euros, Swiss francs and Japanese yen, during the period 2001-2015. The analysis is performed by means of two quantifiers derived from information theory: the permutation Shannon entropy and the permutation Fisher information measure. An anomalous behaviour in the Libor is detected in all currencies except euros during the years 2006-2012. The stochastic switch is more severe in one, two and three months maturities. Given the special mechanism of Libor setting, we conjecture that the behaviour could have been produced by the manipulation that was uncovered by financial authorities. We argue that our methodology is pertinent as a market overseeing instrument. © 2015 The Author(s).

  19. Storage and computationally efficient permutations of factorized covariance and square-root information matrices

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector-stored upper-triangular diagonal factorized covariance (UD) and vector stored upper-triangular square-root information filter (SRIF) arrays is presented. The method involves cyclical permutation of the rows and columns of the arrays and retriangularization with appropriate square-root-free fast Givens rotations or elementary slow Givens reflections. A minimal amount of computation is performed and only one scratch vector of size N is required, where N is the column dimension of the arrays. To make the method efficient for large SRIF arrays on a virtual memory machine, three additional scratch vectors each of size N are used to avoid expensive paging faults. The method discussed is compared with the methods and routines of Bierman's Estimation Subroutine Library (ESL).

  20. Can Emotional and Behavioral Dysregulation in Youth Be Decoded from Functional Neuroimaging?

    PubMed

    Portugal, Liana C L; Rosa, Maria João; Rao, Anil; Bebko, Genna; Bertocci, Michele A; Hinze, Amanda K; Bonar, Lisa; Almeida, Jorge R C; Perlman, Susan B; Versace, Amelia; Schirda, Claudiu; Travis, Michael; Gill, Mary Kay; Demeter, Christine; Diwadkar, Vaibhav A; Ciuffetelli, Gary; Rodriguez, Eric; Forbes, Erika E; Sunshine, Jeffrey L; Holland, Scott K; Kowatch, Robert A; Birmaher, Boris; Axelson, David; Horwitz, Sarah M; Arnold, Eugene L; Fristad, Mary A; Youngstrom, Eric A; Findling, Robert L; Pereira, Mirtes; Oliveira, Leticia; Phillips, Mary L; Mourao-Miranda, Janaina

    2016-01-01

    High comorbidity among pediatric disorders characterized by behavioral and emotional dysregulation poses problems for diagnosis and treatment, and suggests that these disorders may be better conceptualized as dimensions of abnormal behaviors. Furthermore, identifying neuroimaging biomarkers related to dimensional measures of behavior may provide targets to guide individualized treatment. We aimed to use functional neuroimaging and pattern regression techniques to determine whether patterns of brain activity could accurately decode individual-level severity on a dimensional scale measuring behavioural and emotional dysregulation at two different time points. A sample of fifty-seven youth (mean age: 14.5 years; 32 males) was selected from a multi-site study of youth with parent-reported behavioral and emotional dysregulation. Participants performed a block-design reward paradigm during functional Magnetic Resonance Imaging (fMRI). Pattern regression analyses consisted of Relevance Vector Regression (RVR) and two cross-validation strategies implemented in the Pattern Recognition for Neuroimaging toolbox (PRoNTo). Medication was treated as a binary confounding variable. Decoded and actual clinical scores were compared using Pearson's correlation coefficient (r) and mean squared error (MSE) to evaluate the models. Permutation test was applied to estimate significance levels. Relevance Vector Regression identified patterns of neural activity associated with symptoms of behavioral and emotional dysregulation at the initial study screen and close to the fMRI scanning session. The correlation and the mean squared error between actual and decoded symptoms were significant at the initial study screen and close to the fMRI scanning session. However, after controlling for potential medication effects, results remained significant only for decoding symptoms at the initial study screen. Neural regions with the highest contribution to the pattern regression model included cerebellum, sensory-motor and fronto-limbic areas. The combination of pattern regression models and neuroimaging can help to determine the severity of behavioral and emotional dysregulation in youth at different time points.

  1. q-Space Upsampling Using x-q Space Regularization.

    PubMed

    Chen, Geng; Dong, Bin; Zhang, Yong; Shen, Dinggang; Yap, Pew-Thian

    2017-09-01

    Acquisition time in diffusion MRI increases with the number of diffusion-weighted images that need to be acquired. Particularly in clinical settings, scan time is limited and only a sparse coverage of the vast q -space is possible. In this paper, we show how non-local self-similar information in the x - q space of diffusion MRI data can be harnessed for q -space upsampling. More specifically, we establish the relationships between signal measurements in x - q space using a patch matching mechanism that caters to unstructured data. We then encode these relationships in a graph and use it to regularize an inverse problem associated with recovering a high q -space resolution dataset from its low-resolution counterpart. Experimental results indicate that the high-resolution datasets reconstructed using the proposed method exhibit greater quality, both quantitatively and qualitatively, than those obtained using conventional methods, such as interpolation using spherical radial basis functions (SRBFs).

  2. Short-term capture of the Earth-Moon system

    NASA Astrophysics Data System (ADS)

    Qi, Yi; de Ruiter, Anton

    2018-06-01

    In this paper, the short-term capture (STC) of an asteroid in the Earth-Moon system is proposed and investigated. First, the space condition of STC is analysed and five subsets of the feasible region are defined and discussed. Then, the time condition of STC is studied by parameter scanning in the Sun-Earth-Moon-asteroid restricted four-body problem. Numerical results indicate that there is a clear association between the distributions of the time probability of STC and the five subsets. Next, the influence of the Jacobi constant on STC is examined using the space and time probabilities of STC. Combining the space and time probabilities of STC, we propose a STC index to evaluate the probability of STC comprehensively. Finally, three potential STC asteroids are found and analysed.

  3. Random domain name and address mutation (RDAM) for thwarting reconnaissance attacks

    PubMed Central

    Chen, Xi; Zhu, Yuefei

    2017-01-01

    Network address shuffling is a novel moving target defense (MTD) that invalidates the address information collected by the attacker by dynamically changing or remapping the host’s network addresses. However, most network address shuffling methods are constrained by the limited address space and rely on the host’s static domain name to map to its dynamic address; they therefore cannot effectively defend against random scanning attacks, nor against an attacker who already knows the target’s domain name. In this paper, we propose a network defense method based on random domain name and address mutation (RDAM), which increases the scanning space of the attacker through a dynamic domain name method and reduces the probability that a host will be hit by an attacker scanning IP addresses using the domain name system (DNS) query list and the time window methods. Theoretical analysis and experimental results show that RDAM can defend against scanning attacks and worm propagation more effectively than general network address shuffling methods, while introducing an acceptable operational overhead. PMID:28489910

  4. Near-Space TOPSAR Large-Scene Full-Aperture Imaging Scheme Based on Two-Step Processing

    PubMed Central

    Zhang, Qianghui; Wu, Junjie; Li, Wenchao; Huang, Yulin; Yang, Jianyu; Yang, Haiguang

    2016-01-01

    Free of the constraints of orbit mechanisms, weather conditions and minimum antenna area, synthetic aperture radar (SAR) equipped on near-space platform is more suitable for sustained large-scene imaging compared with the spaceborne and airborne counterparts. Terrain observation by progressive scans (TOPS), which is a novel wide-swath imaging mode and allows the beam of SAR to scan along the azimuth, can reduce the time of echo acquisition for large scene. Thus, near-space TOPS-mode SAR (NS-TOPSAR) provides a new opportunity for sustained large-scene imaging. An efficient full-aperture imaging scheme for NS-TOPSAR is proposed in this paper. In this scheme, firstly, two-step processing (TSP) is adopted to eliminate the Doppler aliasing of the echo. Then, the data is focused in two-dimensional frequency domain (FD) based on Stolt interpolation. Finally, a modified TSP (MTSP) is performed to remove the azimuth aliasing. Simulations are presented to demonstrate the validity of the proposed imaging scheme for near-space large-scene imaging application. PMID:27472341

  5. Tracker implementation for the orbiter Ku-band communications antenna

    NASA Technical Reports Server (NTRS)

    Rudnicki, J. F.; Lindsey, J. F.

    1976-01-01

    Possible implementations and recommendations for the Space Shuttle Ku-Band integrated communications/radar antenna tracking system were evaluated. Communication aspects involving the Tracking Data Relay Satellite (TDRS)/Orbiter Ku-Band link are emphasized. Detailed analysis of antenna sizes, gains and signal-to-noise ratios shows the desirability of using maximum size 36-inch diameter dish and a triple channel monopulse. The use of the original baselined 20 inch dish is found to result in excessive acquisition time since the despread signal would be used in the tracking loop. An evaluation of scan procedures which includes vehicle dynamics, designation error, time for acquisition and probability of acquisition shows that the conical scan is preferred since the time for lock-on for relatively slow look angle rates will be significantly shorter than the raster scan. Significant improvement in spherical coverage may be obtained by reorienting the antenna gimbal to obtain maximum blockage overlap.

  6. Complete measurement of spatiotemporally complex multi-spatial-mode ultrashort pulses from multimode optical fibers using delay-scanned wavelength-multiplexed holography.

    PubMed

    Zhu, Ping; Jafari, Rana; Jones, Travis; Trebino, Rick

    2017-10-02

    We introduce a simple delay-scanned complete spatiotemporal intensity-and-phase measurement technique based on wavelength-multiplexed holography to characterize long, complex pulses in space and time. We demonstrate it using pulses emerging from multi-mode fiber. This technique extends the temporal range and spectral resolution of the single-frame STRIPED FISH technique without using an otherwise-required expensive ultranarrow-bandpass filter. With this technique, we measured the complete intensity and phase of up to ten fiber modes from a multi-mode fiber (normalized frequency V ≈ 10) over a ~3 ps time range. Spatiotemporal complexities such as intermodal delay, modal dispersion, and material dispersion were also intuitively displayed by the retrieved results. Agreement between the reconstructed color movies and the monitored time-averaged spatial profiles confirms the validity of this delay-scanned STRIPED FISH method.

  7. Effects of an Approach Spacing Flight Deck Tool on Pilot Eyescan

    NASA Technical Reports Server (NTRS)

    Oseguera-Lohr, Rosa M.; Nadler, Eric D.

    2004-01-01

    An airborne tool has been developed based on the concept of an aircraft maintaining a time-based spacing interval from the preceding aircraft. The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcast (ADS-B) aircraft state data to compute a speed command for the ATAAS-equipped aircraft to obtain a required time interval behind another aircraft. The tool and candidate operational procedures were tested in a high-fidelity, full mission simulator with active airline subject pilots flying an arrival scenario using three different modes for speed control. Eyetracker data showed only slight changes in instrument scan patterns, and no significant change in the amount of time spent looking out the window with ATAAS, versus standard ILS procedures.

  8. NASA thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The access vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries and pseudo-multiword terms that are permutations of words that contain words within words. The access vocabulary contains almost 42,000 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.
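
    The idea of a permuted-index access vocabulary can be sketched with a toy keyword index (an illustration only, not the NASA Thesaurus production process): every word of every authorized term becomes an access point that leads back to the full term.

    ```python
    from collections import defaultdict

    def permuted_index(terms):
        """Build a simple permuted index: each word of each term becomes an
        entry pointing back to the full term.  Toy sketch only."""
        index = defaultdict(set)
        for term in terms:
            for word in term.upper().split():
                index[word].add(term)
        return {word: sorted(entries) for word, entries in sorted(index.items())}

    # hypothetical terms standing in for thesaurus entries
    terms = ["SPACE SHUTTLE MAIN ENGINE", "MAIN SEQUENCE STARS", "ENGINE TESTS"]
    for word, entries in permuted_index(terms).items():
        print(f"{word:10s} -> {entries}")
    ```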

  9. Genomic Analysis of Complex Microbial Communities in Wounds

    DTIC Science & Technology

    2012-01-01

    thoroughly in the ecology literature. Permutation Multivariate Analysis of Variance (PerMANOVA). We used PerMANOVA to test the null-hypothesis of no...difference between the bacterial communities found within a single wound compared to those from different patients (α = 0.05). PerMANOVA is a...permutation-based version of the multivariate analysis of variance (MANOVA). PerMANOVA uses the distances between samples to partition variance and
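
    A bare-bones version of the PerMANOVA idea in this snippet (one-way, balanced design; not the report's code) computes a pseudo-F statistic from within-group and among-group sums of squared distances and compares it against group-label permutations.

    ```python
    import numpy as np

    def permanova(D, groups, n_perm=999, seed=0):
        """One-way PerMANOVA on a square distance matrix D; returns the
        pseudo-F statistic and a permutation p-value.  Minimal sketch."""
        rng = np.random.default_rng(seed)
        groups = np.asarray(groups)
        n, a = len(groups), len(np.unique(groups))

        def pseudo_f(g):
            ss_total = np.sum(np.square(D[np.triu_indices(n, 1)])) / n
            ss_within = 0.0
            for level in np.unique(g):
                idx = np.where(g == level)[0]
                sub = D[np.ix_(idx, idx)]
                ss_within += np.sum(np.square(sub[np.triu_indices(len(idx), 1)])) / len(idx)
            return ((ss_total - ss_within) / (a - 1)) / (ss_within / (n - a))

        f_obs = pseudo_f(groups)
        exceed = sum(pseudo_f(rng.permutation(groups)) >= f_obs for _ in range(n_perm))
        return f_obs, (exceed + 1) / (n_perm + 1)

    # toy data: Euclidean distances among samples from two hypothetical groups
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 1, (10, 5)), rng.normal(1, 1, (10, 5))])
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    print(permanova(D, [0] * 10 + [1] * 10))
    ```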

  10. Circular permutation of the starch-binding domain: inversion of ligand selectivity with increased affinity.

    PubMed

    Stephen, Preyesh; Tseng, Kai-Li; Liu, Yu-Nan; Lyu, Ping-Chiang

    2012-03-07

    Proteins containing starch-binding domains (SBDs) are used in a variety of scientific and technological applications. A circularly permutated SBD (CP90) with improved affinity and selectivity toward longer-chain carbohydrates was synthesized, suggesting that a new starch-binding protein may be developed for specific scientific and industrial applications. This journal is © The Royal Society of Chemistry 2012

  11. The Effect on Moderate Altitude UPON Human Gastric Emptying Time

    DTIC Science & Technology

    1952-03-01

    physiological aliment. The emptying time, therefore, of a mixture of barium and food may perhaps differ somewhat from that of food alone. Determination of the...permutations of the four runs were tried. Table III, then, lends credence to the view that it was the subject’s apprehension at being a "human guinea ... pig" that was responsible for the prolongation of the initial runs and for some of the deviation between duplicates. The experience of Van Liere and

  12. Application of Mathematical Signal Processing Techniques to Mission Systems. (l’Application des techniques mathematiques du traitement du signal aux systemes de conduite des missions)

    DTIC Science & Technology

    1999-11-01

    represents the linear time invariant (LTI) response of the combined analysis/synthesis system while the second represents the aliasing introduced into...effectively to implement voice scrambling systems based on time-frequency permutation. The most general form of such a system is shown in Fig. 22 where...92201 NEUILLY-SUR-SEINE CEDEX, FRANCE RTO LECTURE SERIES 216 Application of Mathematical Signal Processing Techniques to Mission Systems (1

  13. Lichens survive in space: results from the 2005 LICHENS experiment.

    PubMed

    Sancho, Leopoldo G; de la Torre, Rosa; Horneck, Gerda; Ascaso, Carmen; de Los Rios, Asunción; Pintado, Ana; Wierzchos, J; Schuster, M

    2007-06-01

    This experiment was aimed at establishing, for the first time, the survival capability of lichens exposed to space conditions. In particular, the damaging effect of various wavelengths of extraterrestrial solar UV radiation was studied. The lichens used were the bipolar species Rhizocarpon geographicum and Xanthoria elegans, which were collected above 2000 m in the mountains of central Spain and as endolithic communities inhabiting granites in the Antarctic Dry Valleys. Lichens were exposed to space in the BIOPAN-5 facility of the European Space Agency; BIOPAN-5 is located on the outer shell of the Earth-orbiting FOTON-M2 Russian satellite. The lichen samples were launched from Baikonur by a Soyuz rocket on May 31, 2005, and were returned to Earth after 16 days in space, at which time they were tested for survival. Chlorophyll fluorescence was used for the measurement of photosynthetic parameters. Scanning electron microscopy in back-scattered mode, low temperature scanning electron microscopy, and transmission electron microscopy were used to study the organization and composition of both symbionts. Confocal laser scanning microscopy, in combination with the use of specific fluorescent probes, allowed for the assessment of the physiological state of the cells. All exposed lichens, regardless of the optical filters used, showed nearly the same photosynthetic activity after the flight as measured before the flight. Likewise, the multimicroscopy approach revealed no detectable ultrastructural changes in most of the algal and fungal cells of the lichen thalli, though a greater proportion of cells in the flight samples had compromised membranes, as revealed by the LIVE/DEAD BacLight Bacterial Viability Kit. These findings indicate that most lichenized fungal and algal cells can survive in space after full exposure to massive UV and cosmic radiation, conditions proven to be lethal to bacteria and other microorganisms. The lichen upper cortex seems to provide adequate protection against solar radiation. Moreover, after extreme dehydration induced by high vacuum, the lichens proved to be able to recover, in full, their metabolic activity within 24 hours.

  14. Lichens Survive in Space: Results from the 2005 LICHENS Experiment

    NASA Astrophysics Data System (ADS)

    Sancho, Leopoldo G.; de la Torre, Rosa; Horneck, Gerda; Ascaso, Carmen; de los Rios, Asunción; Pintado, Ana; Wierzchos, J.; Schuster, M.

    2007-06-01

    This experiment was aimed at establishing, for the first time, the survival capability of lichens exposed to space conditions. In particular, the damaging effect of various wavelengths of extraterrestrial solar UV radiation was studied. The lichens used were the bipolar species Rhizocarpon geographicum and Xanthoria elegans, which were collected above 2000 m in the mountains of central Spain and as endolithic communities inhabiting granites in the Antarctic Dry Valleys. Lichens were exposed to space in the BIOPAN-5 facility of the European Space Agency; BIOPAN-5 is located on the outer shell of the Earth-orbiting FOTON-M2 Russian satellite. The lichen samples were launched from Baikonur by a Soyuz rocket on May 31, 2005, and were returned to Earth after 16 days in space, at which time they were tested for survival. Chlorophyll fluorescence was used for the measurement of photosynthetic parameters. Scanning electron microscopy in back-scattered mode, low temperature scanning electron microscopy, and transmission electron microscopy were used to study the organization and composition of both symbionts. Confocal laser scanning microscopy, in combination with the use of specific fluorescent probes, allowed for the assessment of the physiological state of the cells. All exposed lichens, regardless of the optical filters used, showed nearly the same photosynthetic activity after the flight as measured before the flight. Likewise, the multimicroscopy approach revealed no detectable ultrastructural changes in most of the algal and fungal cells of the lichen thalli, though a greater proportion of cells in the flight samples had compromised membranes, as revealed by the LIVE/DEAD BacLight Bacterial Viability Kit. These findings indicate that most lichenized fungal and algal cells can survive in space after full exposure to massive UV and cosmic radiation, conditions proven to be lethal to bacteria and other microorganisms. The lichen upper cortex seems to provide adequate protection against solar radiation. Moreover, after extreme dehydration induced by high vacuum, the lichens proved to be able to recover, in full, their metabolic activity within 24 hours.

  15. Measurement of sea ice backscatter characteristics at 36 GHz using the surface contour radar

    NASA Technical Reports Server (NTRS)

    Fedor, L. S.; Walsh, E. J.

    1985-01-01

    Scattering studies of sea ice off the coast of Greenland were performed in January 1984 using the 36-GHz Surface Contour Radar (SCR) aboard the NASA P-3 aircraft. An oscillating mirror scans an actual half-power width of 0.96 degrees laterally to measure the surface at 51 evenly spaced points. By banking the aircraft, real-time topographical mapping and relative backscattered power are obtained at incidence angles between 0 and 30 degrees off-nadar, achieving at 175 m altitude a 2.9 by 4.4 m spatial resolution at nadir. With an aircraft ground speed of 100 m/s, 5-m successive scan line spacing and 1.8-m cross-track direction spacing is provided. By circling the aircraft in the 15 degree bank, the azimuthal anisotropy of the scattering is investigated along with the incidence angle dependence.

  16. Automated eye blink detection and correction method for clinical MR eye imaging.

    PubMed

    Wezel, Joep; Garpebring, Anders; Webb, Andrew G; van Osch, Matthias J P; Beenakker, Jan-Willem M

    2017-07-01

    To implement an on-line monitoring system to detect eye blinks during ocular MRI using field probes, and to reacquire corrupted k-space lines by means of an automatic feedback system integrated with the MR scanner. Six healthy subjects were scanned on a 7 Tesla MRI whole-body system using a custom-built receive coil. Subjects were asked to blink multiple times during the MR-scan. The local magnetic field changes were detected with an external fluorine-based field probe which was positioned close to the eye. When an eye blink produced a field shift greater than a threshold level, this was communicated in real time to the MR system, which immediately reacquired the motion-corrupted k-space lines. The uncorrected images, using the original motion-corrupted data, showed severe artifacts, whereas the corrected images, using the reacquired data, provided an image quality similar to images acquired without blinks. Field probes can successfully detect eye blinks during MRI scans. By automatically reacquiring the eye blink-corrupted data, high quality MR-images of the eye can be acquired. Magn Reson Med 78:165-171, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  17. Development of a protocol to quantify local bone adaptation over space and time: Quantification of reproducibility.

    PubMed

    Lu, Yongtao; Boudiffa, Maya; Dall'Ara, Enrico; Bellantuono, Ilaria; Viceconti, Marco

    2016-07-05

    In vivo micro-computed tomography (µCT) scanning of small rodents is a powerful method for longitudinal monitoring of bone adaptation. However, the life-time bone growth in small rodents makes it a challenge to quantify local bone adaptation. Therefore, the aim of this study was to develop a protocol, which can take into account large bone growth, to quantify local bone adaptations over space and time. The entire right tibiae of eight 14-week-old C57BL/6J female mice were consecutively scanned four times in an in vivo µCT scanner using a nominal isotropic image voxel size of 10.4 µm. The repeated scan image datasets were aligned to the corresponding baseline (first) scan image dataset using rigid registration. 80% of tibia length (starting from the endpoint of the proximal growth plate) was selected as the volume of interest and partitioned into 40 regions along the tibial long axis (10 divisions) and in the cross-section (4 sectors). The bone mineral content (BMC) was used to quantify bone adaptation and was calculated in each region. All local BMCs have precision errors (PE%CV) of less than 3.5% (24 out of 40 regions have PE%CV of less than 2%), least significant changes (LSCs) of less than 3.8%, and 38 out of 40 regions have intraclass correlation coefficients (ICCs) of over 0.8. The proposed protocol makes it possible to quantify local bone adaptations over an entire tibia in longitudinal studies with high reproducibility, an essential requirement for reducing the number of animals needed to achieve the necessary statistical power. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
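
    The reproducibility metrics named in this abstract (PE%CV and LSC) follow from a short calculation; the sketch below uses hypothetical repeated-scan data, and the factor 2.77 is the usual 95%-confidence multiplier for comparing two measurements.

    ```python
    import numpy as np

    def precision_error_cv(measurements):
        """Root-mean-square coefficient of variation (PE%CV) across subjects,
        each row holding repeated measurements of one quantity (e.g. regional
        BMC), and the least significant change LSC = 2.77 * PE%CV."""
        m = np.asarray(measurements, dtype=float)
        cv = m.std(axis=1, ddof=1) / m.mean(axis=1) * 100.0   # per-subject CV%
        pe = np.sqrt(np.mean(cv ** 2))                        # RMS across subjects
        return pe, 2.77 * pe

    # hypothetical data: 8 mice, 4 repeated scans of one regional BMC value
    rng = np.random.default_rng(4)
    true_bmc = rng.uniform(1.0, 2.0, size=(8, 1))
    scans = true_bmc * (1 + rng.normal(scale=0.015, size=(8, 4)))
    print(precision_error_cv(scans))
    ```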

  18. Diagnostic criteria for multiple sclerosis: 2010 Revisions to the McDonald criteria

    PubMed Central

    Polman, Chris H; Reingold, Stephen C; Banwell, Brenda; Clanet, Michel; Cohen, Jeffrey A; Filippi, Massimo; Fujihara, Kazuo; Havrdova, Eva; Hutchinson, Michael; Kappos, Ludwig; Lublin, Fred D; Montalban, Xavier; O'Connor, Paul; Sandberg-Wollheim, Magnhild; Thompson, Alan J; Waubant, Emmanuelle; Weinshenker, Brian; Wolinsky, Jerry S

    2011-01-01

    New evidence and consensus has led to further revision of the McDonald Criteria for diagnosis of multiple sclerosis. The use of imaging for demonstration of dissemination of central nervous system lesions in space and time has been simplified, and in some circumstances dissemination in space and time can be established by a single scan. These revisions simplify the Criteria, preserve their diagnostic sensitivity and specificity, address their applicability across populations, and may allow earlier diagnosis and more uniform and widespread use. Ann Neurol 2011 PMID:21387374

  19. Quantum integrable systems from conformal blocks

    NASA Astrophysics Data System (ADS)

    Chen, Heng-Yu; Qualls, Joshua D.

    2017-05-01

    In this note, we extend the striking connections between quantum integrable systems and conformal blocks recently found in [M. Isachenkov and V. Schomerus, Phys. Rev. Lett. 117, 071602 (2016), 10.1103/PhysRevLett.117.071602] in several directions. First, we explicitly demonstrate that the action of the quartic conformal Casimir operator on general d-dimensional scalar conformal blocks can be expressed in terms of certain combinations of commuting integrals of motions of the two particle hyperbolic BC2 Calogero-Sutherland system. The permutation and reflection properties of the underlying Dunkl operators play crucial roles in establishing such a connection. Next, we show that the scalar superconformal blocks in superconformal field theories (SCFTs) with four and eight supercharges and suitable chirality constraints can also be identified with the eigenfunctions of the same Calogero-Sutherland system; this demonstrates the universality of such a connection. Finally, we observe that the so-called "seed" conformal blocks for constructing four point functions for operators with arbitrary space-time spins in four-dimensional CFTs can also be linearly expanded in terms of Calogero-Sutherland eigenfunctions.

  20. On the efficiency of sovereign bond markets

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Fernández Bariviera, Aurelio; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2012-09-01

    The existence of memory in financial time series has been extensively studied for several stock markets around the world by means of different approaches. However, fixed income markets, i.e. those where corporate and sovereign bonds are traded, have been much less studied. We believe that, given the relevance of these markets, not only from the investors', but also from the issuers' point of view (government and firms), it is necessary to fill this gap in the literature. In this paper, we study the sovereign market efficiency of thirty bond indices of both developed and emerging countries, using an innovative statistical tool in the financial literature: the complexity-entropy causality plane. This representation space allows us to establish an efficiency ranking of different markets and distinguish different bond market dynamics. We conclude that the classification derived from the complexity-entropy causality plane is consistent with the qualifications assigned by major rating companies to the sovereign instruments. Additionally, we find a correlation between permutation entropy, economic development and market size that could be of interest for policy makers and investors.
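
    Permutation entropy, one axis of the complexity-entropy causality plane used in this study, has a compact definition that can be sketched directly (Bandt-Pompe ordinal patterns; a minimal illustration, not the authors' implementation).

    ```python
    import numpy as np
    from math import factorial, log

    def permutation_entropy(x, m=4, tau=1, normalize=True):
        """Bandt-Pompe permutation entropy of a 1-D series x with embedding
        dimension m and delay tau.  Minimal sketch."""
        x = np.asarray(x, dtype=float)
        counts = {}
        for i in range(len(x) - (m - 1) * tau):
            pattern = tuple(np.argsort(x[i:i + m * tau:tau]))   # ordinal pattern
            counts[pattern] = counts.get(pattern, 0) + 1
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        H = float(-(p * np.log(p)).sum())
        return H / log(factorial(m)) if normalize else H

    # white noise is near 1; a monotone (perfectly predictable) series is 0
    rng = np.random.default_rng(0)
    print(permutation_entropy(rng.normal(size=5000)))   # close to 1
    print(permutation_entropy(np.arange(5000.0)))       # 0.0
    ```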

  1. How to think about indiscernible particles

    NASA Astrophysics Data System (ADS)

    Giglio, Daniel Joseph

    Permutation symmetries which arise in quantum mechanics pose an intriguing problem. It is not clear that particles which exhibit permutation symmetries (i.e. particles which are indiscernible, meaning that they can be swapped with each other without this yielding a new physical state) qualify as "objects" in any reasonable sense of the term. One solution to this puzzle, which I attribute to W.V. Quine, would have us eliminate such particles from our ontology altogether in order to circumvent the metaphysical vexations caused by permutation symmetries. In this essay I argue that Quine's solution is too rash, and in its place I suggest a novel solution based on altering some of the language of quantum mechanics. Before launching into the technical details of indiscernible particles, however, I begin this essay with some remarks on the methodology -- instrumentalism -- which motivates my arguments.

  2. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kaixuan; Wang, Jun

    2017-02-01

    In this paper, recently introduced permutation entropy and sample entropy are further developed to the fractional cases, weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional order generalization of information entropy is utilized in the above two complexity approaches, to detect the statistical characteristics of fractional order information in complex systems. The effectiveness analysis of proposed methods on the synthetic data and the real-world data reveals that tuning the fractional order allows a high sensitivity and more accurate characterization to the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the numerical research on nonlinear complexity behaviors is compared between the returns series of Potts financial model and the actual stock markets. And the empirical results confirm the feasibility of the proposed model.

  3. Testing and validation of multi-lidar scanning strategies for wind energy applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Bonin, Timothy A.; Klein, Petra M.

    Several factors cause lidars to measure different values of turbulence than an anemometer on a tower, including volume averaging, instrument noise, and the use of a scanning circle to estimate the wind field. One way to avoid the use of a scanning circle is to deploy multiple scanning lidars and point them toward the same volume in space to collect velocity measurements and extract high-resolution turbulence information. This paper explores the use of two multi-lidar scanning strategies, the tri-Doppler technique and the virtual tower technique, for measuring 3-D turbulence. In Summer 2013, a vertically profiling Leosphere WindCube lidar and three Halo Photonics Streamline lidars were operated at the Southern Great Plains Atmospheric Radiation Measurement site to test these multi-lidar scanning strategies. During the first half of the field campaign, all three scanning lidars were pointed at approximately the same point in space and a tri-Doppler analysis was completed to calculate the three-dimensional wind vector every second. Next, all three scanning lidars were used to build a “virtual tower” above the WindCube lidar. Results indicate that the tri-Doppler technique measures higher values of horizontal turbulence than the WindCube lidar under stable atmospheric conditions, reduces variance contamination under unstable conditions, and can measure high-resolution profiles of mean wind speed and direction. The virtual tower technique provides adequate turbulence information under stable conditions but cannot capture the full temporal variability of turbulence experienced under unstable conditions because of the time needed to readjust the scans.
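
    The core of the tri-Doppler retrieval can be sketched as a small linear solve (hypothetical beam geometry; real processing must also handle noise, timing offsets and pointing uncertainty): each lidar contributes one radial velocity along a known unit pointing vector, and three non-coplanar beams determine the 3-D wind vector.

    ```python
    import numpy as np

    def tri_doppler_wind(beam_dirs, v_radial):
        """Recover the 3-D wind vector from three line-of-sight velocities
        measured along known pointing directions.  Sketch only."""
        A = np.asarray(beam_dirs, dtype=float)
        A /= np.linalg.norm(A, axis=1, keepdims=True)   # rows: unit pointing vectors
        return np.linalg.solve(A, np.asarray(v_radial, dtype=float))

    # hypothetical geometry: three lidars staring at the same point in space
    beams = [[1.0, 0.0, 0.2], [0.0, 1.0, 0.2], [0.7, 0.7, 0.5]]
    true_wind = np.array([5.0, -2.0, 0.3])
    vr = [np.dot(b / np.linalg.norm(b), true_wind) for b in beams]
    print(tri_doppler_wind(beams, vr))   # recovers roughly [5, -2, 0.3]
    ```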

  4. System for routing messages in a vertex symmetric network by using addresses formed from permutations of the transmission line indicees

    DOEpatents

    Faber, Vance; Moore, James W.

    1992-01-01

    A network of interconnected processors is formed from a vertex symmetric graph selected from graphs Γ_d(k) with degree d, diameter k, and (d+1)!/(d-k+1)! processors for each d ≥ k and Γ_d(k,-1) with degree d-1, diameter k+1, and (d+1)!/(d-k+1)! processors for each d ≥ k ≥ 4. Each processor has an address formed by one of the permutations from a predetermined sequence of letters chosen a selected number of letters at a time, and an extended address formed by appending to the address the remaining ones of the predetermined sequence of letters. A plurality of transmission channels is provided from each of the processors, where each processor has one less channel than the selected number of letters forming the sequence. Where a network Γ_d(k,-1) is provided, no processor has a channel connected to form an edge in a direction δ_1. Each of the channels has an identification number selected from the sequence of letters and connected from a first processor having a first extended address to a second processor having a second address formed from a second extended address defined by moving to the front of the first extended address the letter found in the position within the first extended address defined by the channel identification number. The second address is then formed by selecting the first elements of the second extended address corresponding to the selected number used to form the address permutations.
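
    The address rule quoted in this abstract can be rendered as a toy routine (an illustration only; the 1-based channel indexing and the choice of which positions are legal are assumptions here): following channel i from a node moves the letter in position i of the node's extended address to the front, and the new address is the leading letters of the result.

    ```python
    def next_address(extended_address, channel, address_len):
        """Follow a channel from a node: move the letter at the (1-indexed)
        position given by the channel number to the front of the extended
        address; the first address_len letters form the new address."""
        letters = list(extended_address)
        moved = letters.pop(channel - 1)
        new_extended = "".join([moved] + letters)
        return new_extended[:address_len], new_extended

    # hypothetical 4-letter alphabet with 2-letter addresses
    addr, ext = next_address("BACD", channel=3, address_len=2)
    print(addr, ext)   # CB CBAD
    ```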

  5. Dynamic Testing and Automatic Repair of Reconfigurable Wiring Harnesses

    DTIC Science & Technology

    2006-11-27

    Switch: An M×N grid of switches configured to provide an M-input, N-output routing network. Permutation Network: A permutation network performs an...wiring reduces the effective advantage of their reduced switch count, particularly when considering that regular grids (crossbar switches being a...are connected to. The outline circuit shown in Fig. 20 shows how a suitable ‘discovery probe’ might be implemented. The circuit shows a UART

  6. Tolerance of a Knotted Near-Infrared Fluorescent Protein to Random Circular Permutation.

    PubMed

    Pandey, Naresh; Kuypers, Brianna E; Nassif, Barbara; Thomas, Emily E; Alnahhas, Razan N; Segatori, Laura; Silberg, Jonathan J

    2016-07-12

    Bacteriophytochrome photoreceptors (BphP) are knotted proteins that have been developed as near-infrared fluorescent protein (iRFP) reporters of gene expression. To explore how rearrangements in the peptides that interlace into the knot within the BphP photosensory core affect folding, we subjected iRFPs to random circular permutation using an improved transposase mutagenesis strategy and screened for variants that fluoresce. We identified 27 circularly permuted iRFPs that display biliverdin-dependent fluorescence in Escherichia coli. The variants with the brightest whole cell fluorescence initiated translation at residues near the domain linker and knot tails, although fluorescent variants that initiated translation within the PAS and GAF domains were discovered. Circularly permuted iRFPs retained sufficient cofactor affinity to fluoresce in tissue culture without the addition of biliverdin, and one variant displayed enhanced fluorescence when expressed in bacteria and tissue culture. This variant displayed a quantum yield similar to that of iRFPs but exhibited increased resistance to chemical denaturation, suggesting that the observed increase in the magnitude of the signal arose from more efficient protein maturation. These results show how the contact order of a knotted BphP can be altered without disrupting chromophore binding and fluorescence, an important step toward the creation of near-infrared biosensors with expanded chemical sensing functions for in vivo imaging.

  7. Tolerance of a knotted near infrared fluorescent protein to random circular permutation

    PubMed Central

    Pandey, Naresh; Kuypers, Brianna E.; Nassif, Barbara; Thomas, Emily E.; Alnahhas, Razan N.; Segatori, Laura; Silberg, Jonathan J.

    2016-01-01

    Bacteriophytochrome photoreceptors (BphP) are knotted proteins that have been developed as near-infrared fluorescent protein (iRFP) reporters of gene expression. To explore how rearrangements in the peptides that interlace into the knot within the BphP photosensory core affect folding, we subjected iRFP to random circular permutation using an improved transposase mutagenesis strategy and screened for variants that fluoresce. We identified twenty seven circularly permuted iRFP that display biliverdin-dependent fluorescence in Escherichia coli. The variants with the brightest whole cell fluorescence initiated translation at residues near the domain linker and knot tails, although fluorescent variants were discovered that initiated translation within the PAS and GAF domains. Circularly permuted iRFP retained sufficient cofactor affinity to fluoresce in tissue culture without the addition of biliverdin, and one variant displayed enhanced fluorescence when expressed in bacteria and tissue culture. This variant displayed a similar quantum yield as iRFP, but exhibited increased resistance to chemical denaturation, suggesting that the observed signal increase arose from more efficient protein maturation. These results show how the contact order of a knotted BphP can be altered without disrupting chromophore binding and fluorescence, an important step towards the creation of near-infrared biosensors with expanded chemical-sensing functions for in vivo imaging. PMID:27304983

  8. Plasma level-dependent effects of methylphenidate on task-related functional magnetic resonance imaging signal changes.

    PubMed

    Müller, Ulrich; Suckling, J; Zelaya, F; Honey, G; Faessel, H; Williams, S C R; Routledge, C; Brown, J; Robbins, T W; Bullmore, E T

    2005-08-01

    Methylphenidate (MPH) is a dopamine and noradrenaline enhancing drug used to treat attentional deficits. Understanding of its cognition-enhancing effects and the neurobiological mechanisms involved, especially in elderly people, is currently incomplete. The aim of this study was to investigate the relationship between MPH plasma levels and brain activation during visuospatial attention and movement preparation. Twelve healthy elderly volunteers were scanned twice using functional magnetic resonance imaging (fMRI) after oral administration of MPH 20 mg or placebo in a within-subject design. The cognitive paradigm was a four-choice reaction time task presented at two levels of difficulty (with and without spatial cue). Plasma MPH levels were measured at six time points between 30 and 205 min after dosing. FMRI data were analysed using a linear model to estimate physiological response to the task and nonparametric permutation tests for inference. Lateral premotor and medial posterior parietal cortical activation was increased by MPH, on average, over both levels of task difficulty. There was considerable intersubject variability in the pharmacokinetics of MPH. Greater area under the plasma concentration-time curve was positively correlated with strength of activation in motor and premotor cortex, temporoparietal cortex and caudate nucleus during the difficult version of the task. This is the first pharmacokinetic/pharmacodynamic study to find an association between plasma levels of MPH and its modulatory effects on brain activation measured using fMRI. The results suggest that catecholaminergic mechanisms may be important in brain adaptivity to task difficulty and in task-specific recruitment of spatial attention systems.

  9. A method for quantitative analysis of clump thickness in cervical cytology slides.

    PubMed

    Fan, Yilun; Bradley, Andrew P

    2016-01-01

    Knowledge of the spatial distribution and thickness of cytology specimens is critical to the development of digital slide acquisition techniques that minimise both scan times and image file size. In this paper, we evaluate a novel method to achieve this goal utilising an exhaustive high-resolution scan, an over-complete wavelet transform across multi-focal planes and a clump segmentation of all cellular materials on the slide. The method is demonstrated with a quantitative analysis of ten normal, but difficult to scan Pap stained, Thin-prep, cervical cytology slides. We show that with this method the top and bottom of the specimen can be estimated to an accuracy of 1 μm in 88% and 97% of the fields of view respectively. Overall, cellular material can be over 30 μm thick and the distribution of cells is skewed towards the cover-slip (top of the slide). However, the median clump thickness is 10 μm and only 31% of clumps contain more than three nuclei. Therefore, by finding a focal map of the specimen the number of 1 μm spaced focal planes that are required to be scanned to acquire 95% of the in-focus material can be reduced from 25.4 to 21.4 on average. In addition, we show that by considering the thickness of the specimen, an improved focal map can be produced which further reduces the required number of 1 μm spaced focal planes to 18.6. This has the potential to reduce scan times and raw image data by over 25%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. SBIR Technology Applications to Space Communications and Navigation (SCaN)

    NASA Technical Reports Server (NTRS)

    Liebrecht, Phil; Eblen, Pat; Rush, John; Tzinis, Irene

    2010-01-01

    This slide presentation reviews the mission of the Space Communications and Navigation (SCaN) Office with particular emphasis on opportunities for technology development with SBIR companies. The SCaN office manages NASA's space communications and navigation networks: the Near Earth Network (NEN), the Space Network (SN), and the Deep Space Network (DSN). The SCaN networks nodes are shown on a world wide map and the networks are described. Two types of technologies are described: Pull technology, and Push technologies. A listing of technology themes is presented, with a discussion on Software defined Radios, Optical Communications Technology, and Lunar Lasercom Space Terminal (LLST). Other technologies that are being investigated are some Game Changing Technologies (GCT) i.e., technologies that offer the potential for improving comm. or nav. performance to the point that radical new mission objectives are possible, such as Superconducting Quantum Interference Filters, Silicon Nanowire Optical Detectors, and Auto-Configuring Cognitive Communications

  11. Detecting space-time disease clusters with arbitrary shapes and sizes using a co-clustering approach.

    PubMed

    Ullah, Sami; Daud, Hanita; Dass, Sarat C; Khan, Habib Nawaz; Khalil, Alamgir

    2017-11-06

    The ability to detect potential space-time clusters in spatio-temporal data on disease occurrences is necessary for conducting surveillance and implementing disease prevention policies. Most existing techniques use geometrically shaped (circular, elliptical or square) scanning windows to discover disease clusters. In certain situations, where the disease occurrences tend to cluster in very irregularly shaped areas, these algorithms are not feasible in practice for the detection of space-time clusters. To address this problem, a new algorithm is proposed, which uses a co-clustering strategy to detect prospective and retrospective space-time disease clusters with no restriction on shape and size. The proposed method detects space-time disease clusters by tracking the changes in space-time occurrence structure instead of an in-depth search over space. This method was utilised to detect potential clusters in the annual and monthly malaria data in Khyber Pakhtunkhwa Province, Pakistan from 2012 to 2016, visualising the results on a heat map. The results of the annual data analysis showed that the most likely hotspot emerged in three sub-regions in the years 2013-2014. The most likely hotspots in monthly data appeared from July to October in each year and showed a strong periodic trend.

  12. Laser speckle reduction due to spatial and angular diversity introduced by fast scanning micromirror.

    PubMed

    Akram, M Nadeem; Tong, Zhaomin; Ouyang, Guangmin; Chen, Xuyuan; Kartashov, Vladimir

    2010-06-10

    We utilize spatial and angular diversity to achieve speckle reduction in laser illumination. Both free-space and imaging geometry configurations are considered. A fast two-dimensional scanning micromirror is employed to steer the laser beam. A simple experimental setup is built to demonstrate the application of our technique in a two-dimensional laser picture projection. Experimental results show that the speckle contrast factor can be reduced down to 5% within the integration time of the detector.

  13. A High Resolution Genome-Wide Scan for Significant Selective Sweeps: An Application to Pooled Sequence Data in Laying Chickens

    PubMed Central

    Qanbari, Saber; Strom, Tim M.; Haberer, Georg; Weigend, Steffen; Gheyas, Almas A.; Turner, Frances; Burt, David W.; Preisinger, Rudolf; Gianola, Daniel; Simianer, Henner

    2012-01-01

    In most studies aimed at localizing footprints of past selection, outliers at tails of the empirical distribution of a given test statistic are assumed to reflect locus-specific selective forces. Significance cutoffs are subjectively determined, rather than being related to a clear set of hypotheses. Here, we define an empirical p-value for the summary statistic by means of a permutation method that uses the observed SNP structure in the real data. To illustrate the methodology, we applied our approach to a panel of 2.9 million autosomal SNPs identified from re-sequencing a pool of 15 individuals from a brown egg layer line. We scanned the genome for local reductions in heterozygosity, suggestive of selective sweeps. We also employed a modified sliding window approach that accounts for gaps in the sequence and increases scanning resolution by moving the overlapping windows by steps of one SNP only, and suggest to call this a “creeping window” strategy. The approach confirmed selective sweeps in the region of previously described candidate genes, i.e. TSHR, PRL, PRLHR, INSR, LEPR, IGF1, and NRAMP1 when used as positive controls. The genome scan revealed 82 distinct regions with strong evidence of selection (genome-wide p-value<0.001), including genes known to be associated with eggshell structure and immune system such as CALB1 and GAL cluster, respectively. A substantial proportion of signals was found in poor gene content regions including the most extreme signal on chromosome 1. The observation of multiple signals in a highly selected layer line of chicken is consistent with the hypothesis that egg production is a complex trait controlled by many genes. PMID:23209582
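
    The "creeping window" idea (overlapping windows advanced one SNP at a time) combined with an empirical permutation p-value can be sketched as follows; this simplified version permutes per-SNP heterozygosity values and therefore ignores the SNP correlation structure that the authors' permutation scheme preserves.

    ```python
    import numpy as np

    def creeping_window_scan(het, window=40, n_perm=1000, seed=0):
        """Slide a window one SNP at a time over per-SNP heterozygosity and
        return each window's mean plus an empirical p-value for being
        unusually low.  Simplified sketch; gaps and linkage are ignored."""
        rng = np.random.default_rng(seed)
        het = np.asarray(het, dtype=float)
        kernel = np.ones(window) / window
        obs = np.convolve(het, kernel, mode="valid")        # window step = 1 SNP
        null = np.empty((n_perm, obs.size))
        for b in range(n_perm):
            null[b] = np.convolve(rng.permutation(het), kernel, mode="valid")
        pvals = (np.sum(null <= obs, axis=0) + 1) / (n_perm + 1)
        return obs, pvals

    # toy data: a sweep-like dip in heterozygosity around SNPs 500-540
    rng = np.random.default_rng(3)
    het = rng.uniform(0.2, 0.5, size=1000)
    het[500:540] *= 0.2
    obs, p = creeping_window_scan(het)
    print(int(np.argmin(p)), float(p.min()))
    ```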

  14. Interplay effects in proton scanning for lung: a 4D Monte Carlo study assessing the impact of tumor and beam delivery parameters.

    PubMed

    Dowdell, S; Grassberger, C; Sharp, G C; Paganetti, H

    2013-06-21

    Relative motion between a tumor and a scanning proton beam results in a degradation of the dose distribution (interplay effect). This study investigates the relationship between beam scanning parameters and the interplay effect, with the goal of finding parameters that minimize interplay. 4D Monte Carlo simulations of pencil beam scanning proton therapy treatments were performed using the 4DCT geometry of five lung cancer patients of varying tumor size (50.4-167.1 cc) and motion amplitude (2.9-30.1 mm). Treatments were planned assuming delivery in 35 × 2.5 Gy(RBE) fractions. The spot size, time to change the beam energy (τes), time required for magnet settling (τss), initial breathing phase, spot spacing, scanning direction, scanning speed, beam current and patient breathing period were varied for each of the five patients. Simulations were performed for a single fraction and an approximation of conventional fractionation. For the patients considered, the interplay effect could not be predicted using the superior-inferior motion amplitude alone. Larger spot sizes (σ ~ 9-16 mm) were less susceptible to interplay, giving an equivalent uniform dose (EUD) of 99.0 ± 4.4% (1 standard deviation) in a single fraction compared to 86.1 ± 13.1% for smaller spots (σ ~ 2-4 mm). The smaller spot sizes gave EUD values as low as 65.3% of the prescription dose in a single fraction. Reducing the spot spacing improved the target dose homogeneity. The initial breathing phase can have a significant effect on the interplay, particularly for shorter delivery times. No clear benefit was evident when scanning either parallel or perpendicular to the predominant axis of motion. Longer breathing periods decreased the EUD. In general, longer delivery times led to lower interplay effects. Conventional fractionation showed significant improvement in terms of interplay, giving a EUD of at least 84.7% and 100.0% of the prescription dose for the small and larger spot sizes respectively. The interplay effect is highly patient specific, depending on the motion amplitude, tumor location and the delivery parameters. Large degradations of the dose distribution in a single fraction were observed, but improved significantly using conventional fractionation.

  15. Interplay effects in proton scanning for lung: A 4D Monte Carlo study assessing the impact of tumor and beam delivery parameters

    PubMed Central

    Dowdell, S; Grassberger, C; Sharp, G C; Paganetti, H

    2013-01-01

    Relative motion between a tumor and a scanning proton beam results in a degradation of the dose distribution (interplay effect). This study investigates the relationship between beam scanning parameters and the interplay effect, with the goal of finding parameters that minimize interplay. 4D Monte Carlo simulations of pencil beam scanning proton therapy treatments were performed using the 4DCT geometry of 5 lung cancer patients of varying tumor size (50.4–167.1cc) and motion amplitude (2.9–30.1mm). Treatments were planned assuming delivery in 35×2.5Gy(RBE) fractions. The spot size, time to change the beam energy (τes), time required for magnet settling (τss), initial breathing phase, spot spacing, scanning direction, scanning speed, beam current and patient breathing period were varied for each of the 5 patients. Simulations were performed for a single fraction and an approximation of conventional fractionation. For the patients considered, the interplay effect could not be predicted using the superior-inferior (SI) motion amplitude alone. Larger spot sizes (σ ~9–16mm) were less susceptible to interplay, giving an equivalent uniform dose (EUD) of 99.0±4.4% (1 standard deviation) in a single fraction compared to 86.1±13.1% for smaller spots (σ ~2–4mm). The smaller spot sizes gave EUD values as low as 65.3% of the prescription dose in a single fraction. Reducing the spot spacing improved the target dose homogeneity. The initial breathing phase can have a significant effect on the interplay, particularly for shorter delivery times. No clear benefit was evident when scanning either parallel or perpendicular to the predominant axis of motion. Longer breathing periods decreased the EUD. In general, longer delivery times led to lower interplay effects. Conventional fractionation showed significant improvement in terms of interplay, giving a EUD of at least 84.7% and 100.0% of the prescription dose for the small and larger spot sizes respectively. The interplay effect is highly patient specific, depending on the motion amplitude, tumor location and the delivery parameters. Large degradations of the dose distribution in a single fraction were observed, but improved significantly using conventional fractionation. PMID:23689035

  16. Pore-scale Simulation and Imaging of Multi-phase Flow and Transport in Porous Media (Invited)

    NASA Astrophysics Data System (ADS)

    Crawshaw, J.; Welch, N.; Daher, I.; Yang, J.; Shah, S.; Grey, F.; Boek, E.

    2013-12-01

    We combine multi-scale imaging and computer simulation of multi-phase flow and reactive transport in rock samples to enhance our fundamental understanding of long term CO2 storage in rock formations. The imaging techniques include Confocal Laser Scanning Microscopy (CLSM), micro-CT and medical CT scanning, with spatial resolutions ranging from sub-micron to mm respectively. First, we report a new sample preparation technique to study micro-porosity in carbonates using CLSM in 3 dimensions. Second, we use micro-CT scanning to generate high resolution 3D pore space images of carbonate and cap rock samples. In addition, we employ micro-CT to image the processes of evaporation in fractures and cap rock degradation due to exposure to CO2 flow. Third, we use medical CT scanning to image spontaneous imbibition in carbonate rock samples. Our imaging studies are complemented by computer simulations of multi-phase flow and transport, using the 3D pore space images obtained from the scanning experiments. We have developed a massively parallel lattice-Boltzmann (LB) code to calculate the single phase flow field in these pore space images. The resulting flow fields are then used to calculate hydrodynamic dispersion using a novel scheme to predict probability distributions for molecular displacements using the LB method and a streamline algorithm, modified for optimal solid boundary conditions. We calculate solute transport on pore-space images of rock cores with increasing degree of heterogeneity: a bead pack, Bentheimer sandstone and Portland carbonate. We observe that for homogeneous rock samples, such as bead packs, the displacement distribution remains Gaussian with time increasing. In the more heterogeneous rocks, on the other hand, the displacement distribution develops a stagnant part. We observe that the fraction of trapped solute increases from the beadpack (0 %) to Bentheimer sandstone (1.5 %) to Portland carbonate (8.1 %), in excellent agreement with PFG-NMR experiments. We then use our preferred multi-phase model to directly calculate flow in pore space images of two different sandstones and observe excellent agreement with experimental relative permeabilities. Also we calculate cluster size distributions in good agreement with experimental studies. Our analysis shows that the simulations are able to predict both multi-phase flow and transport properties directly on large 3D pore space images of real rocks. [Figure: pore space images (left) and velocity distributions (right); Yang and Boek, 2013]

  17. Time and space integrating acousto-optic folded spectrum processing for SETI

    NASA Technical Reports Server (NTRS)

    Wagner, K.; Psaltis, D.

    1986-01-01

    Time and space integrating folded spectrum techniques utilizing acousto-optic devices (AOD) as 1-D input transducers are investigated for a potential application as wideband, high resolution, large processing gain spectrum analyzers in the search for extra-terrestrial intelligence (SETI) program. The space integrating Fourier transform performed by a lens channels the coarse spectral components diffracted from an AOD onto an array of time integrating narrowband fine resolution spectrum analyzers. The pulsing action of a laser diode samples the interferometrically detected output, aliasing the fine resolution components to baseband, as required for the subsequent charge coupled devices (CCD) processing. The raster scan mechanism incorporated into the readout of the CCD detector array is used to unfold the 2-D transform, reproducing the desired high resolution Fourier transform of the input signal.

  18. In-Space Networking on NASA's SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Eddy, Wesley M.; Clark, Gilbert J.; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios and a flight computer for supporting in-space communication research. New technologies being studied using the SCaN Testbed include advanced networking, coding, and modulation protocols designed to support the transition of NASA's mission systems from primarily point-to-point data links and preplanned routes towards adaptive, autonomous internetworked operations needed to meet future mission objectives. Networking protocols implemented on the SCaN Testbed include the Advanced Orbiting Systems (AOS) link-layer protocol, Consultative Committee for Space Data Systems (CCSDS) Encapsulation Packets, Internet Protocol (IP), Space Link Extension (SLE), CCSDS File Delivery Protocol (CFDP), and Delay-Tolerant Networking (DTN) protocols including the Bundle Protocol (BP) and Licklider Transmission Protocol (LTP). The SCaN Testbed end-to-end system provides three S-band data links and one Ka-band data link to exchange space and ground data through NASA's Tracking Data Relay Satellite System or a direct-to-ground link to ground stations. The multiple data links and nodes provide several upgradable elements on both the space and ground systems. This paper will provide a general description of the testbed's system design and capabilities, discuss in detail the design and lessons learned in the implementation of the network protocols, and describe future plans for continuing research to meet the communication needs for evolving global space systems.

  19. General Rotorcraft Aeromechanical Stability Program (GRASP) - Theory Manual

    DTIC Science & Technology

    1990-10-01

    the A basis. Two symbols frequently encountered in vector operations that use index notation are the Kronecker delta δ_ij and the Levi-Civita epsilon...Blade root cutout; ε_ijk Levi-Civita epsilon permutation symbol; θ pretwist angle; θ′ pretwist per unit length; θ_i Tait-Bryan angles; κ_i moment strains...the components of the identity tensor in a Cartesian coordinate system, while the Levi-Civita epsilon consists of components of the permutation

  20. Testing of Error-Correcting Sparse Permutation Channel Codes

    NASA Technical Reports Server (NTRS)

    Shcheglov, Kirill, V.; Orlov, Sergei S.

    2008-01-01

    A computer program performs Monte Carlo direct numerical simulations for testing sparse permutation channel codes, which offer strong error-correction capabilities at high code rates and are considered especially suitable for storage of digital data in holographic and volume memories. A word in a code of this type is characterized by, among other things, a sparseness parameter (M) and a fixed number (K) of 1 or "on" bits in a channel block length of N.
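
    The Monte Carlo scaffolding for exercising such codes might look like the sketch below (an assumption-laden illustration: the actual code construction and the error-correcting decoder are not described in the abstract and are omitted); it only draws random K-of-N sparse words, passes them through a binary symmetric channel, and counts error-free receptions.

    ```python
    import numpy as np

    def monte_carlo_channel_trials(N=64, K=4, flip_prob=0.01, n_trials=10000, seed=0):
        """Draw random words with exactly K 'on' bits out of N, flip each bit
        with probability flip_prob, and report the fraction received intact.
        Scaffolding only; no code construction or decoding is performed."""
        rng = np.random.default_rng(seed)
        intact = 0
        for _ in range(n_trials):
            word = np.zeros(N, dtype=int)
            word[rng.choice(N, size=K, replace=False)] = 1
            received = word ^ (rng.random(N) < flip_prob)
            intact += np.array_equal(received, word)
        return intact / n_trials

    print(monte_carlo_channel_trials())   # fraction of words received error-free
    ```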

  1. Scrambled Sobol Sequences via Permutation

    DTIC Science & Technology

    2009-01-01

    [Figure 3: class diagram of generator classes (LCG, LCG64, LFG, MLFG, PMLCG, Sobol), scramblers (Scrambler, PermutationScrambler, LinearScrambler) and factories (PermutationFactory, StaticFactory, DynamicFactory)]...Phy., 19:252–256, 1979. [2] Emanouil I. Atanassov. A new efficient algorithm for generating the scrambled Sobol’ sequence. In NMA ’02: Revised Papers...Deidre W. Evan, and Michael Mascagni. On the scrambled Sobol sequence. In ICCS2005, pages 775–782, 2005. [7] Richard Durstenfeld. Algorithm 235: Random

  2. A trajectory planning scheme for spacecraft in the space station environment. M.S. Thesis - University of California

    NASA Technical Reports Server (NTRS)

    Soller, Jeffrey Alan; Grunwald, Arthur J.; Ellis, Stephen R.

    1991-01-01

    Simulated annealing is used to solve a minimum fuel trajectory problem in the space station environment. The environment is special because the space station will define a multivehicle environment in space. The optimization surface is a complex nonlinear function of the initial conditions of the chase and target crafts. Small permutations in the input conditions can result in abrupt changes to the optimization surface. Since no prior knowledge about the number or location of local minima on the surface is available, the optimization must be capable of functioning on a multimodal surface. It was reported in the literature that the simulated annealing algorithm is more effective on such surfaces than descent techniques using random starting points. The simulated annealing optimization was found to be capable of identifying a minimum fuel, two-burn trajectory subject to four constraints which are integrated into the optimization using a barrier method. The computations required to solve the optimization are fast enough that missions could be planned on board the space station. Potential applications for on board planning of missions are numerous. Future research topics may include optimal planning of multi-waypoint maneuvers using a knowledge base to guide the optimization, and a study aimed at developing robust annealing schedules for potential on board missions.
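
    A generic simulated-annealing loop with a logarithmic barrier for one inequality constraint gives the flavor of the optimization described above (a sketch only; the placeholder cost and constraint stand in for the fuel cost and the four trajectory constraints of the thesis).

    ```python
    import numpy as np

    def simulated_annealing(cost, constraint, x0, n_iter=20000, T0=1.0,
                            alpha=0.9995, step=0.1, mu=0.1, seed=0):
        """Minimize cost(x) subject to constraint(x) <= 0, enforced with a
        logarithmic barrier.  Generic sketch; x0 must be feasible."""
        rng = np.random.default_rng(seed)

        def total(x):
            g = constraint(x)
            if g >= 0.0:                        # infeasible candidate: reject
                return np.inf
            return cost(x) - mu * np.log(-g)    # barrier grows near the boundary

        x = np.asarray(x0, dtype=float)
        f, T = total(x), T0
        for _ in range(n_iter):
            cand = x + rng.normal(scale=step, size=x.shape)
            fc = total(cand)
            # Metropolis rule: always keep improvements, sometimes keep worse moves.
            if fc < f or rng.random() < np.exp(-(fc - f) / T):
                x, f = cand, fc
            T *= alpha                          # geometric cooling schedule
        return x, f

    # toy multimodal surface with one constraint: x0 + x1 >= 1
    cost = lambda x: np.sin(3 * x[0]) * np.cos(3 * x[1]) + 0.1 * np.sum(x ** 2)
    constraint = lambda x: 1.0 - (x[0] + x[1])  # feasible when <= 0
    print(simulated_annealing(cost, constraint, x0=[2.0, 2.0]))
    ```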

  3. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis

    PubMed Central

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-01-01

    This paper presents the local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), also known as LMD-MPE, to investigate the rolling element bearing (REB) fault diagnosis from measured vibration signals. First, the LMD decomposed the vibration data or acceleration measurement into separate product functions that are composed of both amplitude and frequency modulation. MPE then calculated the statistical permutation entropy from the product functions to extract the nonlinear features to assess and classify the condition of the healthy and damaged REB system. The comparative experimental results of the conventional LMD-based multi-scale entropy and MPE were presented to verify the authenticity of the proposed technique. The study found that LMD-MPE’s integrated approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results of REB experimental datasets show that the proposed approach yields more vigorous outcomes than existing methods. PMID:29690526
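
    For readers unfamiliar with the permutation entropy step, the sketch below computes ordinal-pattern (Bandt-Pompe) permutation entropy of a one-dimensional signal. It covers only the entropy calculation; the LMD decomposition and the multi-scale coarse-graining used in the paper are not reproduced, and the embedding order and delay shown are illustrative defaults.

      import math
      import random
      from collections import Counter

      def permutation_entropy(signal, order=3, delay=1, normalize=True):
          """Bandt-Pompe permutation entropy of a 1-D sequence."""
          counts = Counter()
          n_windows = len(signal) - (order - 1) * delay
          for i in range(n_windows):
              window = [signal[i + j * delay] for j in range(order)]
              pattern = tuple(sorted(range(order), key=lambda k: window[k]))  # ordinal pattern
              counts[pattern] += 1
          total = sum(counts.values())
          h = -sum((c / total) * math.log(c / total) for c in counts.values())
          return h / math.log(math.factorial(order)) if normalize else h

      noise = [random.random() for _ in range(2000)]   # irregular signal -> entropy near 1
      ramp = list(range(2000))                         # monotone signal  -> entropy near 0
      print(permutation_entropy(noise), permutation_entropy(ramp))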

  4. A Weak Quantum Blind Signature with Entanglement Permutation

    NASA Astrophysics Data System (ADS)

    Lou, Xiaoping; Chen, Zhigang; Guo, Ying

    2015-09-01

    Motivated by the permutation encryption algorithm, a weak quantum blind signature (QBS) scheme is proposed. It involves three participants, including the sender Alice, the signatory Bob and the trusted entity Charlie, in four phases, i.e., initializing phase, blinding phase, signing phase and verifying phase. In a small-scale quantum computation network, Alice blinds the message based on a quantum entanglement permutation encryption algorithm that embraces the chaotic position string. Bob signs the blinded message with private parameters shared beforehand while Charlie verifies the signature's validity and recovers the original message. Analysis shows that the proposed scheme achieves the secure blindness for the signer and traceability for the message owner with the aid of the authentic arbitrator who plays a crucial role when a dispute arises. In addition, the signature can neither be forged nor disavowed by the malicious attackers. It has a wide application to E-voting and E-payment system, etc.

  5. Phase Transitions in Definite Total Spin States of Two-Component Fermi Gases.

    PubMed

    Yurovsky, Vladimir A

    2017-05-19

    Second-order phase transitions have no latent heat and are characterized by a change in symmetry. In addition to the conventional symmetric and antisymmetric states under permutations of bosons and fermions, mathematical group-representation theory allows for non-Abelian permutation symmetry. Such symmetry can be hidden in states with defined total spins of spinor gases, which can be formed in optical cavities. The present work shows that the symmetry reveals itself in spin-independent or coordinate-independent properties of these gases, namely as non-Abelian entropy in thermodynamic properties. In weakly interacting Fermi gases, two phases appear associated with fermionic and non-Abelian symmetry under permutations of particle states, respectively. The second-order transitions between the phases are characterized by discontinuities in specific heat. Unlike other phase transitions, the present ones are not caused by interactions and can appear even in ideal gases. Similar effects in Bose gases and strong interactions are discussed.

  6. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis.

    PubMed

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-04-21

    This paper presents the local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), also known as LMD-MPE, to investigate the rolling element bearing (REB) fault diagnosis from measured vibration signals. First, the LMD decomposed the vibration data or acceleration measurement into separate product functions that are composed of both amplitude and frequency modulation. MPE then calculated the statistical permutation entropy from the product functions to extract the nonlinear features to assess and classify the condition of the healthy and damaged REB system. The comparative experimental results of the conventional LMD-based multi-scale entropy and MPE were presented to verify the authenticity of the proposed technique. The study found that LMD-MPE’s integrated approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results of REB experimental datasets show that the proposed approach yields more vigorous outcomes than existing methods.

  7. An extended continuous estimation of distribution algorithm for solving the permutation flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Shao, Zhongshi; Pi, Dechang; Shao, Weishi

    2017-11-01

    This article proposes an extended continuous estimation of distribution algorithm (ECEDA) to solve the permutation flow-shop scheduling problem (PFSP). In ECEDA, to make a continuous estimation of distribution algorithm (EDA) suitable for the PFSP, the largest order value rule is applied to convert continuous vectors to discrete job permutations. A probabilistic model based on a mixed Gaussian and Cauchy distribution is built to maintain the exploration ability of the EDA. Two effective local search methods, i.e. revolver-based variable neighbourhood search and Hénon chaotic-based local search, are designed and incorporated into the EDA to enhance the local exploitation. The parameters of the proposed ECEDA are calibrated by means of a design of experiments approach. Simulation results and comparisons based on some benchmark instances show the efficiency of the proposed algorithm for solving the PFSP.
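
    The largest order value rule mentioned above admits a one-line illustration: positions of the continuous vector are ranked by value, and the ranking is read off as a job permutation. The sketch below is a plain restatement of that idea, not code from the article.

      def largest_order_value(vector):
          """Map a continuous vector to a job permutation: largest value is scheduled first."""
          return sorted(range(len(vector)), key=lambda i: vector[i], reverse=True)

      print(largest_order_value([0.3, 1.7, 0.9, 0.1]))   # -> [1, 2, 0, 3]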

  8. Non-Cartesian Parallel Imaging Reconstruction

    PubMed Central

    Wright, Katherine L.; Hamilton, Jesse I.; Griswold, Mark A.; Gulani, Vikas; Seiberlich, Nicole

    2014-01-01

    Non-Cartesian parallel imaging has played an important role in reducing data acquisition time in MRI. The use of non-Cartesian trajectories can enable more efficient coverage of k-space, which can be leveraged to reduce scan times. These trajectories can be undersampled to achieve even faster scan times, but the resulting images may contain aliasing artifacts. Just as Cartesian parallel imaging can be employed to reconstruct images from undersampled Cartesian data, non-Cartesian parallel imaging methods can mitigate aliasing artifacts by using additional spatial encoding information in the form of the non-homogeneous sensitivities of multi-coil phased arrays. This review will begin with an overview of non-Cartesian k-space trajectories and their sampling properties, followed by an in-depth discussion of several selected non-Cartesian parallel imaging algorithms. Three representative non-Cartesian parallel imaging methods will be described, including Conjugate Gradient SENSE (CG SENSE), non-Cartesian GRAPPA, and Iterative Self-Consistent Parallel Imaging Reconstruction (SPIRiT). After a discussion of these three techniques, several potential promising clinical applications of non-Cartesian parallel imaging will be covered. PMID:24408499

  9. Time domain structures in a colliding magnetic flux rope experiment

    NASA Astrophysics Data System (ADS)

    Tang, Shawn Wenjie; Gekelman, Walter; Dehaas, Timothy; Vincena, Steve; Pribyl, Patrick

    2017-10-01

    Electron phase-space holes, regions of positive potential on the scale of the Debye length, have been observed in auroras as well as in laboratory experiments. These potential structures, also known as Time Domain Structures (TDS), are packets of intense electric field spikes that have significant components parallel to the local magnetic field. In an ongoing investigation at UCLA, TDS were observed on the surface of two magnetized flux ropes produced within the Large Plasma Device (LAPD). A barium oxide (BaO) cathode was used to produce an 18 m long magnetized plasma column and a lanthanum hexaboride (LaB6) source was used to create 11 m long kink unstable flux ropes. Using two probes capable of measuring the local electric and magnetic fields, correlation analysis was performed on tens of thousands of these structures and their propagation velocities, probability distribution function and spatial distribution were determined. The TDS became abundant as the flux ropes collided and appear to emanate from the reconnection region in between them. In addition, a preliminary analysis of the permutation entropy and statistical complexity of the data suggests that the TDS signals may be chaotic in nature. Work done at the Basic Plasma Science Facility (BaPSF) at UCLA which is supported by DOE and NSF.

  10. Computation of viscous blast wave flowfields

    NASA Technical Reports Server (NTRS)

    Atwood, Christopher A.

    1991-01-01

    A method to determine unsteady solutions of the Navier-Stokes equations was developed and applied. The structured finite-volume, approximately factored implicit scheme uses Newton subiterations to obtain the spatially and temporally second-order accurate time history of the interaction of blast-waves with stationary targets. The inviscid flux is evaluated using MacCormack's modified Steger-Warming flux or Roe flux difference splittings with total variation diminishing limiters, while the viscous flux is computed using central differences. The use of implicit boundary conditions in conjunction with a telescoping in time and space method permitted solutions to this strongly unsteady class of problems. Comparisons of numerical, analytical, and experimental results were made in two and three dimensions. These comparisons revealed accurate wave speed resolution with nonoscillatory discontinuity capturing. The purpose of this effort was to address the three-dimensional, viscous blast-wave problem. Test cases were undertaken to reveal these methods' weaknesses in three regimes: (1) viscous-dominated flow; (2) complex unsteady flow; and (3) three-dimensional flow. Comparisons of these computations to analytic and experimental results provided initial validation of the resultant code. Additional details on the numerical method and on the validation can be found in the appendix. Presently, the code is capable of single zone computations with selection of any permutation of solid wall or flow-through boundaries.

  11. Virtobot--a multi-functional robotic system for 3D surface scanning and automatic post mortem biopsy.

    PubMed

    Ebert, Lars Christian; Ptacek, Wolfgang; Naether, Silvio; Fürst, Martin; Ross, Steffen; Buck, Ursula; Weber, Stefan; Thali, Michael

    2010-03-01

    The Virtopsy project, a multi-disciplinary project that involves forensic science, diagnostic imaging, computer science, automation technology, telematics and biomechanics, aims to develop new techniques to improve the outcome of forensic investigations. This paper presents a new approach in the field of minimally invasive virtual autopsy for a versatile robotic system that is able to perform three-dimensional (3D) surface scans as well as post mortem image-guided soft tissue biopsies. The system consists of an industrial six-axis robot with additional extensions (i.e. a linear axis to increase working space, a tool-changing system and a dedicated safety system), a multi-slice CT scanner with equipment for angiography, a digital photogrammetry and 3D optical surface-scanning system, a 3D tracking system, and a biopsy end effector for automatic needle placement. A wax phantom was developed for biopsy accuracy tests. Surface scanning times were significantly reduced (scanning times cut in half, calibration three times faster). The biopsy module worked with an accuracy of 3.2 mm. Using the Virtobot, the surface-scanning procedure could be standardized and accelerated. The biopsy module is accurate enough for use in biopsies in a forensic setting. The Virtobot can be utilized for several independent tasks in the field of forensic medicine, and is sufficiently versatile to be adapted to different tasks in the future. (c) 2009 John Wiley & Sons, Ltd.

  12. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Yin, Fang-Fang; Chetty, Indrin J

    2013-04-21

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of spiral CT scan-scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the 'ISource = 8: Phase-Space Source Incident from Multiple Directions' in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate the spiral CT scanning in a single simulation run accurately. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.
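
    The table-movement idea, expressing isocenter position as a function of cumulative gantry angle, can be sketched as below. The variable names and the assumption that couch travel per rotation equals pitch times the collimated beam width are illustrative and are not the DOSXYZnrc source variables.

      import math

      def isocenter_z(angle_rad, z_start_cm, pitch, beam_width_cm):
          """Couch (isocenter) z position after a cumulative gantry rotation of angle_rad."""
          travel_per_rotation_cm = pitch * beam_width_cm
          return z_start_cm + travel_per_rotation_cm * angle_rad / (2.0 * math.pi)

      # Two full rotations with pitch 1.0 and 1 cm collimation move the couch by 2 cm.
      print(isocenter_z(4.0 * math.pi, z_start_cm=0.0, pitch=1.0, beam_width_cm=1.0))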

  13. Complexity analysis of the turbulent environmental fluid flow time series

    NASA Astrophysics Data System (ADS)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.

  14. Completing the gaps in Kilauea's Father's Day InSAR displacement signature with ScanSAR

    NASA Astrophysics Data System (ADS)

    Bertran Ortiz, A.; Pepe, A.; Lanari, R.; Lundgren, P.; Rosen, P. A.

    2009-12-01

    Currently there are gaps in the known displacement signature obtained with InSAR at Kilauea between 2002 and 2009. InSAR data can be richer than GPS because of denser spatial cover. However, to better model rapidly varying and non-steady geophysical events InSAR is limited because of its less dense time observations of the area under study. The ScanSAR mode currently available in several satellites mitigates this effect because the satellite may illuminate a given area more than once within an orbit cycle. The Kilauea displacement graph below from Instituto per Il Rilevamento Electromagnetico dell'Ambiente (IREA) is a cut in space of the displacement signature obtained from a time series of several stripmap-to-stripmap interferograms. It shows that critical information is missing, especially between 2006 and 2007. The displacement is expected to be non-linear judging from the 2007-2008 displacement signature, thus simple interpolation would not suffice. The gap can be filled by incorporating Envisat stripmap-to-ScanSAR interferograms available during that time period. We propose leveraging JPL's new ROI-PAC ScanSAR module to create stripmap-to-ScanSAR interferograms. The new interferograms will be added to the stripmap ones in order to extend the existing stripmap time series generated by using the Small BAseline Subset (SBAS) technique. At AGU we will present denser graphs that better capture Kilauea's displacement between 2003 and 2009.

  15. An Improved Source-Scanning Algorithm for Locating Earthquake Clusters or Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Liao, Y.; Kao, H.; Hsu, S.

    2010-12-01

    The Source-scanning Algorithm (SSA) was originally introduced in 2004 to locate non-volcanic tremors. Its application was later expanded to the identification of earthquake rupture planes and the near-real-time detection and monitoring of landslides and mud/debris flows. In this study, we further improve SSA for the purpose of locating earthquake clusters or aftershock sequences when only a limited number of waveform observations are available. The main improvements include the application of a ground motion analyzer to separate P and S waves, the automatic determination of resolution based on the grid size and time step of the scanning process, and a modified brightness function to utilize constraints from multiple phases. Specifically, the improved SSA (named as ISSA) addresses two major issues related to locating earthquake clusters/aftershocks. The first one is the massive amount of both time and labour to locate a large number of seismic events manually. And the second one is to efficiently and correctly identify the same phase across the entire recording array when multiple events occur closely in time and space. To test the robustness of ISSA, we generate synthetic waveforms consisting of 3 separated events such that individual P and S phases arrive at different stations in different order, thus making correct phase picking nearly impossible. Using these very complicated waveforms as the input, the ISSA scans all model space for possible combination of time and location for the existence of seismic sources. The scanning results successfully associate various phases from each event at all stations, and correctly recover the input. To further demonstrate the advantage of ISSA, we apply it to the waveform data collected by a temporary OBS array for the aftershock sequence of an offshore earthquake southwest of Taiwan. The overall signal-to-noise ratio is inadequate for locating small events; and the precise arrival times of P and S phases are difficult to determine. We use one of the largest aftershocks that can be located by conventional methods as our reference event to calibrate the controlling parameters of ISSA. These parameters include the overall Vp/Vs ratio (because a precise S velocity model was unavailable), the length of scanning time window, and the weighting factor for each station. Our results show that ISSA is not only more efficient in locating earthquake clusters/aftershocks, but also capable of identifying many events missed by conventional phase-picking methods.
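
    The core of the scanning step can be sketched as a brightness function: for a candidate source location and origin time, normalized waveform amplitudes are stacked at the arrival times predicted for every station. The travel-time handling and equal station weighting below are simplifying assumptions, not the ISSA implementation.

      def brightness(traces, sample_rate, travel_times_s, origin_sample):
          """Stack normalized |amplitude| of each station's trace at its predicted arrival sample."""
          total = 0.0
          for trace, tt in zip(traces, travel_times_s):
              idx = origin_sample + int(round(tt * sample_rate))
              if 0 <= idx < len(trace):
                  total += trace[idx]
          return total / len(traces)

      def scan(traces, sample_rate, travel_time_table, origin_samples):
          """Grid search: return the (grid point, origin sample) pair with maximum brightness."""
          best = None
          for point, travel_times_s in travel_time_table.items():
              for t0 in origin_samples:
                  b = brightness(traces, sample_rate, travel_times_s, t0)
                  if best is None or b > best[0]:
                      best = (b, point, t0)
          return best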

  16. Applications of asynoptic space-time Fourier transform methods to scanning satellite measurements

    NASA Technical Reports Server (NTRS)

    Lait, Leslie R.; Stanford, John L.

    1988-01-01

    A method proposed by Salby (1982) for computing the zonal space-time Fourier transform of asynoptically acquired satellite data is discussed. The method and its relationship to other techniques are briefly described, and possible problems in applying it to real data are outlined. Examples of results obtained using this technique are given which demonstrate its sensitivity to small-amplitude signals. A number of waves are found which have previously been observed as well as two not heretofore reported. A possible extension of the method which could increase temporal and longitudinal resolution is described.

  17. Development of facial sexual dimorphism in children aged between 12 and 15 years: a three-dimensional longitudinal study.

    PubMed

    Koudelová, J; Brůžek, J; Cagáňová, V; Krajíček, V; Velemínská, J

    2015-08-01

    To evaluate sexual dimorphism of facial form and shape and to describe differences between the average female and male face from 12 to 15 years. Overall 120 facial scans from healthy Caucasian children (17 boys, 13 girls) were longitudinally evaluated over a 4-year period between the ages of 12 and 15 years. Facial surface scans were obtained using a three-dimensional optical scanner Vectra-3D. Variation in facial shape and form was evaluated using geometric morphometric and statistical methods (DCA, PCA and permutation test). Average faces were superimposed, and the changes were evaluated using colour-coded maps. There were no significant sex differences (p > 0.05) in shape in any age category and no differences in form in the 12- and 13-year-olds, as the female faces were within the area of male variability. From the age of 14, a slight separation occurred, which was statistically confirmed. The differences were mainly associated with size. Generally boys had more prominent eyebrow ridges, more deeply set eyes, a flatter cheek area, and a more prominent nose and chin area. The development of facial sexual dimorphism during pubertal growth is connected with ontogenetic allometry. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
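
    The permutation test referred to above can be illustrated with a generic two-group version on a scalar summary measure; the synthetic data and the mean-difference statistic below are assumptions for illustration, not the geometric-morphometric statistic used in the study.

      import random

      def permutation_test(group_a, group_b, n_perm=10000, seed=0):
          """Two-sided permutation p-value for a difference in group means."""
          rng = random.Random(seed)
          mean = lambda xs: sum(xs) / len(xs)
          observed = abs(mean(group_a) - mean(group_b))
          pooled = list(group_a) + list(group_b)
          n_a = len(group_a)
          hits = 0
          for _ in range(n_perm):
              rng.shuffle(pooled)
              if abs(mean(pooled[:n_a]) - mean(pooled[n_a:])) >= observed:
                  hits += 1
          return (hits + 1) / (n_perm + 1)   # add-one correction keeps p > 0

      rng = random.Random(1)
      boys = [rng.gauss(1.0, 0.3) for _ in range(17)]
      girls = [rng.gauss(0.8, 0.3) for _ in range(13)]
      print(permutation_test(boys, girls))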

  18. Characterization of complexities in combustion instability in a lean premixed gas-turbine model combustor.

    PubMed

    Gotoda, Hiroshi; Amano, Masahito; Miyano, Takaya; Ikawa, Takuya; Maki, Koshiro; Tachibana, Shigeru

    2012-12-01

    We characterize complexities in combustion instability in a lean premixed gas-turbine model combustor by nonlinear time series analysis to evaluate permutation entropy, fractal dimensions, and short-term predictability. The dynamic behavior in combustion instability near lean blowout exhibits a self-affine structure and is ascribed to fractional Brownian motion. It undergoes chaos by the onset of combustion oscillations with slow amplitude modulation. Our results indicate that nonlinear time series analysis is capable of characterizing complexities in combustion instability close to lean blowout.

  19. 3D sensitivity encoded ellipsoidal MR spectroscopic imaging of gliomas at 3T☆

    PubMed Central

    Ozturk-Isik, Esin; Chen, Albert P.; Crane, Jason C.; Bian, Wei; Xu, Duan; Han, Eric T.; Chang, Susan M.; Vigneron, Daniel B.; Nelson, Sarah J.

    2010-01-01

    Purpose The goal of this study was to implement time efficient data acquisition and reconstruction methods for 3D magnetic resonance spectroscopic imaging (MRSI) of gliomas at a field strength of 3T using parallel imaging techniques. Methods The point spread functions, signal to noise ratio (SNR), spatial resolution, metabolite intensity distributions and Cho:NAA ratio of 3D ellipsoidal, 3D sensitivity encoding (SENSE) and 3D combined ellipsoidal and SENSE (e-SENSE) k-space sampling schemes were compared with conventional k-space data acquisition methods. Results The 3D SENSE and e-SENSE methods resulted in similar spectral patterns as the conventional MRSI methods. The Cho:NAA ratios were highly correlated (P<.05 for SENSE and P<.001 for e-SENSE) with the ellipsoidal method and all methods exhibited significantly different spectral patterns in tumor regions compared to normal appearing white matter. The geometry factors ranged between 1.2 and 1.3 for both the SENSE and e-SENSE spectra. When corrected for these factors and for differences in data acquisition times, the empirical SNRs were similar to values expected based upon theoretical grounds. The effective spatial resolution of the SENSE spectra was estimated to be same as the corresponding fully sampled k-space data, while the spectra acquired with ellipsoidal and e-SENSE k-space samplings were estimated to have a 2.36–2.47-fold loss in spatial resolution due to the differences in their point spread functions. Conclusion The 3D SENSE method retained the same spatial resolution as full k-space sampling but with a 4-fold reduction in scan time and an acquisition time of 9.28 min. The 3D e-SENSE method had a similar spatial resolution as the corresponding ellipsoidal sampling with a scan time of 4:36 min. Both parallel imaging methods provided clinically interpretable spectra with volumetric coverage and adequate SNR for evaluating Cho, Cr and NAA. PMID:19766422

  20. Potential energy surface fitting by a statistically localized, permutationally invariant, local interpolating moving least squares method for the many-body potential: Method and application to N₄

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, Jason D.; Doraiswamy, Sriram; Candler, Graham V., E-mail: truhlar@umn.edu, E-mail: candler@aem.umn.edu

    2014-02-07

    Fitting potential energy surfaces to analytic forms is an important first step for efficient molecular dynamics simulations. Here, we present an improved version of the local interpolating moving least squares method (L-IMLS) for such fitting. Our method has three key improvements. First, pairwise interactions are modeled separately from many-body interactions. Second, permutational invariance is incorporated in the basis functions, using permutationally invariant polynomials in Morse variables, and in the weight functions. Third, computational cost is reduced by statistical localization, in which we statistically correlate the cutoff radius with data point density. We motivate our discussion in this paper with a review of global and local least-squares-based fitting methods in one dimension. Then, we develop our method in six dimensions, and we note that it allows the analytic evaluation of gradients, a feature that is important for molecular dynamics. The approach, which we call statistically localized, permutationally invariant, local interpolating moving least squares fitting of the many-body potential (SL-PI-L-IMLS-MP, or, more simply, L-IMLS-G2), is used to fit a potential energy surface to an electronic structure dataset for N₄. We discuss its performance on the dataset and give directions for further research, including applications to trajectory calculations.

  1. Single scan parameterization of space-variant point spread functions in image space via a printed array: the impact for two PET/CT scanners.

    PubMed

    Kotasidis, F A; Matthews, J C; Angelis, G I; Noonan, P J; Jackson, A; Price, P; Lionheart, W R; Reader, A J

    2011-05-21

    Incorporation of a resolution model during statistical image reconstruction often produces images of improved resolution and signal-to-noise ratio. A novel and practical methodology to rapidly and accurately determine the overall emission and detection blurring component of the system matrix using a printed point source array within a custom-made Perspex phantom is presented. The array was scanned at different positions and orientations within the field of view (FOV) to examine the feasibility of extrapolating the measured point source blurring to other locations in the FOV and the robustness of measurements from a single point source array scan. We measured the spatially-variant image-based blurring on two PET/CT scanners, the B-Hi-Rez and the TruePoint TrueV. These measured spatially-variant kernels and the spatially-invariant kernel at the FOV centre were then incorporated within an ordinary Poisson ordered subset expectation maximization (OP-OSEM) algorithm and compared to the manufacturer's implementation using projection space resolution modelling (RM). Comparisons were based on a point source array, the NEMA IEC image quality phantom, the Cologne resolution phantom and two clinical studies (carbon-11 labelled anti-sense oligonucleotide [(11)C]-ASO and fluorine-18 labelled fluoro-l-thymidine [(18)F]-FLT). Robust and accurate measurements of spatially-variant image blurring were successfully obtained from a single scan. Spatially-variant resolution modelling resulted in notable resolution improvements away from the centre of the FOV. Comparison between spatially-variant image-space methods and the projection-space approach (the first such report, using a range of studies) demonstrated very similar performance with our image-based implementation producing slightly better contrast recovery (CR) for the same level of image roughness (IR). These results demonstrate that image-based resolution modelling within reconstruction is a valid alternative to projection-based modelling, and that, when using the proposed practical methodology, the necessary resolution measurements can be obtained from a single scan. This approach avoids the relatively time-consuming and involved procedures previously proposed in the literature.

  2. Group-theoretic models of the inversion process in bacterial genomes.

    PubMed

    Egri-Nagy, Attila; Gebhardt, Volker; Tanaka, Mark M; Francis, Andrew R

    2014-07-01

    The variation in genome arrangements among bacterial taxa is largely due to the process of inversion. Recent studies indicate that not all inversions are equally probable, suggesting, for instance, that shorter inversions are more frequent than longer, and those that move the terminus of replication are less probable than those that do not. Current methods for establishing the inversion distance between two bacterial genomes are unable to incorporate such information. In this paper we suggest a group-theoretic framework that in principle can take these constraints into account. In particular, we show that by lifting the problem from circular permutations to the affine symmetric group, the inversion distance can be found in polynomial time for a model in which inversions are restricted to acting on two regions. This requires the proof of new results in group theory, and suggests a vein of new combinatorial problems concerning permutation groups on which group theorists will be needed to collaborate with biologists. We apply the new method to inferring distances and phylogenies for published Yersinia pestis data.
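
    As a concrete picture of the elementary operation being modelled, the sketch below applies a single inversion to a genome represented as a permutation of region labels. The group-theoretic machinery of the paper (lifting circular permutations to the affine symmetric group and restricting inversions to two regions) is not reproduced here.

      def apply_inversion(genome, i, j):
          """Reverse the segment of regions between indices i and j (inclusive)."""
          g = list(genome)
          g[i:j + 1] = reversed(g[i:j + 1])
          return g

      genome = [1, 2, 3, 4, 5, 6]
      print(apply_inversion(genome, 1, 3))   # -> [1, 4, 3, 2, 5, 6]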

  3. Genetic variations in the serotonergic system contribute to amygdala volume in humans.

    PubMed

    Li, Jin; Chen, Chunhui; Wu, Karen; Zhang, Mingxia; Zhu, Bi; Chen, Chuansheng; Moyzis, Robert K; Dong, Qi

    2015-01-01

    The amygdala plays a critical role in emotion processing and psychiatric disorders associated with emotion dysfunction. Accumulating evidence suggests that amygdala structure is modulated by serotonin-related genes. However, there is a gap between the small contributions of single loci (less than 1%) and the reported 63-65% heritability of amygdala structure. To understand the "missing heritability," we systematically explored the contribution of serotonin genes on amygdala structure at the gene set level. The present study of 417 healthy Chinese volunteers examined 129 representative polymorphisms in genes from multiple biological mechanisms in the regulation of serotonin neurotransmission. A system-level approach using multiple regression analyses identified that nine SNPs collectively accounted for approximately 8% of the variance in amygdala volume. Permutation analyses showed that the probability of obtaining these findings by chance was low (p = 0.043, permuted for 1000 times). Findings showed that serotonin genes contribute moderately to individual differences in amygdala volume in a healthy Chinese sample. These results indicate that the system-level approach can help us to understand the genetic basis of a complex trait such as amygdala structure.

  4. Monitoring the informational efficiency of European corporate bond markets with dynamical permutation min-entropy

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Bariviera, Aurelio F.; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2016-08-01

    In this paper the permutation min-entropy has been implemented to unveil the presence of temporal structures in the daily values of European corporate bond indices from April 2001 to August 2015. More precisely, the informational efficiency evolution of the prices of fifteen sectorial indices has been carefully studied by estimating this information-theory-derived symbolic tool over a sliding time window. Such a dynamical analysis makes possible to obtain relevant conclusions about the effect that the 2008 credit crisis has had on the different European corporate bond sectors. It is found that the informational efficiency of some sectors, namely banks, financial services, insurance, and basic resources, has been strongly reduced due to the financial crisis whereas another set of sectors, integrated by chemicals, automobiles, media, energy, construction, industrial goods & services, technology, and telecommunications has only suffered a transitory loss of efficiency. Last but not least, the food & beverage, healthcare, and utilities sectors show a behavior close to a random walk practically along all the period of analysis, confirming a remarkable immunity against the 2008 financial crisis.

  5. Algorithm for transforming the coordinates of lunar objects while changing from various coordinate systems into the selenocentric one

    NASA Astrophysics Data System (ADS)

    Mazurova, Elena; Mikhaylov, Aleksandr

    2013-04-01

    The selenocentric network of objects that defines the coordinate system on the Moon, with its origin at the centre of mass and its axes directed along the principal axes of inertia, can become one of the basic elements of coordinate-time support for lunar navigation using cartographic materials and control objects. The powerful array of highly precise, multiparameter information obtained by modern space vehicles makes it possible to establish Lunar Reference Frames (LRF) of essentially different accuracy. Here, a special role is played by the results of scanning the lunar surface by the American Lunar Reconnaissance Orbiter (LRO) mission. The coordinates of points calculated only from laser-scanning results define positions with respect to each other accurately enough, but the real accuracy of the spatial tie can be checked, and the coordinates improved, only through a network of points whose coordinates are computed both from laser scanning and from other methods, for example terrestrial laser location or space photogrammetry. The paper presents the algorithm for transforming selenocentric coordinate systems and the accuracy estimation of changing from one lunar coordinate system to another. Keywords: selenocentric coordinate system, coordinate-time support.
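
    A change between two nearly aligned selenocentric frames is commonly written as a seven-parameter (Helmert-type) similarity transform; the small-angle sketch below shows that form. The parameter values are placeholders, since the abstract does not give the actual frame-tie parameters.

      def helmert_transform(point, translation, small_rotations_rad, scale=1.0):
          """x' = T + scale * R(small angles) * x, with R linearized for tiny rotations."""
          rx, ry, rz = small_rotations_rad
          x, y, z = point
          xr = x + rz * y - ry * z
          yr = -rz * x + y + rx * z
          zr = ry * x - rx * y + z
          tx, ty, tz = translation
          return (tx + scale * xr, ty + scale * yr, tz + scale * zr)

      # Placeholder parameters: kilometre coordinates, metre-level offsets, micro-radian rotations.
      print(helmert_transform((1737.4, 0.0, 0.0), (0.001, -0.002, 0.0005), (1e-6, 2e-6, -1e-6)))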

  6. A synthetic aperture radio telescope for ICME observations as a potential payload of SPORT

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Sun, W.; Liu, H.; Xiong, M.; Liu, Y. D.; Wu, J.

    2013-12-01

    We introduce a potential payload for the Solar Polar ORbit Telescope (SPORT), a space weather mission proposed by the National Space Science Center, Chinese Academy of Sciences. This is a synthetic aperture radio imager designed to detect radio emissions from interplanetary coronal mass ejections (ICMEs), which is expected to be an important instrument to monitor the propagation and evolution of ICMEs. The radio telescope applies a synthetic aperture interferometric technique to measure the brightness temperature of ICMEs. Theoretical calculations of the brightness temperature utilizing statistical properties of ICMEs and the background solar wind indicate that ICMEs within 0.35 AU from the Sun are detectable by a radio telescope at a frequency <= 150 MHz with a sensitivity of <=1 K. The telescope employs a time shared double rotation scan (also called a clock scan), where two coplanar antennas revolve around a fixed axis at different radius and speed, to fulfill sampling of the brightness temperature. An array of 4+4 elements with opposite scanning directions are developed for the radio telescope to achieve the required sensitivity (<=1K) within the imaging refreshing time (~30 minutes). This scan scheme is appropriate for a three-axis stabilized spacecraft platform while keeping a good sampling pattern. We also discuss how we select the operating frequency, which involves a trade-off between the engineering feasibility and the scientific goal. Our preliminary results indicate that the central frequency of 150 MHz with a bandwidth of 20 MHz, which requires arm lengths of the two groups of 14m and 16m, respectively, gives an angular resolution of 2°, a field of view of ×25° around the Sun, and a time resolution of 30 minutes.

  7. Rapid and Accurate Multiple Testing Correction and Power Estimation for Millions of Correlated Markers

    PubMed Central

    Han, Buhm; Kang, Hyun Min; Eskin, Eleazar

    2009-01-01

    With the development of high-throughput sequencing and genotyping technologies, the number of markers collected in genetic association studies is growing rapidly, increasing the importance of methods for correcting for multiple hypothesis testing. The permutation test is widely considered the gold standard for accurate multiple testing correction, but it is often computationally impractical for these large datasets. Recently, several studies proposed efficient alternative approaches to the permutation test based on the multivariate normal distribution (MVN). However, they cannot accurately correct for multiple testing in genome-wide association studies for two reasons. First, these methods require partitioning of the genome into many disjoint blocks and ignore all correlations between markers from different blocks. Second, the true null distribution of the test statistic often fails to follow the asymptotic distribution at the tails of the distribution. We propose an accurate and efficient method for multiple testing correction in genome-wide association studies—SLIDE. Our method accounts for all correlation within a sliding window and corrects for the departure of the true null distribution of the statistic from the asymptotic distribution. In simulations using the Wellcome Trust Case Control Consortium data, the error rate of SLIDE's corrected p-values is more than 20 times smaller than the error rate of the previous MVN-based methods' corrected p-values, while SLIDE is orders of magnitude faster than the permutation test and other competing methods. We also extend the MVN framework to the problem of estimating the statistical power of an association study with correlated markers and propose an efficient and accurate power estimation method SLIP. SLIP and SLIDE are available at http://slide.cs.ucla.edu. PMID:19381255
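
    For contrast with the MVN approach, the sketch below shows the permutation-based "gold standard" it approximates: phenotype labels are permuted, the maximum statistic across all (correlated) markers is recorded, and family-wise corrected p-values follow from how often that maximum exceeds each observed statistic. The toy covariance statistic and data layout are assumptions for illustration.

      import random

      def max_statistic_pvalues(genotypes, phenotype, n_perm=1000, seed=1):
          """genotypes: list of marker vectors; phenotype: list of trait values (same length)."""
          rng = random.Random(seed)

          def stat(marker, pheno):
              n = len(pheno)
              mx, my = sum(marker) / n, sum(pheno) / n
              return abs(sum((marker[i] - mx) * (pheno[i] - my) for i in range(n)))

          observed = [stat(g, phenotype) for g in genotypes]
          exceed = [0] * len(genotypes)
          labels = list(phenotype)
          for _ in range(n_perm):
              rng.shuffle(labels)
              m = max(stat(g, labels) for g in genotypes)
              for j, s in enumerate(observed):
                  if m >= s:
                      exceed[j] += 1
          return [(e + 1) / (n_perm + 1) for e in exceed]   # family-wise corrected p-values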

  8. UniScan technology for innovative laboratory at a university for acquisition data from space in real-time

    NASA Astrophysics Data System (ADS)

    Gershenzon, V.; Gershenzon, O.; Sergeeva, M.; Ippolitov, V.; Targulyan, O.

    2012-04-01

    Keywords: Remote Sensing, UniScan ground station, Education, Monitoring. Remote sensing centers within universities that allow real-time imagery acquisition from Earth-observing satellites provide a proper environment for innovative education. They deliver efficient training for scientific, academic and teaching personnel, secure the role of young professionals in science, education and high technology, and maintain the continuity of generations in science and education. The article is based on experience in creating such centers at more than 20 higher education institutions in Russia, Kazakhstan, and Spain on the basis of the UniScan ground station by R&D Center ScanEx. These stations serve as the basis for Earth monitoring from space, providing training and advanced training to produce specialists with state-of-the-art knowledge in Earth remote sensing and GIS, as well as land-use monitoring and geo-data services for economic operators in such diverse areas as natural resource management, agriculture, land property management, and disaster monitoring. Currently our proposal of UniScan for universities all over the world allows them to receive low-resolution MODIS data free of charge from the Terra and Aqua satellites, VIIRS data from the NPP mission, and also high-resolution optical images from EROS A and radar images from Radarsat-1, including the telemetry for the first year of operation, within a footprint of up to 2,500 kilometers in radius. Creating remote sensing centers at universities will lead to a new quality level for education and scientific studies and will make the education system in such innovative institutions open to modern research work and the economy.

  9. Method of composing two-dimensional scanned spectra observed by the New Vacuum Solar Telescope

    NASA Astrophysics Data System (ADS)

    Cai, Yun-Fang; Xu, Zhi; Chen, Yu-Chao; Xu, Jun; Li, Zheng-Gang; Fu, Yu; Ji, Kai-Fan

    2018-04-01

    In this paper we illustrate the technique used by the New Vacuum Solar Telescope (NVST) to increase the spatial resolution of two-dimensional (2D) solar spectroscopy observations involving two dimensions of space and one of wavelength. Without an image stabilizer at the NVST, large scale wobble motion is present during the spatial scanning, whose instantaneous amplitude can reach 1.3″ due to the Earth’s atmosphere and the precision of the telescope guiding system, and seriously decreases the spatial resolution of 2D spatial maps composed with scanned spectra. We make the following effort to resolve this problem: the imaging system (e.g., the TiO-band) is used to record and detect the displacement vectors of solar image motion during the raster scan, in both the slit and scanning directions. The spectral data (e.g., the Hα line) which are originally obtained in time sequence are corrected and re-arranged in space according to those displacement vectors. Raster scans are carried out in several active regions with different seeing conditions (two rasters are illustrated in this paper). Given a certain spatial sampling and temporal resolution, the spatial resolution of the composed 2D map could be close to that of the slit-jaw image. The resulting quality after correction is quantitatively evaluated with two methods. A physical quantity, such as the line-of-sight velocities in multiple layers of the solar atmosphere, is also inferred from the re-arranged spectrum, demonstrating the advantage of this technique.

  10. Spatial and spatiotemporal pattern analysis of coconut lethal yellowing in Mozambique.

    PubMed

    Bonnot, F; de Franqueville, H; Lourenço, E

    2010-04-01

    Coconut lethal yellowing (LY) is caused by a phytoplasma and is a major threat for coconut production throughout its growing area. Incidence of LY was monitored visually on every coconut tree in six fields in Mozambique for 34 months. Disease progress curves were plotted and average monthly disease incidence was estimated. Spatial patterns of disease incidence were analyzed at six assessment times. Aggregation was tested by the coefficient of spatial autocorrelation of the beta-binomial distribution of diseased trees in quadrats. The binary power law was used as an assessment of overdispersion across the six fields. Spatial autocorrelation between symptomatic trees was measured by the BB join count statistic based on the number of pairs of diseased trees separated by a specific distance and orientation, and tested using permutation methods. Aggregation of symptomatic trees was detected in every field in both cumulative and new cases. Spatiotemporal patterns were analyzed with two methods. The proximity of symptomatic trees at two assessment times was investigated using the spatiotemporal BB join count statistic based on the number of pairs of trees separated by a specific distance and orientation and exhibiting the first symptoms of LY at the two times. The semivariogram of times of appearance of LY was calculated to characterize how the lag between times of appearance of LY was related to the distance between symptomatic trees. Both statistics were tested using permutation methods. A tendency for new cases to appear in the proximity of previously diseased trees and a spatially structured pattern of times of appearance of LY within clusters of diseased trees were detected, suggesting secondary spread of the disease.
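
    The join-count idea can be sketched on a rectangular planting grid: count adjacent pairs of diseased trees and compare that count with its distribution under random relabelling of the same number of diseased trees. The rook-adjacency neighbourhood and the grid layout below are simplifying assumptions, not the distance-and-orientation classes used in the study.

      import random

      def bb_join_count(status, rows, cols):
          """Number of adjacent (rook neighbourhood) pairs in which both trees are diseased."""
          count = 0
          for r in range(rows):
              for c in range(cols):
                  if not status[r * cols + c]:
                      continue
                  if c + 1 < cols and status[r * cols + c + 1]:
                      count += 1
                  if r + 1 < rows and status[(r + 1) * cols + c]:
                      count += 1
          return count

      def join_count_pvalue(status, rows, cols, n_perm=2000, seed=7):
          """Permutation p-value for aggregation: shuffle disease labels over the same grid."""
          rng = random.Random(seed)
          observed = bb_join_count(status, rows, cols)
          labels = list(status)
          hits = 0
          for _ in range(n_perm):
              rng.shuffle(labels)
              if bb_join_count(labels, rows, cols) >= observed:
                  hits += 1
          return (hits + 1) / (n_perm + 1)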

  11. Availability of feature-oriented scanning probe microscopy for remote-controlled measurements on board a space laboratory or planet exploration Rover.

    PubMed

    Lapshin, Rostislav V

    2009-06-01

    Prospects for a feature-oriented scanning (FOS) approach to investigations of sample surfaces, at the micrometer and nanometer scales, with the use of scanning probe microscopy under space laboratory or planet exploration rover conditions, are examined. The problems discussed include decreasing sensitivity of the onboard scanning probe microscope (SPM) to temperature variations, providing autonomous operation, implementing the capabilities for remote control, self-checking, self-adjustment, and self-calibration. A number of topical problems of SPM measurements in outer space or on board a planet exploration rover may be solved via the application of recently proposed FOS methods.

  12. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic.

    PubMed

    Xie, Kun; Fox, Grace E; Liu, Jun; Lyu, Cheng; Lee, Jason C; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z

    2016-01-01

    There is considerable scientific interest in understanding how cell assemblies-the long-presumed computational motif-are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic (N = 2^i - 1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors-the synaptic switch for learning and memory-were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques-which preferentially encode specific and low-combinatorial features and project inter-cortically-is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6-which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems-is ideal for feedback-control of motivation, emotion, consciousness and behaviors. These observations suggest that the brain's basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex.
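
    The counting rule behind the abstract's N = 2^i - 1 expression is simply the number of non-empty combinations of i distinct inputs; the short sketch below enumerates those specific-to-general input patterns. It illustrates the arithmetic only, not any neural data analysis from the paper.

      from itertools import combinations

      def clique_count(i):
          """Number of non-empty input combinations for i distinct inputs."""
          return 2 ** i - 1

      def clique_patterns(inputs):
          """Enumerate every non-empty combination, from single inputs to the full set."""
          for r in range(1, len(inputs) + 1):
              for combo in combinations(inputs, r):
                  yield combo

      inputs = ["A", "B", "C"]
      patterns = list(clique_patterns(inputs))
      print(clique_count(len(inputs)), len(patterns))   # both print 7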

  13. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic

    PubMed Central

    Xie, Kun; Fox, Grace E.; Liu, Jun; Lyu, Cheng; Lee, Jason C.; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z.

    2016-01-01

    There is considerable scientific interest in understanding how cell assemblies—the long-presumed computational motif—are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic (N = 2^i - 1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors—the synaptic switch for learning and memory—were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques—which preferentially encode specific and low-combinatorial features and project inter-cortically—is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6—which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems—is ideal for feedback-control of motivation, emotion, consciousness and behaviors. These observations suggest that the brain’s basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex. PMID:27895562

  14. Magnetic flux density reconstruction using interleaved partial Fourier acquisitions in MREIT.

    PubMed

    Park, Hee Myung; Nam, Hyun Soo; Kwon, Oh In

    2011-04-07

    Magnetic resonance electrical impedance tomography (MREIT) has been introduced as a non-invasive modality to visualize the internal conductivity and/or current density of an electrically conductive object by the injection of current. In order to measure a magnetic flux density signal in MREIT, the phase difference approach in an interleaved encoding scheme cancels the systematic artifacts accumulated in phase signals and also reduces the random noise effect. However, it is important to reduce scan duration maintaining spatial resolution and sufficient contrast, in order to allow for practical in vivo implementation of MREIT. The purpose of this paper is to develop a coupled partial Fourier strategy in the interleaved sampling in order to reduce the total imaging time for an MREIT acquisition, whilst maintaining an SNR of the measured magnetic flux density comparable to what is achieved with complete k-space data. The proposed method uses two key steps: one is to update the magnetic flux density by updating the complex densities using the partially interleaved k-space data and the other is to fill in the missing k-space data iteratively using the updated background field inhomogeneity and magnetic flux density data. Results from numerical simulations and animal experiments demonstrate that the proposed method reduces considerably the scanning time and provides resolution of the recovered B(z) comparable to what is obtained from complete k-space data.

  15. The simple perfection of quantum correlation in human vision.

    PubMed

    Bouman, Maarten A

    2006-01-01

    A theory is presented that specifies the amount of light that is needed for the perception of any stimulus that is defined in space, time and color. For detection and discrimination mechanistic neural elements with deterministic procedures exist. Twin pairs of red and green cones are ordered in three sets along clockwise and counter clockwise revolving spirals and along circles around the center of the fovea. In the rod-free fovea the red pairs are ordered along the spirals and the green along the circles. Each cone is accompanied by--dependent on retinal eccentricity--up to 100 satellite rods. For the retinal signal processing such a receptor group constitutes a space-quantum in analogy with time-quanta of about 0.04 s. In the peripheral retina the red and green twin pairs of space-quanta are roughly ordered along and at random distributed over the spirals and circles. Over each time-quantum, the cone and rods of a space-quantum sum their responses in a common nerve circuit of the luminosity channel. The summation's results from twin pairs of the same set of space-quanta are correlated by two-fold spatio-temporal coincidence mechanisms in the retina. Their outcome signals the perception of light, movement and edge. In the fused binocular visual field the movement and edge signals of the three sets from both eyes perfectly join vectorially together, provided the responding pairs of space-quanta are binocularly in perfect register as they normally are. The receptor's Weber gain control makes the receptor an all-or-none-system. The space-quantum's De Vries gain control makes its sensitivity equal to the average of the poisson fluctuations in quantum absorption per time-quantum. The controls are based on, respectively, arithmetically feed forward and backward inhibitive nerve mechanisms. The thermal noise of the photo-pigment resets the controls. The response to the second quantum absorption in a time-quantum in the individual rod, red or green cone has accession to the white, red or green nerve color circuit, respectively, and produces there a corresponding color signal. Already a single absorption in a blue cone is for a blue signal. In the retina, for the generation of yellow signals, the color circuits of individual red and green cones of each mixed entwined triple of red and green twin pairs of space-quanta are cross-connected through a nerve opponent color circuit. In the lateral geniculate nucleus in groups of seven neighboring triples, through two nerve opponent color circuits that are common for the two eyes together, the red and green signals as well as the yellow and blue mutually annihilate each other's color. White signals remain. In anomalous trichromacy, the space-quanta of some pairs have different cones or in one of them the cone is missing. In dichromacy, all pairs have different cones or one type of cones is missing. For perceptive resolution the periodic scanning of the retinal image by the eye tremor in synchrony with the time-quanta, overrules the limit of optical resolution as set by diffraction in the eye optics. Dependent on pupil diameter the scanning contributes up to a factor of about 30 to resolution. The action potentials of the Purkinje cells in the myocardium generate the time-quanta of the central nervous system as well as the mechanical scanning of the retinal image through the synchronic periodic variation of the tonus in the eye muscles.

  16. Crossbar Switches For Optical Data-Communication Networks

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P.

    1994-01-01

    Optoelectronic and electro-optical crossbar switches called "permutation engines" (PE's) developed to route packets of data through fiber-optic communication networks. Basic network concept described in "High-Speed Optical Wide-Area Data-Communication Network" (NPO-18983). Nonblocking operation achieved by decentralized switching and control scheme. Each packet routed up or down in each column of this 5-input/5-output permutation engine. Routing algorithm ensures each packet arrives at its designated output port without blocking any other packet that does not contend for same output port.
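
    A minimal sketch of the column-by-column "route up or down" idea is given below, using an odd-even transposition scheme on the packets' destination ports; this is an illustrative stand-in, not the decentralized algorithm of the actual permutation engine (NPO-18983).

```python
# Toy 5x5 permutation-network router: packets carry only their destination
# port, and each column performs local compare-and-exchange ("route up or
# down") decisions on adjacent rows. This odd-even transposition scheme is
# shown only to illustrate nonblocking column-by-column routing; it is not
# necessarily the algorithm used in the actual permutation engine.

def route(destinations, n_ports=5):
    """destinations[i] = output port requested by the packet at input i."""
    rows = list(destinations)
    for col in range(n_ports):                  # one exchange stage per column
        start = col % 2                         # alternate odd/even pairings
        for i in range(start, n_ports - 1, 2):
            if rows[i] > rows[i + 1]:           # local decision: exchange the pair
                rows[i], rows[i + 1] = rows[i + 1], rows[i]
    return rows                                 # rows[j] == j when routing succeeded

if __name__ == "__main__":
    perm = [3, 0, 4, 1, 2]                      # any permutation of output ports
    print(route(perm))                          # -> [0, 1, 2, 3, 4]: every packet
                                                #    reaches its port, none blocked
```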

  17. Security of the Five-Round KASUMI Type Permutation

    NASA Astrophysics Data System (ADS)

    Iwata, Tetsu; Yagi, Tohru; Kurosawa, Kaoru

    KASUMI is a blockcipher that forms the heart of the 3GPP confidentiality and integrity algorithms. In this paper, we study the security of the five-round KASUMI type permutations, and derive a highly non-trivial security bound against adversaries with adaptive chosen plaintext and chosen ciphertext attacks. To derive our security bound, we make heavy use of tools from graph theory. Although the result does not establish super-pseudorandomness, it gives strong evidence that the design of KASUMI is sound.

  18. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation

    NASA Astrophysics Data System (ADS)

    Kim, Sangroh; Yoshizumi, Terry T.; Yin, Fang-Fang; Chetty, Indrin J.

    2013-04-01

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of spiral CT scan—scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the ‘ISource = 8: Phase-Space Source Incident from Multiple Directions’ in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate the spiral CT scanning in a single simulation run accurately. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.
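
    The table-movement idea described above reduces to advancing the isocenter coordinate with the cumulative gantry angle. A minimal sketch follows, assuming that the translation per rotation equals pitch times nominal collimation; the function and parameter names are illustrative and are not the actual DOSXYZnrc inputs.

```python
# Sketch of spiral-CT table motion for a phase-space source: the isocenter
# z-coordinate is advanced as a function of the cumulative gantry angle.
# Assumes translation-per-rotation = pitch * collimation; variable names are
# illustrative and do not correspond to the actual DOSXYZnrc inputs.

def isocenter_z(theta_deg, z_start, pitch, collimation_cm, direction=+1):
    """z position of the isocenter after the tube has swept theta_deg degrees."""
    translation_per_rotation = pitch * collimation_cm      # cm per 360 degrees
    return z_start + direction * translation_per_rotation * theta_deg / 360.0

# Example: pitch 0.9375, 2.4 cm collimation, 3 full rotations sampled every 10 deg
angles = range(0, 3 * 360 + 1, 10)
zs = [isocenter_z(a, z_start=0.0, pitch=0.9375, collimation_cm=2.4) for a in angles]
print(f"scan range covered: {zs[0]:.2f} cm to {zs[-1]:.2f} cm")   # 0.00 cm to 6.75 cm
```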

  19. Spatiotemporal image correlation spectroscopy (STICS) theory, verification, and application to protein velocity mapping in living CHO cells.

    PubMed

    Hebert, Benedict; Costantino, Santiago; Wiseman, Paul W

    2005-05-01

    We introduce a new extension of image correlation spectroscopy (ICS) and image cross-correlation spectroscopy (ICCS) that relies on complete analysis of both the temporal and spatial correlation lags for intensity fluctuations from a laser-scanning microscopy image series. This new approach allows measurement of both diffusion coefficients and velocity vectors (magnitude and direction) for fluorescently labeled membrane proteins in living cells through monitoring of the time evolution of the full space-time correlation function. By using filtering in Fourier space to remove frequencies associated with immobile components, we are able to measure the protein transport even in the presence of a large fraction (>90%) of immobile species. We present the background theory, computer simulations, and analysis of measurements on fluorescent microspheres to demonstrate proof of principle, capabilities, and limitations of the method. We demonstrate mapping of flow vectors for mixed samples containing fluorescent microspheres with different emission wavelengths using space-time image cross-correlation. We also present results from two-photon laser-scanning microscopy studies of alpha-actinin/enhanced green fluorescent protein fusion constructs at the basal membrane of living CHO cells. Using space-time image correlation spectroscopy (STICS), we are able to measure protein fluxes with magnitudes of μm/min from retracting lamellar regions and protrusions for adherent cells. We also demonstrate the measurement of correlated directed flows (magnitudes of μm/min) and diffusion of interacting alpha5 integrin/enhanced cyan fluorescent protein and alpha-actinin/enhanced yellow fluorescent protein within living CHO cells. The STICS method permits us to generate complete transport maps of proteins within subregions of the basal membrane even if the protein concentration is too high to perform single particle tracking measurements.
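
    A compact sketch of the core STICS computation is given below: an FFT-based space-time correlation of an image series, with immobile structures suppressed by subtracting each pixel's temporal mean (a simple stand-in for the Fourier-space frequency filtering described above), and the flow read from the drift of the correlation peak. Data and parameters are synthetic.

```python
import numpy as np

# Minimal STICS-style sketch: space-time correlation of an image series via FFTs,
# with immobile structures suppressed by subtracting each pixel's temporal mean
# (a crude stand-in for the Fourier-space frequency filtering described above).

def stics_peak_shift(stack, tau):
    """stack: (T, Ny, Nx) image series; returns correlation-peak shift (dy, dx) at lag tau."""
    mobile = stack - stack.mean(axis=0, keepdims=True)   # remove immobile component
    T, Ny, Nx = mobile.shape
    corr = np.zeros((Ny, Nx))
    for t in range(T - tau):
        a, b = mobile[t], mobile[t + tau]
        corr += np.real(np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)))
    corr = np.fft.fftshift(corr / (T - tau))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    return dy - Ny // 2, dx - Nx // 2                    # pixel displacement at lag tau

# Synthetic test: a particle flowing 1 px/frame in +x on top of an immobile blob
rng = np.random.default_rng(0)
T, N = 40, 64
yy, xx = np.mgrid[0:N, 0:N]
immobile = np.exp(-((xx - 20) ** 2 + (yy - 20) ** 2) / 50.0) * 100
stack = np.empty((T, N, N))
for t in range(T):
    moving = np.exp(-((xx - (10 + t)) ** 2 + (yy - 40) ** 2) / 20.0) * 50
    stack[t] = immobile + moving + rng.normal(0, 1, (N, N))
print(stics_peak_shift(stack, tau=5))   # expected near (0, 5): ~1 px/frame flow in +x
```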

  20. Towards topological quantum computer

    NASA Astrophysics Data System (ADS)

    Melnikov, D.; Mironov, A.; Mironov, S.; Morozov, A.; Morozov, An.

    2018-01-01

    Quantum R-matrices, the entangling deformations of non-entangling (classical) permutations, provide a distinguished basis in the space of unitary evolutions and, consequently, a natural choice for a minimal set of basic operations (universal gates) for quantum computation. At the same time, they play a special role in group theory, integrable systems and the modern theory of non-perturbative calculations in quantum field and string theory. Despite recent developments in those fields, the idea of topological quantum computing, and the use of R-matrices in particular, practically reduces to a reinterpretation of standard sets of quantum gates, and subsequently algorithms, in terms of available topological ones. In this paper we summarize a modern view on quantum R-matrix calculus and propose to look at the R-matrices acting in the space of irreducible representations, which are unitary for the real-valued couplings in Chern-Simons theory, as the fundamental set of universal gates for a topological quantum computer. Such an approach calls for a more thorough investigation of the relation between topological invariants of knots and quantum algorithms.
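
    The contrast between classical permutations and their entangling deformations can be made concrete with a short numerical check of the constant Yang-Baxter (braid) relation. The entangling matrix used below, exp(iπ/4·X⊗Y), is one concrete solution (the code verifies the relation numerically) and is not meant to represent the representation-theoretic R-matrices discussed in the paper.

```python
import numpy as np

# Check the braid (constant Yang-Baxter) relation
#   (R x I)(I x R)(R x I) = (I x R)(R x I)(I x R)
# for (i) the classical permutation SWAP and (ii) an entangling deformation of it.
# The deformation used here, R = exp(i*pi/4 * X (x) Y) = (I + i X(x)Y)/sqrt(2),
# is one concrete example of an entangling solution; it is illustrative only.

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

SWAP = np.eye(4)[[0, 2, 1, 3]]                       # classical (non-entangling) permutation
R = (np.eye(4) + 1j * np.kron(X, Y)) / np.sqrt(2)    # entangling deformation

def braid_holds(R4):
    R12 = np.kron(R4, I2)                            # acts on qubits 1,2 of a 3-qubit space
    R23 = np.kron(I2, R4)                            # acts on qubits 2,3
    return np.allclose(R12 @ R23 @ R12, R23 @ R12 @ R23)

print("SWAP satisfies braid relation:", braid_holds(SWAP))   # True
print("R    satisfies braid relation:", braid_holds(R))      # True
print("R unitary:", np.allclose(R.conj().T @ R, np.eye(4)))  # True

# R is entangling: acting on |00> it produces the Bell-like state (|00> - |11>)/sqrt(2)
state = R @ np.array([1, 0, 0, 0], dtype=complex)
M = state.reshape(2, 2)
rho_a = np.einsum('ij,kj->ik', M, M.conj())                   # reduced state of qubit 1
print("purity of reduced state:", np.real(np.trace(rho_a @ rho_a)))  # 0.5 -> maximally entangled
```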

  1. Realizability of a model in infinite statistics

    NASA Astrophysics Data System (ADS)

    Zagier, Don

    1992-06-01

    Following Greenberg and others, we study a space with a collection of operators a(k) satisfying the “q-mutator relations” a(l)a†(k) − q a†(k)a(l) = δ_{k,l} (corresponding for q = ±1 to classical Bose and Fermi statistics). We show that the n!×n! matrix A_n(q) representing the scalar products of n-particle states is positive definite for all n if q lies between -1 and +1, so that the commutation relations have a Hilbert space representation in this case (this has also been proved by Fivel and by Bozejko and Speicher). We also give an explicit factorization of A_n(q) as a product of matrices of the form (1 − q^j T)^{±1} with 1 ≤ j ≤ n and T a permutation matrix. In particular, A_n(q) is singular if and only if q^M = 1 for some integer M of the form k^2 − k, 2 ≤ k ≤ n.
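
    A quick numerical illustration of the positivity statement is sketched below, assuming the standard fact that the Gram matrix of the n-particle states has entries A_n(q)[σ,τ] = q^inv(σ∘τ⁻¹), where inv counts inversions; it checks positive definiteness for sample values of q in (-1, 1) and the degeneration at q = ±1 (where q^(k²−k) = 1 with k = 2).

```python
import numpy as np
from itertools import permutations

# Numerical check of the positivity statement above, assuming the standard fact that
# the Gram matrix of the n-particle states is A_n(q)[sigma, tau] = q**inv(sigma o tau^-1),
# where inv counts the inversions of a permutation.

def inversions(p):
    return sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))

def A(n, q):
    perms = list(permutations(range(n)))
    M = np.empty((len(perms), len(perms)))
    for a, sa in enumerate(perms):
        inv_sa = tuple(np.argsort(sa))                    # sa^-1
        for b, sb in enumerate(perms):
            comp = tuple(sb[i] for i in inv_sa)           # sb o sa^-1
            M[a, b] = q ** inversions(comp)
    return M

n = 3
for q in (-0.9, -0.5, 0.0, 0.5, 0.9):
    print(q, "min eigenvalue:", np.linalg.eigvalsh(A(n, q)).min())   # > 0 for |q| < 1
for q in (-1.0, 1.0):                                     # q**(k*k - k) = 1 with k = 2
    print(q, "determinant:", np.linalg.det(A(n, q)))                 # ~ 0: singular
```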

  2. An ensemble of SVM classifiers based on gene pairs.

    PubMed

    Tong, Muchenxuan; Liu, Kun-Hong; Xu, Chungui; Ju, Wenbin

    2013-07-01

    In this paper, a genetic algorithm (GA) based ensemble support vector machine (SVM) classifier built on gene pairs (GA-ESP) is proposed. The SVMs (base classifiers of the ensemble system) are trained on different informative gene pairs. These gene pairs are selected by the top scoring pair (TSP) criterion. Each of these pairs projects the original microarray expression onto a 2-D space. Extensive permutation of gene pairs may reveal more useful information and potentially lead to an ensemble classifier with satisfactory accuracy and interpretability. GA is further applied to select an optimized combination of base classifiers. The effectiveness of the GA-ESP classifier is evaluated on both binary-class and multi-class datasets. Copyright © 2013 Elsevier Ltd. All rights reserved.
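
    A compact sketch of the two main ingredients, TSP scoring of gene pairs and a majority-vote ensemble of SVMs trained on the selected 2-D projections, is given below using scikit-learn and synthetic data; the GA stage that optimizes the combination of base classifiers is omitted, and all names and values are illustrative.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVC

# Sketch of the two core ingredients described above: (1) top-scoring-pair (TSP)
# ranking of gene pairs, (2) an ensemble of SVMs, each trained on one 2-D gene-pair
# projection, combined by majority vote. The GA that selects the final subset of
# base classifiers is omitted. Data are synthetic; gene indices are arbitrary.

rng = np.random.default_rng(1)
n_samples, n_genes = 60, 30
X = rng.normal(size=(n_samples, n_genes))
y = rng.integers(0, 2, n_samples)
X[y == 1, 0] += 1.5                      # make genes 0 and 1 weakly informative
X[y == 1, 1] -= 1.5

def tsp_score(X, y, i, j):
    """|P(x_i < x_j | class 0) - P(x_i < x_j | class 1)| -- the TSP criterion."""
    p0 = np.mean(X[y == 0, i] < X[y == 0, j])
    p1 = np.mean(X[y == 1, i] < X[y == 1, j])
    return abs(p0 - p1)

pairs = sorted(combinations(range(n_genes), 2),
               key=lambda ij: tsp_score(X, y, *ij), reverse=True)[:5]

classifiers = [(i, j, SVC(kernel="rbf").fit(X[:, [i, j]], y)) for i, j in pairs]

def ensemble_predict(Xnew):
    votes = np.stack([clf.predict(Xnew[:, [i, j]]) for i, j, clf in classifiers])
    return (votes.mean(axis=0) >= 0.5).astype(int)       # majority vote

print("training accuracy of the 5-pair ensemble:", np.mean(ensemble_predict(X) == y))
```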

  3. Accuracy of parameter estimates for closely spaced optical targets using multiple detectors

    NASA Astrophysics Data System (ADS)

    Dunn, K. P.

    1981-10-01

    In order to obtain the cross-scan position of an optical target, more than one scanning detector is used. As expected, the cross-scan position estimation performance degrades when two nearby optical targets interfere with each other. Theoretical bounds on the two-dimensional parameter estimation performance for two closely spaced optical targets are found. Two particular classes of scanning detector arrays, namely, the crow's foot and the brickwall (or mosaic) patterns, are considered.

  4. Effect of parameters on picosecond laser ablation of Cr12MoV cold work mold steel

    NASA Astrophysics Data System (ADS)

    Wu, Baoye; Liu, Peng; Zhang, Fei; Duan, Jun; Wang, Xizhao; Zeng, Xiaoyan

    2018-01-01

    Cr12MoV cold work mold steel, which is a difficult-to-machine material, is widely used in the mold and die industry. A picosecond pulse Nd:YVO4 laser at 1064 nm was used to conduct the study. The effects of operating parameters (i.e., laser fluence, scanning speed, hatched space and number of scans) on the ablation depth and quality of Cr12MoV were studied at a repetition rate of 20 MHz. The experimental results reveal that all four parameters affect the ablation depth significantly, while the surface roughness depends mainly on laser fluence or scanning speed and only secondarily on hatched space or number of scans. For laser fluence and scanning speed, three distinct surface morphologies were observed, with a transition from flat (Ra < 1.40 μm) to bumpy (Ra = 1.40–2.40 μm) and eventually to rough (Ra > 2.40 μm). However, for hatched space and number of scans, there is only a small bumpy and rough zone, or even no rough zone. Mechanisms including heat accumulation, plasma shielding and combustion reaction effects are proposed based on the ablation depth and processing morphology. By appropriate management of the laser fluence and scanning speed, a high ablation depth with low surface roughness can be obtained at small hatched space and a high number of scans.

  5. Space Communications and Navigation (SCaN) Integrated Network Architecture Definition Document (ADD). Volume 1; Executive Summary; Revision 1

    NASA Technical Reports Server (NTRS)

    Younes, Badri A.; Schier, James S.

    2010-01-01

    The SCaN Program has defined an integrated network architecture that fully meets the Administrator's mandate to the Program, and will result in a NASA infrastructure capable of providing the needed and enabling communications services to future space missions. The integrated network architecture will increase SCaN operational efficiency and interoperability through standardization, commonality and technology infusion. It will enable NASA missions requiring advanced communication and tracking capabilities such as: a. Optical communication b. Antenna arraying c. Lunar and Mars Relays d. Integrated network management (service management and network control) and integrated service execution e. Enhanced tracking for navigation f. Space internetworking with DTN and IP g. End-to-end security h. Enhanced security services. Moreover, the SCaN Program has created an Integrated Network Roadmap that depicts an orchestrated and coherent evolution path toward the target architecture, encompassing all aspects that concern network assets (i.e., operations and maintenance, sustaining engineering, upgrade efforts, and major development). This roadmap identifies major NASA ADPs, and shows dependencies and drivers among the various planned undertakings and timelines. The roadmap is scalable to accommodate timely adjustments in response to Agency needs, goals, objectives and funding. Future challenges to implementing this architecture include balancing user mission needs, technology development, and the availability of funding within NASA's priorities. Strategies for addressing these challenges are to: define a flexible architecture, update the architecture periodically, use ADPs to evaluate options and determine when to make decisions, and to engage the stakeholders in these evaluations. In addition, the SCaN Program will evaluate and respond to mission need dates for technical and operational capabilities to be provided by the SCaN integrated network. In that regard, the architecture defined in this ADD is scalable to accommodate programmatic and technical changes.

  6. Practice and Learning: Spatiotemporal Differences in Thalamo-Cortical-Cerebellar Networks Engagement across Learning Phases in Schizophrenia.

    PubMed

    Korostil, Michele; Remington, Gary; McIntosh, Anthony Randal

    2016-01-01

    Understanding how practice mediates the transition of brain-behavior networks between early and later stages of learning is constrained by the common approach to analysis of fMRI data. Prior imaging studies have mostly relied on a single scan, and parametric, task-related analyses. Our experiment incorporates a multisession fMRI lexicon-learning experiment with multivariate, whole-brain analysis to further knowledge of the distributed networks supporting practice-related learning in schizophrenia (SZ). Participants with SZ were compared with healthy control (HC) participants as they learned a novel lexicon during two fMRI scans over a several-day period. All participants were trained to equal task proficiency prior to scanning. Behavioral-Partial Least Squares, a multivariate analytic approach, was used to analyze the imaging data. Permutation testing was used to determine statistical significance and bootstrap resampling to determine the reliability of the findings. With practice, HC participants transitioned to a brain-accuracy network incorporating dorsostriatal regions in late-learning stages. The SZ participants did not transition to this pattern despite comparable behavioral results. Instead, successful learners with SZ were differentiated primarily on the basis of greater engagement of perceptual and perceptual-integration brain regions. There is a different spatiotemporal unfolding of brain-learning relationships in SZ. In SZ, given the same amount of practice, the movement from networks suggestive of effortful learning toward a subcortically driven procedural network differs from that seen in HC participants. Learning performance in SZ is driven by varying levels of engagement in perceptual regions, which suggests perception itself is impaired and may impact downstream, "higher level" cognition.

  7. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    PubMed

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.
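
    The supported/unsupported distinction can be illustrated on a toy instance, as sketched below: all permutations of a small synthetic dissimilarity matrix are enumerated under two illustrative criteria, the Pareto efficient set is extracted, and a weighted-sum sweep is shown to recover only part of it. The objectives and data are not those of the article.

```python
import numpy as np
from itertools import permutations

# Toy illustration of the supported vs. unsupported distinction: enumerate all
# permutations of a small symmetric dissimilarity matrix under two illustrative
# criteria, extract the Pareto efficient set, and see which efficient points a
# weighted-sum sweep recovers (only the "supported" ones). Objectives and data are
# synthetic, not those used in the article.

rng = np.random.default_rng(3)
n = 6
D = np.round(rng.uniform(1, 9, (n, n)), 1)
D = np.triu(D, 1) + np.triu(D, 1).T                       # symmetric dissimilarities

def objectives(perm):
    P = D[np.ix_(perm, perm)]
    ar_violations = sum(P[i, j] > P[i, k]                  # anti-Robinson row violations
                        for i in range(n) for j in range(i + 1, n) for k in range(j + 1, n))
    ls_fit = sum((abs(i - j) - P[i, j]) ** 2               # unidimensional-scaling misfit
                 for i in range(n) for j in range(i + 1, n))
    return ar_violations, ls_fit

points = {perm: objectives(perm) for perm in permutations(range(n))}

def dominates(a, b):
    return a[0] <= b[0] and a[1] <= b[1] and a != b

pareto = {f for f in set(points.values())
          if not any(dominates(g, f) for g in points.values())}

supported = set()
for w in np.linspace(0, 1, 101):                           # weighted-sum sweep
    supported.add(min(points.values(), key=lambda f: w * f[0] + (1 - w) * f[1]))

print("Pareto efficient objective vectors :", len(pareto))
print("found by weighted sums (supported) :", len(supported & pareto))
```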

  8. Permutation coding technique for image recognition systems.

    PubMed

    Kussul, Ernst M; Baidyk, Tatiana N; Wunsch, Donald C; Makeyev, Oleksandr; Martín, Anabel

    2006-11-01

    A feature extractor and neural classifier for image recognition systems are proposed. The proposed feature extractor is based on the concept of random local descriptors (RLDs). It is followed by an encoder based on the permutation coding technique, which makes it possible to take into account not only the detected features but also the position of each feature in the image, and to make the recognition process invariant to small displacements. The combination of RLDs and permutation coding permits us to obtain a sufficiently general description of the image to be recognized. The code generated by the encoder is used as input data for the neural classifier. Different types of images were used to test the proposed image recognition system. It was tested on the handwritten digit recognition problem, the face recognition problem, and the microobject shape recognition problem. The results of testing are very promising. The error rate for the Modified National Institute of Standards and Technology (MNIST) database is 0.44% and for the Olivetti Research Laboratory (ORL) database it is 0.1%.
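
    A toy sketch of the position-encoding step of permutation coding follows: each feature owns a sparse binary code, and its coordinates are encoded by repeatedly applying fixed random permutations before the codes are OR-ed into an image code. The actual scheme re-permutes only part of the vector per step so that small displacements give overlapping codes; that refinement is omitted here, so the sketch is illustrative only.

```python
import numpy as np

# Toy sketch of permutation coding: each detected local feature owns a sparse random
# binary code vector, and the feature's position is encoded by applying fixed random
# permutations Px, Py to that vector x and y times respectively; the image code is
# the element-wise OR of all contributions. (The original scheme re-permutes only
# part of the vector per step to tolerate small displacements; omitted here.)

rng = np.random.default_rng(7)
N = 4096                                   # code length
Px, Py = rng.permutation(N), rng.permutation(N)
base_codes = {f: (rng.random(N) < 0.02) for f in ("edge_h", "edge_v", "corner")}

def positioned_code(feature, x, y):
    code = base_codes[feature]
    for _ in range(x):
        code = code[Px]                    # apply the X permutation x times
    for _ in range(y):
        code = code[Py]                    # apply the Y permutation y times
    return code

def image_code(features):
    """features: list of (name, x, y) detected in the image."""
    code = np.zeros(N, dtype=bool)
    for f, x, y in features:
        code |= positioned_code(f, x, y)
    return code

a = image_code([("edge_h", 3, 5), ("corner", 10, 2)])
b = image_code([("edge_h", 3, 5), ("corner", 10, 2)])       # same features, same places
c = image_code([("edge_h", 9, 1), ("corner", 4, 12)])       # same features, elsewhere
overlap = lambda u, v: (u & v).sum() / (u | v).sum()
print("identical layout :", overlap(a, b))                   # 1.0
print("displaced layout :", round(overlap(a, c), 3))         # near 0: position is encoded
```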

  9. Rank-based permutation approaches for non-parametric factorial designs.

    PubMed

    Umlauft, Maria; Konietschke, Frank; Pauly, Markus

    2017-11-01

    Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this set-up Wald-type statistics and ANOVA-type statistics are the current state of the art. The first method is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample size, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies foster these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
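
    The permutation principle itself is easy to sketch for the simplest one-way case, as below: a rank statistic (here the Kruskal-Wallis statistic) is computed on the observed grouping and then on random relabellings of the observations to give a permutation p-value; the Wald- and ANOVA-type statistics for general factorial designs discussed in the paper are more involved.

```python
import numpy as np
from scipy.stats import kruskal

# Minimal sketch of the permutation principle on a rank statistic: compute the
# Kruskal-Wallis statistic on the observed grouping, then recompute it under random
# reshufflings of the group labels to obtain a permutation p-value. This is the
# simplest one-way special case, not the general factorial methodology of the paper.

rng = np.random.default_rng(42)
groups = [rng.normal(0.0, 1, 15), rng.normal(0.6, 1, 15), rng.normal(1.0, 1, 15)]
labels = np.repeat([0, 1, 2], 15)
pooled = np.concatenate(groups)

obs_stat, _ = kruskal(*groups)

n_perm, exceed = 5000, 0
for _ in range(n_perm):
    shuffled = rng.permutation(labels)
    stat, _ = kruskal(*(pooled[shuffled == g] for g in (0, 1, 2)))
    exceed += stat >= obs_stat

print(f"KW statistic = {obs_stat:.2f}, permutation p-value = {(exceed + 1) / (n_perm + 1):.4f}")
```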

  10. A hybrid fault diagnosis approach based on mixed-domain state features for rotating machinery.

    PubMed

    Xue, Xiaoming; Zhou, Jianzhong

    2017-01-01

    To further improve diagnosis accuracy and efficiency, a hybrid fault diagnosis approach based on mixed-domain state features, which systematically blends statistical analysis and artificial intelligence techniques, is proposed in this work for rolling element bearings. To simplify the fault diagnosis problem, the execution of the proposed method is divided into three steps, i.e., preliminary fault detection, fault type recognition and fault degree identification. In the first step, a preliminary judgment about the health status of the equipment can be made by a statistical analysis method based on permutation entropy theory. If a fault exists, the following two processes based on the artificial intelligence approach are performed to further recognize the fault type and then identify the fault degree. For the two subsequent steps, mixed-domain state features containing time-domain, frequency-domain and multi-scale features are extracted to represent the fault characteristics under different working conditions. As a powerful time-frequency analysis method, the fast EEMD method was employed to obtain the multi-scale features. Furthermore, due to information redundancy and the submergence of the original feature space, a novel manifold learning method (modified LGPCA) is introduced to realize low-dimensional representations of the high-dimensional feature space. Finally, two cases, each with 12 working conditions, were employed to evaluate the performance of the proposed method, with vibration signals measured from an experimental bench of rolling element bearings. The analysis results showed the effectiveness and superiority of the proposed method, whose diagnostic scheme is more suitable for practical application. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Adaptive CT scanning system

    DOEpatents

    Sampayan, Stephen E.

    2016-11-22

    Apparatus, systems, and methods that provide an X-ray interrogation system having a plurality of stationary X-ray point sources arranged to substantially encircle an area or space to be interrogated. A plurality of stationary detectors are arranged to substantially encircle the area or space to be interrogated. A controller is adapted to control the stationary X-ray point sources to emit X-rays one at a time, and to control the stationary detectors to detect the X-rays emitted by the stationary X-ray point sources.

  12. Structure-based Design of Cyclically Permuted HIV-1 gp120 Trimers That Elicit Neutralizing Antibodies*

    PubMed Central

    Kesavardhana, Sannula; Das, Raksha; Citron, Michael; Datta, Rohini; Ecto, Linda; Srilatha, Nonavinakere Seetharam; DiStefano, Daniel; Swoyer, Ryan; Joyce, Joseph G.; Dutta, Somnath; LaBranche, Celia C.; Montefiori, David C.; Flynn, Jessica A.; Varadarajan, Raghavan

    2017-01-01

    A major goal for HIV-1 vaccine development is an ability to elicit strong and durable broadly neutralizing antibody (bNAb) responses. The trimeric envelope glycoprotein (Env) spikes on HIV-1 are known to contain multiple epitopes that are susceptible to bNAbs isolated from infected individuals. Nonetheless, all trimeric and monomeric Env immunogens designed to date have failed to elicit such antibodies. We report the structure-guided design of HIV-1 cyclically permuted gp120 that forms homogeneous, stable trimers, and displays enhanced binding to multiple bNAbs, including VRC01, VRC03, VRC-PG04, PGT128, and the quaternary epitope-specific bNAbs PGT145 and PGDM1400. Constructs that were cyclically permuted in the V1 loop region and contained an N-terminal trimerization domain to stabilize V1V2-mediated quaternary interactions, showed the highest homogeneity and the best antigenic characteristics. In guinea pigs, a DNA prime-protein boost regimen with these new gp120 trimer immunogens elicited potent neutralizing antibody responses against highly sensitive Tier 1A isolates and weaker neutralizing antibody responses with an average titer of about 115 against a panel of heterologous Tier 2 isolates. A modest fraction of the Tier 2 virus neutralizing activity appeared to target the CD4 binding site on gp120. These results suggest that cyclically permuted HIV-1 gp120 trimers represent a viable platform in which further modifications may be made to eventually achieve protective bNAb responses. PMID:27879316

  13. Registry in a tube: multiplexed pools of retrievable parts for genetic design space exploration

    PubMed Central

    Woodruff, Lauren B. A.; Gorochowski, Thomas E.; Roehner, Nicholas; Densmore, Douglas; Gordon, D. Benjamin; Nicol, Robert

    2017-01-01

    Genetic designs can consist of dozens of genes and hundreds of genetic parts. After evaluating a design, it is desirable to implement changes without the cost and burden of starting the construction process from scratch. Here, we report a two-step process where a large design space is divided into deep pools of composite parts, from which individuals are retrieved and assembled to build a final construct. The pools are built via multiplexed assembly and sequenced using next-generation sequencing. Each pool consists of ∼20 Mb of up to 5000 unique and sequence-verified composite parts that are barcoded for retrieval by PCR. This approach is applied to a 16-gene nitrogen fixation pathway, which is broken into pools containing a total of 55 848 composite parts (71.0 Mb). The pools encompass an enormous design space (10^43 possible 23 kb constructs), from which an algorithm-guided 192-member 4.5 Mb library is built. Next, all 10^30 possible genetic circuits based on 10 repressors (NOR/NOT gates) are encoded in pools where each repressor is fused to all permutations of input promoters. These demonstrate that multiplexing can be applied to encompass entire design spaces from which individuals can be accessed and evaluated. PMID:28007941

  14. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through the dissimilarity matrix based on modified cross-sample entropy, and three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences among stock indices, caused by country or region and by different financial policies, can be reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.

  15. Brain system for mental orientation in space, time, and person.

    PubMed

    Peer, Michael; Salomon, Roy; Goldberg, Ilan; Blanke, Olaf; Arzy, Shahar

    2015-09-01

    Orientation is a fundamental mental function that processes the relations between the behaving self to space (places), time (events), and person (people). Behavioral and neuroimaging studies have hinted at interrelations between processing of these three domains. To unravel the neurocognitive basis of orientation, we used high-resolution 7T functional MRI as 16 subjects compared their subjective distance to different places, events, or people. Analysis at the individual-subject level revealed cortical activation related to orientation in space, time, and person in a precisely localized set of structures in the precuneus, inferior parietal, and medial frontal cortex. Comparison of orientation domains revealed a consistent order of cortical activity inside the precuneus and inferior parietal lobes, with space orientation activating posterior regions, followed anteriorly by person and then time. Core regions at the precuneus and inferior parietal lobe were activated for multiple orientation domains, suggesting also common processing for orientation across domains. The medial prefrontal cortex showed a posterior activation for time and anterior for person. Finally, the default-mode network, identified in a separate resting-state scan, was active for all orientation domains and overlapped mostly with person-orientation regions. These findings suggest that mental orientation in space, time, and person is managed by a specific brain system with a highly ordered internal organization, closely related to the default-mode network.

  16. Systems Analyze Water Quality in Real Time

    NASA Technical Reports Server (NTRS)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  17. Accelerating acquisition strategies for low-frequency conductivity imaging using MREIT

    NASA Astrophysics Data System (ADS)

    Song, Yizhuang; Seo, Jin Keun; Chauhan, Munish; Indahlastari, Aprinda; Ashok Kumar, Neeta; Sadleir, Rosalind

    2018-02-01

    We sought to improve efficiency of magnetic resonance electrical impedance tomography data acquisition so that fast conductivity changes or electric field variations could be monitored. Undersampling of k-space was used to decrease acquisition times in spin-echo-based sequences by a factor of two. Full MREIT data were reconstructed using continuity assumptions and preliminary scans gathered without current. We found that phase data were reconstructed faithfully from undersampled data. Conductivity reconstructions of phantom data were also possible. Therefore, undersampled k-space methods can potentially be used to accelerate MREIT acquisition. This method could be an advantage in imaging real-time conductivity changes with MREIT.

  18. Multiscale permutation entropy analysis of EEG recordings during sevoflurane anesthesia

    NASA Astrophysics Data System (ADS)

    Li, Duan; Li, Xiaoli; Liang, Zhenhu; Voss, Logan J.; Sleigh, Jamie W.

    2010-08-01

    Electroencephalogram (EEG) monitoring of the effect of anesthetic drugs on the central nervous system has long been used in anesthesia research. Several methods based on nonlinear dynamics, such as permutation entropy (PE), have been proposed to analyze EEG series during anesthesia. However, these measures are still single-scale based and may not completely describe the dynamical characteristics of complex EEG series. In this paper, a novel measure combining multiscale PE information, called CMSPE (composite multi-scale permutation entropy), was proposed for quantifying the anesthetic drug effect on EEG recordings during sevoflurane anesthesia. Three sets of simulated EEG series during awake, light and deep anesthesia were used to select the parameters for the multiscale PE analysis: embedding dimension m, lag τ and scales to be integrated into the CMSPE index. Then, the CMSPE index and raw single-scale PE index were applied to EEG recordings from 18 patients who received sevoflurane anesthesia. Pharmacokinetic/pharmacodynamic (PKPD) modeling was used to relate the measured EEG indices and the anesthetic drug concentration. Prediction probability (Pk) statistics and correlation analysis with the response entropy (RE) index, derived from the spectral entropy (M-entropy module; GE Healthcare, Helsinki, Finland), were investigated to evaluate the effectiveness of the new proposed measure. It was found that raw single-scale PE was blind to subtle transitions between light and deep anesthesia, while the CMSPE index tracked these changes accurately. Around the time of loss of consciousness, CMSPE responded significantly more rapidly than the raw PE, with the absolute slopes of linearly fitted response versus time plots of 0.12 (0.09-0.15) and 0.10 (0.06-0.13), respectively. The prediction probability Pk of 0.86 (0.85-0.88) and 0.85 (0.80-0.86) for CMSPE and raw PE indicated that the CMSPE index correlated well with the underlying anesthetic effect. The correlation coefficient for the comparison between the CMSPE index and RE index of 0.84 (0.80-0.88) was significantly higher than the raw PE index of 0.75 (0.66-0.84). The results show that the CMSPE outperforms the raw single-scale PE in reflecting the sevoflurane drug effect on the central nervous system.
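
    A compact sketch of permutation entropy and one simple composite multiscale variant is given below (coarse-grain at each scale from every offset, average the PE values, then average the chosen scales); the exact way the CMSPE index selects and combines scales in the study may differ, and the test signals are synthetic rather than EEG.

```python
import numpy as np
from math import factorial
from itertools import permutations

# Sketch of permutation entropy (PE) and a simple composite multiscale PE: at each
# scale s the series is coarse-grained (non-overlapping means) starting from every
# possible offset, PE is averaged over the offsets, and the selected scales are then
# averaged into a single index. The combination rule used for the actual CMSPE index
# may differ; m (embedding dimension) and lag are the usual PE parameters.

def permutation_entropy(x, m=3, lag=1):
    patterns = {p: 0 for p in permutations(range(m))}
    for i in range(len(x) - (m - 1) * lag):
        window = x[i:i + m * lag:lag]
        patterns[tuple(np.argsort(window))] += 1
    counts = np.array([c for c in patterns.values() if c > 0], dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(factorial(m))      # normalised to [0, 1]

def composite_multiscale_pe(x, scales=(1, 2, 3, 4, 5), m=3, lag=1):
    values = []
    for s in scales:
        grains = [np.array([x[o + k * s:o + (k + 1) * s].mean()
                            for k in range((len(x) - o) // s)])
                  for o in range(s)]                           # all coarse-graining offsets
        values.append(np.mean([permutation_entropy(g, m, lag) for g in grains]))
    return float(np.mean(values))

rng = np.random.default_rng(0)
noise = rng.normal(size=4000)                                  # irregular, "awake-like"
slow = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.05 * rng.normal(size=4000)  # slow, regular
print("CMSPE noise :", round(composite_multiscale_pe(noise), 3))   # close to 1
print("CMSPE slow  :", round(composite_multiscale_pe(slow), 3))    # clearly lower
```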

  19. Coexpression of Human α- and Circularly Permuted β-Globins Yields a Hemoglobin with Normal R State but Modified T State Properties†

    PubMed Central

    Asmundson, Anna L.; Taber, Alexandria M.; van der Walde, Adella; Lin, Danielle H.; Olson, John S.; Anthony-Cahill, Spencer J.

    2009-01-01

    For the first time, a circularly permuted human β-globin (cpβ) has been coexpressed with human α-globin in bacterial cells and shown to associate to form α-cpβ hemoglobin in solution. Flash photolysis studies of α-cpβ show markedly biphasic CO and O2 kinetics, with the amplitudes for the fast association phases being dominant due to the presence of large amounts of high-affinity liganded hemoglobin dimers. Extensive dimerization of liganded but not deoxygenated α-cpβ was observed by gel chromatography. The rate constants for O2 and CO binding to the R state forms of α-cpβ are almost identical to those of native HbA (k′R(CO) ≈ 5.0 μM⁻¹ s⁻¹; k′R(O2) ≈ 50 μM⁻¹ s⁻¹), and the rate of O2 dissociation from fully oxygenated α-cpβ is also very similar to that observed for HbA (kR(O2) ≈ 21–28 s⁻¹). When the equilibrium deoxyHb form of α-cpβ is reacted with CO in rapid mixing experiments, the observed time courses are monophasic and the observed bimolecular association rate constant is ∼1.0 μM⁻¹ s⁻¹, which is intermediate between the R state rate measured in partial photolysis experiments (∼5 μM⁻¹ s⁻¹) and that observed for T state deoxyHbA (k′T(CO) ≈ 0.1 to 0.2 μM⁻¹ s⁻¹). Thus the deoxygenated permuted β subunits generate an intermediate, higher affinity, deoxyHb quaternary state. This conclusion is supported by equilibrium oxygen binding measurements in which α-cpβ exhibits a P50 of ∼1.5 mmHg and a low n-value (∼1.3) at pH 7, 20 °C, compared to 8.5 mmHg and n ≈ 2.8 for native HbA under identical, dilute conditions. PMID:19397368
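
    The reported equilibrium values translate directly into oxygen-binding curves through the Hill equation; the short sketch below simply re-plots the quoted P50 and n values for α-cpβ and native HbA and is not a model of the underlying allosteric mechanism.

```python
import numpy as np

# Oxygen saturation curves implied by the equilibrium values quoted above, using the
# Hill equation Y = pO2^n / (pO2^n + P50^n). This only re-plots the reported P50 and
# n values; it is not a model of the underlying allosteric mechanism.

def hill(pO2, P50, n):
    return pO2 ** n / (pO2 ** n + P50 ** n)

pO2 = np.linspace(0.1, 40, 400)                      # mmHg
y_cpb = hill(pO2, P50=1.5, n=1.3)                    # alpha-cpbeta: higher affinity, low cooperativity
y_hba = hill(pO2, P50=8.5, n=2.8)                    # native HbA: lower affinity, cooperative

for p in (1.5, 5, 10, 20):
    print(f"pO2 = {p:4.1f} mmHg   alpha-cpbeta: {hill(p, 1.5, 1.3):.2f}   HbA: {hill(p, 8.5, 2.8):.2f}")
# alpha-cpbeta is already half-saturated at 1.5 mmHg, whereas HbA shows the steep,
# cooperative rise around its higher P50 -- consistent with the modified T state above.
```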

  20. Accelerated self-gated UTE MRI of the murine heart

    NASA Astrophysics Data System (ADS)

    Motaal, Abdallah G.; Noorman, Nils; De Graaf, Wolter L.; Florack, Luc J.; Nicolay, Klaas; Strijkers, Gustav J.

    2014-03-01

    We introduce a new protocol to obtain radial ultra-short TE (UTE) MRI cine images of the beating mouse heart within a reasonable measurement time. The method is based on a self-gated UTE sequence with golden angle radial acquisition and compressed sensing reconstruction. The stochastic nature of the retrospective triggering acquisition scheme produces an under-sampled and random k-t space filling that allows for compressed sensing reconstruction, hence reducing scan time. As a standard, an intragate multislice FLASH sequence with an acquisition time of 4.5 min per slice was used to produce standard cine movies of four mouse hearts with 15 frames per cardiac cycle. The proposed self-gated sequence was used to produce cine movies with a short echo time. The total scan time was 11 min per slice. Six slices were planned to cover the heart from the base to the apex. 2X, 4X and 6X under-sampled k-space cine movies were produced from 2, 1 and 0.7 min of data acquisition for each slice. The accelerated cine movies of the mouse hearts were successfully reconstructed with a compressed sensing algorithm. Compared to the FLASH cine images, the UTE images showed far fewer flow artifacts due to the short echo time. In addition, the accelerated movies had high image quality and the undersampling artifacts were effectively removed. Left ventricular functional parameters derived from the standard and the accelerated cine movies were nearly identical.

  1. Space Shuttle Orbiter Digital Outer Mold Line Scanning

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.; Wilson, Brad; Pavek, Mike; Berger, Karen

    2012-01-01

    The Space Shuttle Orbiters Discovery and Endeavour have been digitally scanned to produce post-flight configuration outer mold line surfaces. Very detailed scans of the windward side of these vehicles provide resolution of the detailed tile step and gap geometry, as well as the reinforced carbon-carbon nose cap and leading edges. Lower resolution scans of the upper surface provide definition of the crew cabin windows, wing upper surfaces, payload bay doors, orbital maneuvering system pods and the vertical tail. The process for acquisition of these digital scans, as well as post-processing of the very large data set, will be described.

  2. Two-level optimization of composite wing structures based on panel genetic optimization

    NASA Astrophysics Data System (ADS)

    Liu, Boyang

    The design of complex composite structures used in aerospace or automotive vehicles presents a major challenge in terms of computational cost. Discrete choices for ply thicknesses and ply angles lead to a combinatorial optimization problem that is too expensive to solve with presently available computational resources. We developed the following methodology for handling this problem for wing structural design: we used a two-level optimization approach with response-surface approximations to optimize panel failure loads for the upper-level wing optimization. We tailored efficient permutation genetic algorithms to the panel stacking sequence design on the lower level. We also developed an approach for improving the continuity of ply stacking sequences among adjacent panels. The decomposition approach led to a lower-level optimization of stacking sequence with a given number of plies in each orientation. An efficient permutation genetic algorithm (GA) was developed for handling this problem. We demonstrated through examples that the permutation GAs are more efficient for stacking sequence optimization than a standard GA. Repair strategies for the standard GA and the permutation GAs for dealing with constraints were also developed. The repair strategies can significantly reduce computation costs for both the standard GA and the permutation GAs. A two-level optimization procedure for composite wing design subject to strength and buckling constraints is presented. At the wing level, continuous optimization of ply thicknesses with orientations of 0°, 90°, and +/-45° is performed to minimize weight. At the panel level, the number of plies of each orientation (rounded to integers) and inplane loads are specified, and a permutation genetic algorithm is used to optimize the stacking sequence. The process begins with many panel genetic optimizations for a range of loads and numbers of plies of each orientation. Next, a cubic polynomial response surface is fitted to the optimum buckling load. The resulting response surface is used for wing-level optimization. In general, complex composite structures consist of several laminates. A common problem in the design of such structures is that some plies in the adjacent laminates terminate at the boundary between the laminates. These discontinuities may cause stress concentrations and may increase manufacturing difficulty and cost. We developed measures of continuity of two adjacent laminates. We studied tradeoffs between weight and continuity through a simple composite wing design. Finally, we compared the two-level optimization to a single-level optimization based on flexural lamination parameters. The single-level optimization is efficient and feasible for a wing consisting of unstiffened panels.
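
    A toy permutation GA for stacking-sequence design is sketched below: chromosomes are permutations of the indices of a fixed multiset of ply angles, recombined with order crossover and swap mutation against an illustrative buckling proxy; the repair strategies, load cases and response-surface coupling of the actual methodology are not reproduced.

```python
import random

# Toy permutation GA for laminate stacking-sequence design. Chromosomes are
# permutations of the indices of a fixed multiset of ply angles, so standard
# order crossover (OX) and swap mutation apply directly. The fitness below is an
# illustrative buckling proxy (reward +/-45 deg plies far from the midplane,
# weighted by the cube of their distance); it is not the response-surface model
# or the constraint handling used in the actual methodology.

PLIES = [0, 0, 0, 0, 90, 90, 90, 90, 45, -45, 45, -45, 45, -45, 45, -45]
BONUS = {0: 0.2, 90: 0.5, 45: 1.0, -45: 1.0}

def fitness(perm):
    mid = (len(perm) - 1) / 2.0
    return sum(BONUS[PLIES[g]] * abs(k - mid) ** 3 for k, g in enumerate(perm))

def order_crossover(p1, p2):
    n = len(p1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]                                  # copy a slice from parent 1
    fill = [g for g in p2 if g not in child[i:j]]         # remaining genes in parent-2 order
    for k, pos in enumerate(list(range(j, n)) + list(range(0, i))):
        child[pos] = fill[k]
    return child

def swap_mutation(p):
    p = p[:]
    a, b = random.sample(range(len(p)), 2)
    p[a], p[b] = p[b], p[a]
    return p

random.seed(0)
pop = [random.sample(range(len(PLIES)), len(PLIES)) for _ in range(40)]
for gen in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:20]
    children = [swap_mutation(order_crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    pop = parents + children
best = max(pop, key=fitness)
print("best proxy fitness:", round(fitness(best), 1))
print("stacking sequence :", [PLIES[g] for g in best])    # +/-45 plies tend to move outward
```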

  3. Spatiotemporal Analysis of the Ebola Hemorrhagic Fever in West Africa in 2014

    NASA Astrophysics Data System (ADS)

    Xu, M.; Cao, C. X.; Guo, H. F.

    2017-09-01

    Ebola hemorrhagic fever (EHF) is an acute hemorrhagic disease caused by the Ebola virus, which is highly contagious. This paper aimed to explore possible clustering areas of EHF cases in West Africa in 2014, and to identify endemic areas and their tendency by means of time-space analysis. We mapped the distribution of EHF incidence and explored statistically significant spatial, temporal and space-time disease clusters. We utilized hotspot analysis to find the spatial clustering pattern on the basis of the actual outbreak cases. Spatio-temporal cluster analysis was used to analyze the spatial and temporal distribution of disease aggregation and to examine whether the distribution is statistically significant. Local clusters were investigated using Kulldorff's scan statistic approach. The results reveal that the epidemic was mainly clustered in the western part of Africa near the North Atlantic, with an obvious regional distribution. For the current epidemic, we found areas of high EVD incidence by means of spatial cluster analysis.
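
    The mechanics of a Kulldorff-style scan can be sketched in a few lines, as below: case counts on a grid, population-proportional expectations, circular windows scored by the Poisson likelihood ratio, and Monte Carlo replications of the maximum to gauge significance. This is a didactic toy, not SaTScan and not the analysis of the paper.

```python
import numpy as np

# Toy purely spatial scan in the spirit of Kulldorff's statistic: case counts on a
# grid, expected counts proportional to population, circular windows scored with the
# Poisson likelihood ratio, and Monte Carlo replications of the maximum to gauge
# significance. Didactic sketch only.

rng = np.random.default_rng(2)
n = 20
pop = rng.uniform(50, 500, (n, n))                       # population per cell
risk = np.full((n, n), 0.01)
risk[4:8, 12:16] *= 3.0                                   # planted hotspot
cases = rng.poisson(pop * risk)
yy, xx = np.mgrid[0:n, 0:n]

def llr(c, e, C):
    """Poisson log likelihood ratio for c observed vs. e expected inside a window."""
    if c <= e:
        return 0.0
    out = c * np.log(c / e)
    if C > c:
        out += (C - c) * np.log((C - c) / (C - e))
    return out

def max_llr(counts):
    C = counts.sum()
    expected = pop / pop.sum() * C
    best = 0.0
    for cy in range(0, n, 2):                             # coarse grid of window centres
        for cx in range(0, n, 2):
            for r in (1, 2, 3, 4):
                mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= r * r
                best = max(best, llr(counts[mask].sum(), expected[mask].sum(), C))
    return best

observed = max_llr(cases)
total = cases.sum()
p_cell = (pop / pop.sum()).ravel()
replicas = [max_llr(rng.multinomial(total, p_cell).reshape(n, n)) for _ in range(99)]
p_value = (1 + sum(r >= observed for r in replicas)) / (1 + len(replicas))
print(f"max LLR = {observed:.1f}, Monte Carlo p = {p_value:.2f}")   # small p: hotspot found
```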

  4. Guided Wave Propagation Study on Laminated Composites by Frequency-Wavenumber Technique

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Yu, Lingyu; Leckey, Cara A. C.

    2014-01-01

    Toward the goal of delamination detection and quantification in laminated composites, this paper examines guided wave propagation and wave interaction with delamination damage in laminated carbon fiber reinforced polymer (CFRP) composites using frequency-wavenumber (f-k) analysis. A three-dimensional elastodynamic finite integration technique (EFIT) is used to acquire simulated time-space wavefields for a CFRP composite. The time-space wavefields show trapped waves in the delamination region. To unveil the wave propagation physics, the time-space wavefields are further analyzed by using two-dimensional (2D) Fourier transforms (FT). In the analysis results, new f-k components are observed when the incident guided waves interact with the delamination damage. These new f-k components in the simulations are experimentally verified through data obtained from scanning laser Doppler vibrometer (SLDV) tests. By filtering the new f-k components, delamination damage is detected and quantified.
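
    The core of the f-k analysis is a 2D Fourier transform of the time-space wavefield; the minimal sketch below uses a synthetic single-mode wave on a 1-D scan line and recovers its (f, k) ridge and phase velocity. The EFIT simulations and SLDV data handling of the paper are not reproduced.

```python
import numpy as np

# Core of frequency-wavenumber (f-k) analysis: a 2D FFT of a time-space wavefield
# maps a propagating wave onto a ridge at (f0, k0), from which the phase velocity
# follows as f0/k0. Synthetic single-mode field, 1-D in space.

dt, dx = 1e-6, 1e-3                      # 1 MHz sampling, 1 mm scan pitch
nt, nx = 1000, 200
t = np.arange(nt) * dt
x = np.arange(nx) * dx
f0, c = 100e3, 2000.0                    # 100 kHz wave at 2000 m/s -> k0 = 50 cycles/m
k0 = f0 / c
field = np.sin(2 * np.pi * (f0 * t[:, None] - k0 * x[None, :]))   # u(t, x)

F = np.fft.fftshift(np.abs(np.fft.fft2(field)))
freqs = np.fft.fftshift(np.fft.fftfreq(nt, dt))
waven = np.fft.fftshift(np.fft.fftfreq(nx, dx))

it, ik = np.unravel_index(np.argmax(F), F.shape)
print(f"peak at f = {abs(freqs[it]) / 1e3:.1f} kHz, k = {abs(waven[ik]):.1f} 1/m, "
      f"phase velocity ~ {abs(freqs[it] / waven[ik]):.0f} m/s")    # 100 kHz, 50 1/m, 2000 m/s
```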

  5. Age-Related Decline in the Variation of Dynamic Functional Connectivity: A Resting State Analysis.

    PubMed

    Chen, Yuanyuan; Wang, Weiwei; Zhao, Xin; Sha, Miao; Liu, Ya'nan; Zhang, Xiong; Ma, Jianguo; Ni, Hongyan; Ming, Dong

    2017-01-01

    Normal aging is typically characterized by abnormal resting-state functional connectivity (FC), including decreasing connectivity within networks and increasing connectivity between networks, under the assumption that the FC over the scan time was stationary. In fact, the resting-state FC has been shown in recent years to vary over time even within minutes, thus showing the great potential of intrinsic interactions and organization of the brain. In this article, we assumed that the dynamic FC consisted of an intrinsic dynamic balance in the resting brain and was altered with increasing age. Two groups of individuals ( N = 36, ages 20-25 for the young group; N = 32, ages 60-85 for the senior group) were recruited from the public data of the Nathan Kline Institute. Phase randomization was first used to examine the reliability of the dynamic FC. Next, the variation in the dynamic FC and the energy ratio of the dynamic FC fluctuations within a higher frequency band were calculated and further checked for differences between groups by non-parametric permutation tests. The results robustly showed modularization of the dynamic FC variation, which declined with aging; moreover, the FC variation of the inter-network connections, which mainly consisted of the frontal-parietal network-associated and occipital-associated connections, decreased. In addition, a higher energy ratio in the higher FC fluctuation frequency band was observed in the senior group, which indicated the frequency interactions in the FC fluctuations. These results highly supported the basis of abnormality and compensation in the aging brain and might provide new insights into both aging and relevant compensatory mechanisms.
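
    Phase randomization, used above as a reliability check on the dynamic FC, can be sketched in a few lines: keep each signal's Fourier amplitudes, draw random phases, and invert, so the surrogate preserves the power spectrum while destroying any genuine temporal structure beyond it.

```python
import numpy as np

# Phase-randomization surrogate, as used above to test whether apparent dynamics in
# sliding-window FC exceed what a static, linearly correlated process would show:
# keep the Fourier amplitudes of the signal, randomize the phases, invert. The
# surrogate has the same power spectrum (hence static autocorrelation) as the original.

def phase_randomize(x, rng):
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, spec.size)
    phases[0] = 0.0                          # keep the mean (DC) component untouched
    if x.size % 2 == 0:
        phases[-1] = 0.0                     # Nyquist bin must stay real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

rng = np.random.default_rng(5)
t = np.arange(600)
x = np.sin(2 * np.pi * t / 50) + 0.5 * rng.normal(size=t.size)
s = phase_randomize(x, rng)

orig_power = np.abs(np.fft.rfft(x)) ** 2
surr_power = np.abs(np.fft.rfft(s)) ** 2
print("power spectra match:", np.allclose(orig_power, surr_power))   # True
print("waveforms differ   :", not np.allclose(x, s))                 # True
```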

  6. Uniform spacing interrogation of a Fourier domain mode-locked fiber Bragg grating sensor system using a polarization-maintaining fiber Sagnac interferometer

    PubMed Central

    Lee, Hwi Don; Jung, Eun Joo; Jeong, Myung Yung; Chen, Zhongping; Kim, Chang-Seok

    2014-01-01

    A novel linearized interrogation method is presented for a Fourier domain mode-locked (FDML) fiber Bragg grating (FBG) sensor system. In a high speed regime over several tens of kHz modulations, a sinusoidal wave is available to scan the center wavelength of an FDML wavelength-swept laser, instead of a conventional triangular wave. However, sinusoidal wave modulation suffers from an exaggerated non-uniform wavelength-spacing response in demodulating the time-encoded parameter to the absolute wavelength. In this work, the calibration signal from a polarization-maintaining fiber Sagnac interferometer shares the FDML wavelength-swept laser for FBG sensors to convert the time-encoded FBG signal to the wavelength-encoded uniform-spacing signal. PMID:24489440

  7. Direction of Coupling from Phases of Interacting Oscillators: A Permutation Information Approach

    NASA Astrophysics Data System (ADS)

    Bahraminasab, A.; Ghasemi, F.; Stefanovska, A.; McClintock, P. V. E.; Kantz, H.

    2008-02-01

    We introduce a directionality index for a time series based on a comparison of neighboring values. It can distinguish unidirectional from bidirectional coupling, as well as reveal and quantify asymmetry in bidirectional coupling. It is tested on a numerical model of coupled van der Pol oscillators, and applied to cardiorespiratory data from healthy subjects. There is no need for preprocessing and fine-tuning the parameters, which makes the method very simple, computationally fast and robust.
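
    The sketch below conveys the idea of a permutation-information directionality index using a related but simpler construction than the paper's estimator: ordinal-pattern (symbolic) transfer entropy is computed in both directions and combined into a normalized index D, with D > 0 indicating predominantly x → y coupling. All details are illustrative assumptions, not the authors' exact measure.

```python
import numpy as np
from collections import Counter

# Illustration of a permutation-information directionality index: symbolize each
# series by ordinal patterns of consecutive values, estimate symbolic transfer
# entropy in both directions, and combine them into D in [-1, 1]. This is a simpler
# construction than the estimator of the paper and is meant only to convey the idea.

def symbolize(x, m=3):
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def entropy(counter):
    p = np.array(list(counter.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p))

def transfer_entropy(sx, sy):
    """H(y_{t+1} | y_t) - H(y_{t+1} | y_t, x_t), estimated from ordinal symbols."""
    triples = list(zip(sy[1:], sy[:-1], sx[:-1]))
    h_y1_y = entropy(Counter((a, b) for a, b, _ in triples)) - entropy(Counter(b for _, b, _ in triples))
    h_y1_yx = entropy(Counter(triples)) - entropy(Counter((b, c) for _, b, c in triples))
    return h_y1_y - h_y1_yx

# Unidirectionally coupled noisy system: x drives y
rng = np.random.default_rng(11)
n = 20000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

sx, sy = symbolize(x), symbolize(y)
txy, tyx = transfer_entropy(sx, sy), transfer_entropy(sy, sx)
D = (txy - tyx) / (txy + tyx)
print(f"T(x->y) = {txy:.3f}, T(y->x) = {tyx:.3f}, directionality D = {D:.2f}")  # D > 0: x -> y
```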

  8. Super-resolution reconstruction of diffusion parameters from diffusion-weighted images with different slice orientations.

    PubMed

    Van Steenkiste, Gwendolyn; Jeurissen, Ben; Veraart, Jelle; den Dekker, Arnold J; Parizel, Paul M; Poot, Dirk H J; Sijbers, Jan

    2016-01-01

    Diffusion MRI is hampered by long acquisition times, low spatial resolution, and a low signal-to-noise ratio. Recently, methods have been proposed to improve the trade-off between spatial resolution, signal-to-noise ratio, and acquisition time of diffusion-weighted images via super-resolution reconstruction (SRR) techniques. However, during the reconstruction, these SRR methods neglect the q-space relation between the different diffusion-weighted images. An SRR method that includes a diffusion model and directly reconstructs high resolution diffusion parameters from a set of low resolution diffusion-weighted images was proposed. Our method allows an arbitrary combination of diffusion gradient directions and slice orientations for the low resolution diffusion-weighted images, optimally samples the q- and k-space, and performs motion correction with b-matrix rotation. Experiments with synthetic data and in vivo human brain data show an increase of spatial resolution of the diffusion parameters, while preserving a high signal-to-noise ratio and low scan time. Moreover, the proposed SRR method outperforms the previous methods in terms of the root-mean-square error. The proposed SRR method substantially increases the spatial resolution of MRI that can be obtained in a clinically feasible scan time. © 2015 Wiley Periodicals, Inc.

  9. Band-to-Band Misregistration of the Images of MODIS On-Board Calibrators and Its Impact to Calibration

    NASA Technical Reports Server (NTRS)

    Wang, Zhipeng; Xiong, Xiaoxiong

    2017-01-01

    The MODIS instruments aboard Terra and Aqua satellites are radiometrically calibrated on-orbit with a set of onboard calibrators (OBC) including a solar diffuser (SD), a blackbody (BB) and a space view (SV) port through which the detectors can view the dark space. As a whisk-broom scanning spectroradiometer, thirty-six MODIS spectral bands are assembled in the along-scan direction on four focal plane assemblies (FPA). These bands capture images of the same target sequentially with the motion of a scan mirror. Then the images are co-registered on board by delaying an appropriate, band-dependent amount of time depending on the band locations on the FPA. While this co-registration mechanism is functioning well for the "far field" remote targets such as Earth view (EV) scenes or the Moon, noticeable band-to-band misregistration in the along-scan direction has been observed for near field targets, in particular the OBCs. In this paper, the misregistration phenomenon is presented and analyzed. It is concluded that the root cause of the misregistration is that the rotating element of the instrument, the scan mirror, is displaced from the focus of the telescope primary mirror. The amount of the misregistration is proportional to the band location on the FPA and is inversely proportional to the distance between the target and the scan mirror. The impact of this misregistration to the calibration of MODIS bands is discussed. In particular, the calculation of the detector gain coefficient m1 of bands 8-16 (412 nm–870 nm) is improved by up to 1.5% for Aqua MODIS.

  10. Band-to-Band Misregistration of the Images of MODIS Onboard Calibrators and Its Impact on Calibration

    NASA Technical Reports Server (NTRS)

    Wang, Zhipeng; Xiong, Xiaoxiong

    2017-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard Terra and Aqua satellites are radiometrically calibrated on-orbit with a set of onboard calibrators (OBCs), including a solar diffuser, a blackbody, and a space view port through which the detectors can view the dark space. As a whisk-broom scanning spectroradiometer, thirty-six MODIS spectral bands are assembled in the along-scan direction on four focal plane assemblies (FPAs). These bands capture images of the same target sequentially with the motion of a scan mirror. Then the images are coregistered onboard by delaying the appropriate band-dependent amount of time, depending on the band locations on the FPA. While this coregistration mechanism is functioning well for the far-field remote targets such as earth view scenes or the moon, noticeable band-to-band misregistration in the along-scan direction has been observed for near field targets, particularly in OBCs. In this paper, the misregistration phenomenon is presented and analyzed. It is concluded that the root cause of the misregistration is that the rotating element of the instrument, the scan mirror, is displaced from the focus of the telescope primary mirror. The amount of the misregistration is proportional to the band location on the FPA and is inversely proportional to the distance between the target and the scan mirror. The impact of this misregistration on the calibration of MODIS bands is discussed. In particular, the calculation of the detector gain coefficient m1 of bands 8-16 (412 nm–870 nm) is improved by up to 1.5% for Aqua MODIS.

  11. Experimenting Galileo on Board the International Space Station

    NASA Technical Reports Server (NTRS)

    Fantinato, Samuele; Pozzobon, Oscar; Gamba, Giovanni; Chiara, Andrea Dalla; Montagner, Stefano; Giordano, Pietro; Crisci, Massimo; Enderle, Werner; Chelmins, David T.; Sands, Obed S.

    2016-01-01

    The SCaN Testbed is an advanced integrated communications system and laboratory facility installed on the International Space Station (ISS) in 2012. The testbed incorporates a set of new-generation Software Defined Radio (SDR) technologies intended to allow researchers to develop, test, and demonstrate new communications, networking, and navigation capabilities in the actual environment of space. Qascom, in cooperation with ESA and NASA, is designing a Software Defined Radio GalileoGPS Receiver capable of providing accurate positioning and timing, to be installed on the ISS SCaN Testbed. The GalileoGPS waveform will be operated in the JPL SDR, which consists of several hardware components that can be used for experimentation in L-Band and S-Band. The JPL SDR includes an L-Band Dorne Margolin antenna mounted onto a choke ring. The antenna is connected to a radio front end capable of providing one-bit samples for the three GNSS frequencies (L1, L2 and L5) at 38 MHz, exploiting subharmonic sampling. The baseband processing is then performed by an ATMEL AT697 processor (100 MIPS) and two Virtex 2 FPGAs. The JPL SDR supports the STRS (Space Telecommunications Radio System) standard, which provides common waveform software interfaces and methods of instantiation, operation, and testing among different compliant hardware and software products. The standard foresees the development of applications that are modular, portable, reconfigurable, and reusable. The developed waveform uses the STRS infrastructure-provided application program interfaces (APIs) and services to load, verify, execute, change parameters, terminate, or unload an application. The project is divided into three main phases. 1) Design and development of the GalileoGPS waveform for the SCaN Testbed, starting from Qascom's existing GNSS SDR receiver. The baseline design is limited to the implementation of a single-frequency Galileo and GPS L1E1 receiver, although part of the activity is to assess the feasibility of a dual-frequency implementation (L1E1+L5E5a) in the same SDR platform. 2) Qualification and testing of the GalileoGPS waveform using ground systems available at the NASA Glenn Research Center. Experimenters have access to two SCaN Testbed ground-based systems for development and verification: the Experimenter Development System (EDS), which is intended to provide an initial opportunity for software testing and basic functional validation, and the Ground Integration Unit (GIU), which is a high-fidelity version of the SCaN Testbed flight system and is therefore used for more controlled final development and verification testing. 3) In-orbit validation and experimentation: the experimentation phase will consist of the collection of raw measurements (pseudorange, carrier phase, C/N0) in space, assessment of the quality of the measurements and of the receiver performance in terms of signal acquisition, tracking, etc., and finally the computation of position in space (position, velocity and time) and assessment of its performance. (Complete abstract in attached document.)

  12. Lessons Learned in the First Year Operating Software Defined Radios in Space

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Mortensen, Dale; Shalkhauser, Mary Jo; Johnson, Sandra K.; Reinhart, Richard

    2014-01-01

    Operating three unique software defined radios (SDRs) in a space environment aboard the Space Communications and Navigation (SCaN) Testbed for over one year has provided an opportunity to gather knowledge useful for future missions considering using software defined radios. This paper provides recommendations for the development and use of SDRs, and it considers the details of each SDR's approach to software upgrades and operation. After one year, the SCaN Testbed SDRs have operated for over 1000 hours. During this time, the waveforms launched with the SDR were tested on-orbit to assure that they operated in space at the same performance level as on the ground prior to launch to obtain an initial on-orbit performance baseline. A new waveform for each SDR has been developed, implemented, uploaded to the flight system, and tested in the flight environment. Recommendations for SDR-based missions have been gathered from early development through operations. These recommendations will aid future missions to reduce the cost, schedule, and risk of operating SDRs in a space environment. This paper considers the lessons learned as they apply to SDR pre-launch checkout, purchasing space-rated hardware, flexibility in command and telemetry methods, on-orbit diagnostics, use of engineering models to aid future development, and third-party software. Each SDR implements the SCaN Testbed flight computer command and telemetry interface uniquely, allowing comparisons to be drawn. The paper discusses the lessons learned from these three unique implementations, with suggestions on the preferred approach. Also, results are presented showing that it is important to have full system performance knowledge prior to launch to establish better performance baselines in space, requiring additional test applications to be developed pre-launch. Finally, the paper presents the issues encountered with the operation and implementation of new waveforms on each SDR and proposes recommendations to avoid these issues.

  13. Lessons Learned in the First Year Operating Software Defined Radios in Space

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Mortensen, Dale; Shalkhauser, Mary Jo; Johnson, Sandra K.; Reinhart, Richard

    2014-01-01

    Operating three unique software defined radios (SDRs) in a space environment aboard the Space Communications and Navigation (SCaN) Testbed for over one year has provided an opportunity to gather knowledge useful for future missions considering using software defined radios. This paper provides recommendations for the development and use of SDRs, and it considers the details of each SDR's approach to software upgrades and operation. After one year, the SCaN Testbed SDRs have operated for over 1000 hours. During this time, the waveforms launched with the SDR were tested on-orbit to assure that they operated in space at the same performance level as on the ground prior to launch to obtain an initial on-orbit performance baseline. A new waveform for each SDR has been developed, implemented, uploaded to the flight system, and tested in the flight environment. Recommendations for SDR-based missions have been gathered from early development through operations. These recommendations will aid future missions to reduce the cost, schedule, and risk of operating SDRs in a space environment. This paper considers the lessons learned as they apply to SDR pre-launch checkout, purchasing space-rated hardware, flexibility in command and telemetry methods, on-orbit diagnostics, use of engineering models to aid future development, and third-party software. Each SDR implements the SCaN Testbed flight computer command and telemetry interface uniquely, allowing comparisons to be drawn. The paper discusses the lessons learned from these three unique implementations, with suggestions on the preferred approach. Also, results are presented showing that it is important to have full system performance knowledge prior to launch to establish better performance baselines in space, requiring additional test applications to be developed pre-launch. Finally, the paper presents the issues encountered with the operation and implementation of new waveforms on each SDR and proposes recommendations to avoid these issues.

  14. Stable and efficient retrospective 4D-MRI using non-uniformly distributed quasi-random numbers

    NASA Astrophysics Data System (ADS)

    Breuer, Kathrin; Meyer, Cord B.; Breuer, Felix A.; Richter, Anne; Exner, Florian; Weng, Andreas M.; Ströhle, Serge; Polat, Bülent; Jakob, Peter M.; Sauer, Otto A.; Flentje, Michael; Weick, Stefan

    2018-04-01

    The purpose of this work is the development of a robust and reliable three-dimensional (3D) Cartesian imaging technique for fast and flexible retrospective 4D abdominal MRI during free breathing. To this end, a non-uniform quasi random (NU-QR) reordering of the phase encoding (ky–kz) lines was incorporated into 3D Cartesian acquisition. The proposed sampling scheme allocates more phase encoding points near the k-space origin while reducing the sampling density in the outer part of the k-space. Respiratory self-gating in combination with SPIRiT-reconstruction is used for the reconstruction of abdominal data sets in different respiratory phases (4D-MRI). Six volunteers and three patients were examined at 1.5 T during free breathing. Additionally, data sets with conventional two-dimensional (2D) linear and 2D quasi random phase encoding order were acquired for the volunteers for comparison. A quantitative evaluation of image quality versus scan times (from 70 s to 626 s) for the given sampling schemes was obtained by calculating the normalized mutual information (NMI) for all volunteers. Motion estimation was accomplished by calculating the maximum derivative of a signal intensity profile of a transition (e.g. tumor or diaphragm). The 2D non-uniform quasi-random distribution of phase encoding lines in Cartesian 3D MRI yields more efficient undersampling patterns for parallel imaging compared to conventional uniform quasi-random and linear sampling. Median NMI values of NU-QR sampling are the highest for all scan times. Therefore, within the same scan time, 4D imaging could be performed with improved image quality. The proposed method allows for the reconstruction of motion-artifact-reduced 4D data sets with isotropic spatial resolution of 2.1 × 2.1 × 2.1 mm³ in a short scan time, e.g. 10 respiratory phases in only 3 min. Cranio-caudal tumor displacements between 23 and 46 mm could be observed. NU-QR sampling enables stable 4D-MRI with high temporal and spatial resolution within a short scan time for visualization of organ or tumor motion during free breathing. Further studies, e.g. the application of the method for radiotherapy planning, are needed to investigate the clinical applicability and diagnostic value of the approach.
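
    The key algorithmic idea in this record is the reordering of the Cartesian phase-encoding (ky–kz) lines so that a quasi-random sequence visits the k-space centre more densely than the periphery. A minimal sketch follows; the Halton sequence and the power-law density warp are illustrative assumptions, since the exact quasi-random generator and density function used by the authors are not reproduced in the abstract.

      import numpy as np

      def radical_inverse(i, base):
          """Van der Corput radical inverse of integer i in the given base."""
          inv, f = 0.0, 1.0 / base
          while i > 0:
              inv += f * (i % base)
              i //= base
              f /= base
          return inv

      def nu_qr_pattern(n_lines, n_ky, n_kz, alpha=2.0):
          """Quasi-random (Halton, bases 2 and 3) phase-encoding order, warped
          so that sampling density is higher near the k-space centre (alpha > 1)."""
          pts = np.array([[radical_inverse(i, 2), radical_inverse(i, 3)]
                          for i in range(1, n_lines + 1)])
          v = pts - 0.5                                 # centre the unit square on 0
          warped = 0.5 + 0.5 * np.sign(v) * np.abs(2.0 * v) ** alpha
          ky = np.clip((warped[:, 0] * n_ky).astype(int), 0, n_ky - 1)
          kz = np.clip((warped[:, 1] * n_kz).astype(int), 0, n_kz - 1)
          return ky, kz

      ky, kz = nu_qr_pattern(n_lines=4096, n_ky=192, n_kz=96)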

  15. Graph Theory Meets Ab Initio Molecular Dynamics: Atomic Structures and Transformations at the Nanoscale

    NASA Astrophysics Data System (ADS)

    Pietrucci, Fabio; Andreoni, Wanda

    2011-08-01

    Social permutation invariant coordinates are introduced describing the bond network around a given atom. They originate from the largest eigenvalue and the corresponding eigenvector of the contact matrix, are invariant under permutation of identical atoms, and bear a clear signature of an order-disorder transition. Once combined with ab initio metadynamics, these coordinates are shown to be a powerful tool for the discovery of low-energy isomers of molecules and nanoclusters as well as for a blind exploration of isomerization, association, and dissociation reactions.
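
    The coordinates described here are built from the largest eigenvalue and corresponding eigenvector of an atomic contact matrix and made permutation invariant by sorting. The sketch below follows that recipe generically; the switching function, its parameters, and the exact scaling are assumptions standing in for the published SPRINT definition, which is not reproduced in this record.

      import numpy as np

      def contact_matrix(coords, r0=1.6, n=6, m=12):
          """Smooth contact matrix a_ij = (1 - (r/r0)^n) / (1 - (r/r0)^m)."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          x = d / r0
          a = (1.0 - x**n) / (1.0 - x**m)     # r exactly equal to r0 would need
          np.fill_diagonal(a, 0.0)            #   the n/m limit; ignored here
          return a

      def permutation_invariant_coords(coords, **kw):
          """Sorted components of the principal eigenvector of the contact
          matrix, scaled by the largest eigenvalue (a SPRINT-like descriptor)."""
          a = contact_matrix(coords, **kw)
          w, v = np.linalg.eigh(a)
          lam, vec = w[-1], np.abs(v[:, -1])  # Perron eigenpair of the symmetric,
                                              #   non-negative contact matrix
          s = np.sqrt(len(coords)) * lam * vec
          return np.sort(s)                   # sorting removes atom-label dependence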

  16. Finding fixed satellite service orbital allotments with a k-permutation algorithm

    NASA Technical Reports Server (NTRS)

    Reilly, Charles H.; Mount-Campbell, Clark A.; Gonsalvez, David J. A.

    1990-01-01

    A satellite system synthesis problem, the satellite location problem (SLP), is addressed. In SLP, orbital locations (longitudes) are allotted to geostationary satellites in the fixed satellite service. A linear mixed-integer programming model is presented that views SLP as a combination of two problems: the problem of ordering the satellites and the problem of locating the satellites given some ordering. A special-purpose heuristic procedure, a k-permutation algorithm, has been developed to find solutions to SLPs. Solutions to small sample problems are presented and analyzed on the basis of calculated interferences.

  17. Magic informationally complete POVMs with permutations

    NASA Astrophysics Data System (ADS)

    Planat, Michel; Gedik, Zafer

    2017-09-01

    Eigenstates of permutation gates are either stabilizer states (for gates in the Pauli group) or magic states, thus allowing universal quantum computation (Planat, Rukhsan-Ul-Haq 2017 Adv. Math. Phys. 2017, 5287862 (doi:10.1155/2017/5287862)). We show in this paper that a subset of such magic states, when acting on the generalized Pauli group, defines (asymmetric) informationally complete POVMs. Such informationally complete POVMs, investigated in dimensions 2-12, exhibit simple finite geometries in their projector products and, for dimensions 4, 8 and 9, relate to two-qubit, three-qubit and two-qutrit contextuality.

  18. Functional linear models to test for differences in prairie wetland hydraulic gradients

    USGS Publications Warehouse

    Greenwood, Mark C.; Sojda, Richard S.; Preston, Todd M.; Swayne, David A.; Yang, Wanhong; Voinov, A.A.; Rizzoli, A.; Filatova, T.

    2010-01-01

    Functional data analysis provides a framework for analyzing multiple time series measured frequently in time, treating each series as a continuous function of time. Functional linear models are used to test for effects on hydraulic gradient functional responses collected from three types of land use in Northeastern Montana at fourteen locations. Penalized regression-splines are used to estimate the underlying continuous functions based on the discretely recorded (over time) gradient measurements. Permutation methods are used to assess the statistical significance of effects. A method for accommodating missing observations in each time series is described. Hydraulic gradients may be an initial and fundamental ecosystem process that responds to climate change. We suggest other potential uses of these methods for detecting evidence of climate change.
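
    The analysis combines smoothed functional responses with permutation-based significance tests. A stripped-down sketch of the permutation step is shown below; it assumes the hydraulic-gradient series have already been smoothed onto a common time grid and uses a simple between-group sum of squares as the test statistic, whereas the paper uses penalized regression splines and functional linear models with missing-data handling.

      import numpy as np

      def group_effect_stat(curves, labels):
          """Between-group sum of squares of the site-level mean functions,
          summed over the common time grid."""
          grand = curves.mean(axis=0)
          ss = 0.0
          for g in np.unique(labels):
              grp = curves[labels == g]
              ss += len(grp) * np.sum((grp.mean(axis=0) - grand) ** 2)
          return ss

      def permutation_p_value(curves, labels, n_perm=999, seed=0):
          """curves: (n_sites, n_times) smoothed hydraulic-gradient functions;
          labels: array of land-use types, permuted across sites under the null."""
          rng = np.random.default_rng(seed)
          obs = group_effect_stat(curves, labels)
          null = [group_effect_stat(curves, rng.permutation(labels))
                  for _ in range(n_perm)]
          return (1 + sum(s >= obs for s in null)) / (n_perm + 1)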

  19. Long Duration X-ray Bursts Observed by MAXI

    NASA Astrophysics Data System (ADS)

    Serino, Motoko; Iwakiri, Wataru; Tamagawa, Toru; Sakamoto, Takanori; Nakahira, Satoshi; Matsuoka, Masaru; Yamaoka, Kazutaka; Negoro, Hitoshi

    Monitor of All-sky X-ray Image (MAXI) is an X-ray mission on the International Space Station. MAXI scans the whole sky every 92 min and detects various X-ray transient events, including X-ray bursts. Among the X-ray bursts observed by MAXI, eleven had long durations and were observed over more than one scan. Six of the eleven long bursts have e-folding times of >1 h and should be classified as "superbursts", while the rest are "intermediate-duration bursts". The total emitted energy of these long X-ray bursts ranges from 10^41 to 10^42 ergs. The lower limits of the superburst recurrence times of 4U 0614+091 and Ser X-1 are calculated as 4400 and 59 days, which may be consistent with the observed recurrence times of 3523 and 1148 days, respectively.

  20. 4D spiral imaging of flows in stenotic phantoms and subjects with aortic stenosis.

    PubMed

    Negahdar, M J; Kadbi, Mo; Kendrick, Michael; Stoddard, Marcus F; Amini, Amir A

    2016-03-01

    The utility of four-dimensional (4D) spiral flow in imaging of stenotic flows in both phantoms and human subjects with aortic stenosis is investigated. The method performs 4D flow acquisitions through a stack of interleaved spiral k-space readouts. Relative to conventional 4D flow, which performs Cartesian readout, the method has reduced echo time. Thus, reduced flow artifacts are observed when imaging high-speed stenotic flows. Four-dimensional spiral flow also provides significant savings in scan times relative to conventional 4D flow. In vitro experiments were performed under both steady and pulsatile flows in a phantom model of severe stenosis (one inch diameter at the inlet, with 87% area reduction at the throat of the stenosis) while imaging a 6-cm axial extent of the phantom, which included the Gaussian-shaped stenotic narrowing. In all cases, gradient strength and slew rate for standard clinical acquisitions, and identical field of view and resolution were used. For low steady flow rates, quantitative and qualitative results showed a similar level of accuracy between 4D spiral flow (echo time [TE] = 2 ms, scan time = 40 s) and conventional 4D flow (TE = 3.6 ms, scan time = 1:01 min). However, in the case of high steady flow rates, 4D spiral flow (TE = 1.57 ms, scan time = 38 s) showed better visualization and accuracy as compared to conventional 4D flow (TE = 3.2 ms, scan time = 51 s). At low pulsatile flow rates, a good agreement was observed between 4D spiral flow (TE = 2 ms, scan time = 10:26 min) and conventional 4D flow (TE = 3.6 ms, scan time = 14:20 min). However, in the case of high flow-rate pulsatile flows, 4D spiral flow (TE = 1.57 ms, scan time = 10:26 min) demonstrated better visualization as compared to conventional 4D flow (TE = 3.2 ms, scan time = 14:20 min). The feasibility of 4D spiral flow was also investigated in five normal volunteers and four subjects with mild-to-moderate aortic stenosis. The approach achieved TE = 1.68 ms and scan time = 3:44 min. The conventional sequence achieved TE = 2.9 ms and scan time = 5:23 min. In subjects with aortic stenosis, we also compared both MRI methods with Doppler ultrasound (US) in the measurement of peak velocity, time to peak systolic velocity, and eject time. Bland-Altman analysis revealed that, when comparing peak velocities, the discrepancy between Doppler US and 4D spiral flow was significantly less than the discrepancy between Doppler and 4D Cartesian flow (2.75 cm/s vs. 10.25 cm/s), whereas the two MR methods were comparable (-5.75 s vs. -6 s) for time to peak. However, for the estimation of eject time, relative to Doppler US, the discrepancy for 4D conventional flow was smaller than that of 4D spiral flow (-16.25 s vs. -20 s). Relative to conventional 4D flow, 4D spiral flow achieves substantial reductions in both the TE and scan times; therefore, utility for it should be sought in a variety of in vivo and complex flow imaging applications. © 2015 Wiley Periodicals, Inc.

  1. High-Performance Scanning Acousto-Ultrasonic System

    NASA Technical Reports Server (NTRS)

    Roth, Don; Martin, Richard; Kautz, Harold; Cosgriff, Laura; Gyekenyesi, Andrew

    2006-01-01

    A high-performance scanning acousto-ultrasonic system, now undergoing development, is designed to afford enhanced capabilities for imaging microstructural features, including flaws, inside plate specimens of materials. The system is expected to be especially helpful in analyzing defects that contribute to failures in polymer- and ceramic-matrix composite materials, which are difficult to characterize by conventional scanning ultrasonic techniques and other conventional nondestructive testing techniques. Selected aspects of the acousto-ultrasonic method have been described in several NASA Tech Briefs articles in recent years. Summarizing briefly: The acousto-ultrasonic method involves the use of an apparatus like the one depicted in the figure (or an apparatus of similar functionality). Pulses are excited at one location on a surface of a plate specimen by use of a broadband transmitting ultrasonic transducer. The stress waves associated with these pulses propagate along the specimen to a receiving transducer at a different location on the same surface. Along the way, the stress waves interact with the microstructure and flaws present between the transducers. The received signal is analyzed to evaluate the microstructure and flaws. The specific variant of the acousto-ultrasonic method implemented in the present developmental system goes beyond the basic principle described above to include the following major additional features: Computer-controlled motorized translation stages are used to automatically position the transducers at specified locations. Scanning is performed in the sense that the measurement, data-acquisition, and data-analysis processes are repeated at different specified transducer locations in an array that spans the specimen surface (or a specified portion of the surface). A pneumatic actuator with a load cell is used to apply a controlled contact force. In analyzing the measurement data for each pair of transducer locations in the scan, the total (multimode) acousto-ultrasonic response of the specimen is utilized. The analysis is performed by custom software that extracts parameters of signals in the time and frequency domains. The computer hardware and software provide both real-time and postscan processing and display options. For example, oscilloscope displays of waveforms and power spectral densities are available in real time. Images can be computed while scanning continues. Signals can be digitally preprocessed and/or post-processed by filtering, windowing, time-segmenting, and running-waveform-averaging algorithms. In addition, the software affords options for off-line simulation of the waveform-data-acquisition and scanning processes. In tests, the system has been shown to be capable of characterizing microstructural changes and defects in SiC/SiC and C/SiC ceramic-matrix composites. Delaminations, variations in density, microstructural changes attributable to infiltration by silicon, and crack-space indications (defined in the next sentence) have been revealed in images formed from several time- and frequency-domain parameters of scanning acousto-ultrasonic signals. The crack-space indications were image features that were not revealed by other nondestructive testing methods and are so named because they turned out to mark locations where cracking eventually occurred.

  2. Space-time variations in child mortality in a rural South African population with high HIV prevalence (2000-2014).

    PubMed

    Tlou, Boikhutso; Sartorius, Benn; Tanser, Frank

    2017-01-01

    The aim of the study was to identify the key determinants of child mortality 'hot-spots' in space and time. Comprehensive population-based mortality data collected between 2000 and 2014 by the Africa Centre Demographic Information System located in the UMkhanyakude District of KwaZulu-Natal Province, South Africa, was analysed. We assigned all mortality events and person-time of observation for children <5 years of age to an exact homestead of residence (mapped to <2m accuracy as part of the DSA platform). Using these exact locations, both the Kulldorff and Tango spatial scan statistics for regular and irregular shaped cluster detection were used to identify clusters of childhood mortality events in both space and time. Of the 49 986 children aged < 5 years who resided in the study area between 2000 and 2014, 2010 (4.0%) died. Childhood mortality decreased by 80% over the period from >20 per 1000 person-years in 2001-2003 to 4 per 1000 person-years in 2014. The two scanning spatial techniques identified two high-risk clusters for child mortality along the eastern border of the study site near the national highway, with a relative risk of 2.10 and 1.91 respectively. The high-risk communities detected in this work, and the differential risk factor profile of these communities, can assist public health professionals to identify similar populations in other parts of rural South Africa. Identifying child mortality hot-spots will potentially guide policy interventions in rural, resource-limited settings.
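
    Both cluster searches used in this record are scan statistics of the kind introduced by Kulldorff: every candidate window is scored with a likelihood ratio comparing the risk inside and outside the window. A compact sketch of a purely spatial Poisson scan over circular windows is given below; it only finds the most likely cluster, and in practice (e.g. in SaTScan) significance is assessed with Monte Carlo replications, which are omitted here.

      import numpy as np

      def poisson_llr(c, e, C):
          """Kulldorff log-likelihood ratio for a window with c observed and e
          expected cases out of C total; 0 unless the window is in excess."""
          if c <= e:
              return 0.0
          if c == C:
              return c * np.log(c / e)
          return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

      def most_likely_cluster(xy, cases, persontime, radii):
          """Scan circles centred on every location; return the highest-scoring
          window. xy: (n, 2) coordinates; cases, persontime: (n,) arrays."""
          C = cases.sum()
          expected = persontime / persontime.sum() * C   # null: risk proportional
          best_llr, best_window = 0.0, None              #   to person-time
          for i, centre in enumerate(xy):
              dist = np.linalg.norm(xy - centre, axis=1)
              for r in radii:
                  inside = dist <= r
                  llr = poisson_llr(cases[inside].sum(), expected[inside].sum(), C)
                  if llr > best_llr:
                      best_llr, best_window = llr, (i, r)
          return best_llr, best_window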

  3. Research Performed within the Non-Destructive Evaluation Team at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Burns, Erin A.

    2004-01-01

    Non-destructive testing is essential in many fields of manufacturing and research in order to perform reliable examination of potentially damaged materials and parts without destroying the inherent structure of the materials. Thus, the Non-Destructive Evaluation (NDE) Team at NASA Glenn Research Center partakes in various projects to improve materials testing equipment as well as analyze materials, material defects, and material deficiencies. Given the array of projects within the NDE Team at this time, five research aims were pursued to supplement current projects. A literature survey of NDE and testing methodologies as related to rocks was performed. Also, Mars Exploration Rover technology was assessed to understand the requirements for instrumentation in harsh space environments (e.g. temperature). Potential instrumentation and technologies were also considered and documented. The literature survey provided background and potential sources for a proposal to acquire funding for ultrasonic instrumentation on board a future Mars expedition. The laboratory uses a Santec Systems AcousticScope AS200 acoustography system. Labview code was written within the current program in order to improve the current performance of the acoustography system. A sample of Reinforced Carbon/Carbon (RCC) material from the leading edge of the space shuttle underwent various non-destructive tests (guided wave scanning, thermography, computed tomography, real-time x-ray, etc.) in order to characterize its structure and examine possible defects. Guided wave scan data of a ceramic matrix composite (CMC) panel was reanalyzed utilizing image correlations and signal processing variables. Additional guided wave scans and thermography were also performed on the CMC panel. These reevaluated data and images will be used in future presentations and publications. An additional axis for the guided wave scanner was designed, constructed, and implemented. This additional axis allowed incremental spacing of the previously fixed transducers for ultrasonic velocity measurements.

  4. Optimised design of a ROTAX-type instrument

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tietze-Jaensch, H.

    1997-09-01

    The rotating analyser (ROTAX) spectrometer has been devised and installed at ISIS. Practical scans in (Q, ℏω) space with a nearly arbitrary scan direction, i.e. polarisation of q vs. Q, are possible and feasible with no compromises on resolution. Valuable technological and methodological knowledge has been compiled for an improved version of this type of instrument. At present ROTAX lacks competitiveness with other spectrometers owing to an unexpectedly weak neutron flux on its particular beam-line and an unfavourable adaptation of the analyser's drive power to the time frame or neutron source frequency.

  5. Insights on correlation dimension from dynamics mapping of three experimental nonlinear laser systems.

    PubMed

    McMahon, Christopher J; Toomey, Joshua P; Kane, Deb M

    2017-01-01

    We have analysed large data sets consisting of tens of thousands of time series from three Type B laser systems: a semiconductor laser in a photonic integrated chip, a semiconductor laser subject to optical feedback from a long free-space-external-cavity, and a solid-state laser subject to optical injection from a master laser. The lasers can deliver either constant, periodic, pulsed, or chaotic outputs when parameters such as the injection current and the level of external perturbation are varied. The systems represent examples of experimental nonlinear systems more generally and cover a broad range of complexity including systematically varying complexity in some regions. In this work we have introduced a new procedure for semi-automatically interrogating experimental laser system output power time series to calculate the correlation dimension (CD) using the commonly adopted Grassberger-Proccacia algorithm. The new CD procedure is called the 'minimum gradient detection algorithm'. A value of minimum gradient is returned for all time series in a data set. In some cases this can be identified as a CD, with uncertainty. Applying the new 'minimum gradient detection algorithm' CD procedure, we obtained robust measurements of the correlation dimension for many of the time series measured from each laser system. By mapping the results across an extended parameter space for operation of each laser system, we were able to confidently identify regions of low CD (CD < 3) and assign these robust values for the correlation dimension. However, in all three laser systems, we were not able to measure the correlation dimension at all parts of the parameter space. Nevertheless, by mapping the staged progress of the algorithm, we were able to broadly classify the dynamical output of the lasers at all parts of their respective parameter spaces. For two of the laser systems this included displaying regions of high-complexity chaos and dynamic noise. These high-complexity regions are differentiated from regions where the time series are dominated by technical noise. This is the first time such differentiation has been achieved using a CD analysis approach. More can be known of the CD for a system when it is interrogated in a mapping context, than from calculations using isolated time series. This has been shown for three laser systems and the approach is expected to be useful in other areas of nonlinear science where large data sets are available and need to be semi-automatically analysed to provide real dimensional information about the complex dynamics. The CD/minimum gradient algorithm measure provides additional information that complements other measures of complexity and relative complexity, such as the permutation entropy; and conventional physical measurements.

  6. Insights on correlation dimension from dynamics mapping of three experimental nonlinear laser systems

    PubMed Central

    McMahon, Christopher J.; Toomey, Joshua P.

    2017-01-01

    Background We have analysed large data sets consisting of tens of thousands of time series from three Type B laser systems: a semiconductor laser in a photonic integrated chip, a semiconductor laser subject to optical feedback from a long free-space-external-cavity, and a solid-state laser subject to optical injection from a master laser. The lasers can deliver either constant, periodic, pulsed, or chaotic outputs when parameters such as the injection current and the level of external perturbation are varied. The systems represent examples of experimental nonlinear systems more generally and cover a broad range of complexity including systematically varying complexity in some regions. Methods In this work we have introduced a new procedure for semi-automatically interrogating experimental laser system output power time series to calculate the correlation dimension (CD) using the commonly adopted Grassberger-Proccacia algorithm. The new CD procedure is called the ‘minimum gradient detection algorithm’. A value of minimum gradient is returned for all time series in a data set. In some cases this can be identified as a CD, with uncertainty. Findings Applying the new ‘minimum gradient detection algorithm’ CD procedure, we obtained robust measurements of the correlation dimension for many of the time series measured from each laser system. By mapping the results across an extended parameter space for operation of each laser system, we were able to confidently identify regions of low CD (CD < 3) and assign these robust values for the correlation dimension. However, in all three laser systems, we were not able to measure the correlation dimension at all parts of the parameter space. Nevertheless, by mapping the staged progress of the algorithm, we were able to broadly classify the dynamical output of the lasers at all parts of their respective parameter spaces. For two of the laser systems this included displaying regions of high-complexity chaos and dynamic noise. These high-complexity regions are differentiated from regions where the time series are dominated by technical noise. This is the first time such differentiation has been achieved using a CD analysis approach. Conclusions More can be known of the CD for a system when it is interrogated in a mapping context, than from calculations using isolated time series. This has been shown for three laser systems and the approach is expected to be useful in other areas of nonlinear science where large data sets are available and need to be semi-automatically analysed to provide real dimensional information about the complex dynamics. The CD/minimum gradient algorithm measure provides additional information that complements other measures of complexity and relative complexity, such as the permutation entropy; and conventional physical measurements. PMID:28837602
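
    The CD estimates discussed in these two records come from the Grassberger-Procaccia correlation sum, with the minimum gradient of log C(r) versus log r taken as the candidate dimension. The sketch below implements that chain of steps in a simplified form; the embedding parameters and the restriction of the slope search to an intermediate range of C(r) are assumptions, since the paper's full acceptance criteria are not reproduced here.

      import numpy as np
      from scipy.spatial.distance import pdist

      def correlation_sum(x, dim=5, lag=1, n_radii=30):
          """Grassberger-Procaccia correlation sum C(r) of a delay-embedded series."""
          x = np.asarray(x, float)
          n = len(x) - (dim - 1) * lag
          emb = np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])
          d = pdist(emb)
          radii = np.logspace(np.log10(d[d > 0].min()), np.log10(d.max()), n_radii)
          c = np.array([(d < r).mean() for r in radii])
          return radii, c

      def minimum_gradient(x, **kw):
          """Smallest local slope of log C(r) vs log r inside a rough scaling band;
          when a clear plateau exists this value is read as the CD estimate."""
          r, c = correlation_sum(x, **kw)
          keep = (c > 1e-3) & (c < 0.5)       # crude scaling-region heuristic
          slope = np.gradient(np.log(c[keep]), np.log(r[keep]))
          return slope.min()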

  7. Simultaneous auto-calibration and gradient delays estimation (SAGE) in non-Cartesian parallel MRI using low-rank constraints.

    PubMed

    Jiang, Wenwen; Larson, Peder E Z; Lustig, Michael

    2018-03-09

    To correct gradient timing delays in non-Cartesian MRI while simultaneously recovering corruption-free auto-calibration data for parallel imaging, without additional calibration scans. The calibration matrix constructed from multi-channel k-space data should be inherently low-rank. This property is used to construct reconstruction kernels or sensitivity maps. Delays between the gradient hardware across different axes and RF receive chain, which are relatively benign in Cartesian MRI (excluding EPI), lead to trajectory deviations and hence data inconsistencies for non-Cartesian trajectories. These in turn lead to higher rank and corrupted calibration information which hampers the reconstruction. Here, a method named Simultaneous Auto-calibration and Gradient delays Estimation (SAGE) is proposed that estimates the actual k-space trajectory while simultaneously recovering the uncorrupted auto-calibration data. This is done by estimating the gradient delays that result in the lowest rank of the calibration matrix. The Gauss-Newton method is used to solve the non-linear problem. The method is validated in simulations using center-out radial, projection reconstruction and spiral trajectories. Feasibility is demonstrated on phantom and in vivo scans with center-out radial and projection reconstruction trajectories. SAGE is able to estimate gradient timing delays with high accuracy at a signal to noise ratio level as low as 5. The method is able to effectively remove artifacts resulting from gradient timing delays and restore image quality in center-out radial, projection reconstruction, and spiral trajectories. The low-rank based method introduced simultaneously estimates gradient timing delays and provides accurate auto-calibration data for improved image quality, without any additional calibration scans. © 2018 International Society for Magnetic Resonance in Medicine.

  8. Free-space wavelength-multiplexed optical scanner.

    PubMed

    Yaqoob, Z; Rizvi, A A; Riza, N A

    2001-12-10

    A wavelength-multiplexed optical scanning scheme is proposed for deflecting a free-space optical beam by selection of the wavelength of the light incident on a wavelength-dispersive optical element. With fast tunable lasers or optical filters, this scanner features microsecond-domain scan setting speeds and large-diameter apertures of several centimeters or more for subdegree angular scans. The analysis performed indicates an optimum scan range for a given diffraction order and grating period. Limitations include beam-spreading effects based on the varying scanner aperture sizes and the instantaneous information bandwidth of the data-carrying laser beam.
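
    The scanner works by letting the grating equation translate a wavelength change into an angular deflection. As a worked example only (the grating period, diffraction order and tuning range below are assumed values, not the authors'), the snippet estimates the sub-degree sweep obtained by tuning a source across a few tens of nanometres.

      import numpy as np

      def diffracted_angle(lambda_nm, period_nm, m=1, theta_i_deg=0.0):
          """Grating equation: sin(theta_m) = sin(theta_i) + m * lambda / period."""
          s = np.sin(np.radians(theta_i_deg)) + m * lambda_nm / period_nm
          return np.degrees(np.arcsin(s))

      # Assumed 10 um grating period and C-band tuning: the beam sweeps by roughly
      # a quarter of a degree over 40 nm, i.e. a sub-degree scan as in the record.
      for lam in (1530.0, 1550.0, 1570.0):
          print(lam, "nm ->", round(diffracted_angle(lam, period_nm=10_000.0), 3), "deg")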

  9. Estimation of the Mean Axon Diameter and Intra-axonal Space Volume Fraction of the Human Corpus Callosum: Diffusion q-space Imaging with Low q-values.

    PubMed

    Suzuki, Yuriko; Hori, Masaaki; Kamiya, Kouhei; Fukunaga, Issei; Aoki, Shigeki; VAN Cauteren, Marc

    2016-01-01

    Q-space imaging (QSI) is a diffusion-weighted imaging (DWI) technique that enables investigation of tissue microstructure. However, for sufficient displacement resolution to measure the microstructure, QSI requires high q-values that are usually difficult to achieve with a clinical scanner. The recently introduced "low q-value method" fits the echo attenuation to only low q-values to extract the root mean square displacement. We investigated the clinical feasibility of the low q-value method for estimating the microstructure of the human corpus callosum using a 3.0-tesla clinical scanner within a clinically feasible scan time. We performed a simulation to explore the acceptable range of maximum q-values for the low q-value method. We simulated echo attenuations caused by restricted diffusion in the intra-axonal space (IAS) and hindered diffusion in the extra-axonal space (EAS) assuming 100,000 cylinders with various diameters, and we estimated mean axon diameter, IAS volume fraction, and EAS diffusivity by fitting echo attenuations with different maximum q-values. Furthermore, we scanned the corpus callosum of 7 healthy volunteers and estimated the mean axon diameter and IAS volume fraction. Good agreement between estimated and defined values in the simulation study with maximum q-values of 700 and 800 cm(-1) suggested that the maximum q-value used in the in vivo experiment, 737 cm(-1), was reasonable. In the in vivo experiment, the mean axon diameter was larger in the body of the corpus callosum and smaller in the genu and splenium, and this anterior-to-posterior trend is consistent with previously reported histology, although our mean axon diameter seems larger in size. On the other hand, we found an opposite anterior-to-posterior trend, with high IAS volume fraction in the genu and splenium and a lower fraction in the body, which is similar to the fiber density reported in the histology study. The low q-value method may provide insights into tissue microstructure using a 3T clinical scanner within clinically feasible scan time.
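
    The low q-value method referred to here fits only the low-q portion of the echo attenuation, where a Gaussian phase approximation gives ln E(q) ≈ -2π²q²⟨x²⟩, so the slope of ln E versus q² yields the root-mean-square displacement. The sketch below illustrates that fit on synthetic data; the 5 µm displacement and the q grid are invented for the example and are not the study's measurements.

      import numpy as np

      def rms_displacement_cm(q, E):
          """Low-q estimate of the RMS displacement from echo attenuations,
          using ln E(q) ~ -2 * pi^2 * q^2 * <x^2>; q in cm^-1, result in cm."""
          slope = np.polyfit(q**2, np.log(E), 1)[0]
          return np.sqrt(-slope / (2.0 * np.pi**2))

      # Synthetic example (not study data): a 5 micrometre RMS displacement probed
      # with q up to ~740 cm^-1, roughly the maximum q quoted in the record.
      q = np.linspace(100.0, 740.0, 8)          # cm^-1
      sigma = 5e-4                              # cm
      E = np.exp(-2.0 * np.pi**2 * q**2 * sigma**2)
      print(rms_displacement_cm(q, E) * 1e4, "micrometres")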

  10. Genetic variations in the serotonergic system contribute to amygdala volume in humans

    PubMed Central

    Li, Jin; Chen, Chunhui; Wu, Karen; Zhang, Mingxia; Zhu, Bi; Chen, Chuansheng; Moyzis, Robert K.; Dong, Qi

    2015-01-01

    The amygdala plays a critical role in emotion processing and psychiatric disorders associated with emotion dysfunction. Accumulating evidence suggests that amygdala structure is modulated by serotonin-related genes. However, there is a gap between the small contributions of single loci (less than 1%) and the reported 63–65% heritability of amygdala structure. To understand the “missing heritability,” we systematically explored the contribution of serotonin genes to amygdala structure at the gene-set level. The present study of 417 healthy Chinese volunteers examined 129 representative polymorphisms in genes from multiple biological mechanisms in the regulation of serotonin neurotransmission. A system-level approach using multiple regression analyses identified nine SNPs that collectively accounted for approximately 8% of the variance in amygdala volume. Permutation analyses showed that the probability of obtaining these findings by chance was low (p = 0.043, 1000 permutations). Findings showed that serotonin genes contribute moderately to individual differences in amygdala volume in a healthy Chinese sample. These results indicate that the system-level approach can help us to understand the genetic basis of a complex trait such as amygdala structure. PMID:26500508
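
    The gene-set result rests on two ingredients: the variance in amygdala volume explained by a multiple regression on the selected SNPs, and a permutation test of that statistic. A minimal sketch of both is given below; X and y are placeholders for a genotype matrix and a volume phenotype, and the covariate adjustments used in the study are omitted.

      import numpy as np

      def r_squared(X, y):
          """Variance in y explained by ordinary least squares on X (with intercept)."""
          X1 = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
          resid = y - X1 @ beta
          return 1.0 - resid.var() / y.var()

      def permutation_p(X, y, n_perm=1000, seed=0):
          """Permute the phenotype to ask whether the SNP set explains more
          variance than chance. X: (n_subjects, n_snps); y: amygdala volumes."""
          rng = np.random.default_rng(seed)
          obs = r_squared(X, y)
          null = [r_squared(X, rng.permutation(y)) for _ in range(n_perm)]
          return obs, (1 + sum(s >= obs for s in null)) / (n_perm + 1)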

  11. Efficient identification of context dependent subgroups of risk from genome wide association studies

    PubMed Central

    Dyson, Greg; Sing, Charles F.

    2014-01-01

    We have developed a modified Patient Rule-Induction Method (PRIM) as an alternative strategy for analyzing representative samples of non-experimental human data to estimate and test the role of genomic variations as predictors of disease risk in etiologically heterogeneous sub-samples. A computational limit of the proposed strategy is encountered when the number of genomic variations (predictor variables) under study is large (> 500) because permutations are used to generate a null distribution to test the significance of a term (defined by values of particular variables) that characterizes a sub-sample of individuals through the peeling and pasting processes. As an alternative, in this paper we introduce a theoretical strategy that facilitates the quick calculation of Type I and Type II errors in the evaluation of terms in the peeling and pasting processes carried out in the execution of a PRIM analysis that are underestimated and non-existent, respectively, when a permutation-based hypothesis test is employed. The resultant savings in computational time makes possible the consideration of larger numbers of genomic variations (an example genome wide association study is given) in the selection of statistically significant terms in the formulation of PRIM prediction models. PMID:24570412

  12. A Novel Bearing Multi-Fault Diagnosis Approach Based on Weighted Permutation Entropy and an Improved SVM Ensemble Classifier.

    PubMed

    Zhou, Shenghan; Qian, Silin; Chang, Wenbing; Xiao, Yiyong; Cheng, Yang

    2018-06-14

    Timely and accurate state detection and fault diagnosis of rolling element bearings are very critical to ensuring the reliability of rotating machinery. This paper proposes a novel method of rolling bearing fault diagnosis based on a combination of ensemble empirical mode decomposition (EEMD), weighted permutation entropy (WPE) and an improved support vector machine (SVM) ensemble classifier. A hybrid voting (HV) strategy that combines SVM-based classifiers and cloud similarity measurement (CSM) was employed to improve the classification accuracy. First, the WPE value of the bearing vibration signal was calculated to detect the fault. Secondly, if a bearing fault occurred, the vibration signal was decomposed into a set of intrinsic mode functions (IMFs) by EEMD. The WPE values of the first several IMFs were calculated to form the fault feature vectors. Then, the SVM ensemble classifier was composed of binary SVM and the HV strategy to identify the bearing multi-fault types. Finally, the proposed model was fully evaluated by experiments and comparative studies. The results demonstrate that the proposed method can effectively detect bearing faults and maintain a high accuracy rate of fault recognition when a small number of training samples are available.
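
    The fault feature used in this record is the weighted permutation entropy of the raw signal and of its EEMD modes. The sketch below computes a WPE of the common variance-weighted form; the EEMD decomposition and the SVM ensemble with hybrid voting are not reproduced, and the embedding order and delay are illustrative choices.

      import numpy as np
      from math import factorial, log

      def weighted_permutation_entropy(x, order=4, delay=1):
          """Variance-weighted permutation entropy, normalised to [0, 1]: ordinal
          patterns are weighted by the variance of the embedding vector they
          come from, so large-amplitude structure counts more than weak noise."""
          x = np.asarray(x, float)
          n = len(x) - (order - 1) * delay
          weights = {}
          for i in range(n):
              v = x[i: i + order * delay: delay]
              pattern = tuple(np.argsort(v))
              weights[pattern] = weights.get(pattern, 0.0) + v.var()
          total = sum(weights.values())
          if total == 0.0:                      # constant signal
              return 0.0
          p = np.array(list(weights.values())) / total
          p = p[p > 0]
          return float(-(p * np.log(p)).sum() / log(factorial(order)))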

  13. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers.

    PubMed

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  14. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.
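
    Both versions of this record rely on the complexity-entropy causality plane: each velocity signal is mapped to its normalised permutation entropy H and its Jensen-Shannon statistical complexity C. A compact sketch of that mapping is given below; the ordinal pattern order and delay are illustrative, and the additional time-scale selection applied to the boundary-layer signals is not shown.

      import numpy as np
      from itertools import permutations
      from math import factorial

      def ordinal_distribution(x, order=5, delay=1):
          """Bandt-Pompe ordinal-pattern probabilities of a scalar time series."""
          x = np.asarray(x, float)
          n = len(x) - (order - 1) * delay
          counts = {}
          for i in range(n):
              pat = tuple(np.argsort(x[i: i + order * delay: delay]))
              counts[pat] = counts.get(pat, 0) + 1
          p = np.array([counts.get(pat, 0) for pat in permutations(range(order))],
                       dtype=float)
          return p / p.sum()

      def shannon(p):
          p = p[p > 0]
          return float(-(p * np.log(p)).sum())

      def complexity_entropy(x, order=5, delay=1):
          """Point (H, C) on the complexity-entropy causality plane: normalised
          permutation entropy H and Jensen-Shannon statistical complexity C."""
          p = ordinal_distribution(x, order, delay)
          n = len(p)
          u = np.full(n, 1.0 / n)
          h = shannon(p) / np.log(n)
          js = shannon(0.5 * (p + u)) - 0.5 * shannon(p) - 0.5 * shannon(u)
          js_max = -0.5 * ((n + 1) / n * np.log(n + 1) - 2 * np.log(2 * n) + np.log(n))
          return h, (js / js_max) * h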

  15. Algorithms and programming tools for image processing on the MPP:3

    NASA Technical Reports Server (NTRS)

    Reeves, Anthony P.

    1987-01-01

    This is the third and final report on the work done for NASA Grant 5-403 on Algorithms and Programming Tools for Image Processing on the MPP:3. All the work done for this grant is summarized in the introduction. Work done since August 1986 is reported in detail. Research for this grant falls under the following headings: (1) fundamental algorithms for the MPP; (2) programming utilities for the MPP; (3) the Parallel Pascal Development System; and (4) performance analysis. In this report, the results of two efforts are reported: region growing, and performance analysis of important characteristic algorithms. In each case, timing results from MPP implementations are included. A paper is included in which parallel algorithms for region growing on the MPP are discussed. These algorithms permit different-sized regions to be merged in parallel. Details on the implementation and performance of several important MPP algorithms are given. These include a number of standard permutations, the FFT, convolution, arbitrary data mappings, image warping, and pyramid operations, all of which have been implemented on the MPP. The permutation and image warping functions have been included in the standard development system library.

  16. A power analysis for multivariate tests of temporal trend in species composition.

    PubMed

    Irvine, Kathryn M; Dinger, Eric C; Sarr, Daniel

    2011-10-01

    Long-term monitoring programs emphasize power analysis as a tool to determine the sampling effort necessary to effectively document ecologically significant changes in ecosystems. Programs that monitor entire multispecies assemblages require a method for determining the power of multivariate statistical models to detect trends. We provide a method to simulate presence-absence species assemblage data that are consistent with increasing or decreasing directional change in species composition within multiple sites. This step is the foundation for using Monte Carlo methods to approximate the power of any multivariate method for detecting temporal trends. We focus on comparing the power of the Mantel test, permutational multivariate analysis of variance, and constrained analysis of principal coordinates. We find that the power of the various methods we investigate is sensitive to the number of species in the community, univariate species patterns, and the number of sites sampled over time. For increasing directional change scenarios, constrained analysis of principal coordinates was as or more powerful than permutational multivariate analysis of variance, while the Mantel test was the least powerful. However, in our investigation of decreasing directional change, the Mantel test was typically as or more powerful than the other models.

  17. Space Communications and Navigation (SCaN) Network Simulation Tool Development and Its Use Cases

    NASA Technical Reports Server (NTRS)

    Jennings, Esther; Borgen, Richard; Nguyen, Sam; Segui, John; Stoenescu, Tudor; Wang, Shin-Ywan; Woo, Simon; Barritt, Brian; Chevalier, Christine; Eddy, Wesley

    2009-01-01

    In this work, we focus on the development of a simulation tool to assist in analysis of current and future (proposed) network architectures for NASA. Specifically, the Space Communications and Navigation (SCaN) Network is being architected as an integrated set of new assets and a federation of upgraded legacy systems. The SCaN architecture for the initial missions for returning humans to the moon and beyond will include the Space Network (SN) and the Near-Earth Network (NEN). In addition to SCaN, the initial mission scenario involves a Crew Exploration Vehicle (CEV), the International Space Station (ISS) and NASA Integrated Services Network (NISN). We call the tool being developed the SCaN Network Integration and Engineering (SCaN NI&E) Simulator. The intended uses of such a simulator are: (1) to characterize performance of particular protocols and configurations in mission planning phases; (2) to optimize system configurations by testing a larger parameter space than may be feasible in either production networks or an emulated environment; (3) to test solutions in order to find issues/risks before committing more significant resources needed to produce real hardware or flight software systems. We describe two use cases of the tool: (1) standalone simulation of CEV to ISS baseline scenario to determine network performance, (2) participation in Distributed Simulation Integration Laboratory (DSIL) tests to perform function testing and verify interface and interoperability of geographically dispersed simulations/emulations.

  18. On Correlated-noise Analyses Applied to Exoplanet Light Curves

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Loredo, Thomas J.; Lust, Nate B.; Blecic, Jasmina; Stemm, Madison

    2017-01-01

    Time-correlated noise is a significant source of uncertainty when modeling exoplanet light-curve data. A correct assessment of correlated noise is fundamental to determining the true statistical significance of our findings. Here, we review three of the most widely used correlated-noise estimators in the exoplanet field: the time-averaging, residual-permutation, and wavelet-likelihood methods. We argue that the residual-permutation method is unsound in estimating the uncertainty of parameter estimates. We thus recommend refraining from this method altogether. We characterize the behavior of the time-averaging method's rms-versus-bin-size curves at bin sizes similar to the total observation duration, which may lead to underestimated uncertainties. For the wavelet-likelihood method, we note errors in the published equations and provide a list of corrections. We further assess the performance of these techniques by injecting and retrieving eclipse signals into synthetic and real Spitzer light curves, analyzing the results in terms of the relative-accuracy and coverage-fraction statistics. Both the time-averaging and wavelet-likelihood methods significantly improve the estimate of the eclipse depth over a white-noise analysis (a Markov-chain Monte Carlo exploration assuming uncorrelated noise). However, the corrections are not perfect: when retrieving the eclipse depth from Spitzer data sets, these methods covered the true (injected) depth within the 68% credible region in only ~45%-65% of the trials. Lastly, we present our open-source model-fitting tool, Multi-Core Markov-Chain Monte Carlo (MC3). This package uses Bayesian statistics to estimate the best-fitting values and the credible regions for the parameters of a (user-provided) model. MC3 is a Python/C code, available at https://github.com/pcubillos/MCcubed.
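
    Of the three estimators reviewed, the time-averaging method is the simplest: the rms of bin-averaged residuals is compared with the white-noise expectation as the bin size grows, and any excess is read as correlated noise. A minimal sketch is shown below; the finite-bin correction factor and the synthetic residuals are illustrative assumptions rather than the paper's exact prescription.

      import numpy as np

      def rms_vs_binsize(residuals, bin_sizes):
          """Time-averaging test: rms of bin-averaged residuals versus bin size,
          alongside the white-noise expectation sigma / sqrt(N) (with a small
          finite-bin correction)."""
          residuals = np.asarray(residuals, float)
          sigma = residuals.std(ddof=1)
          obs, exp = [], []
          for n in bin_sizes:
              m = len(residuals) // n                   # number of full bins
              binned = residuals[:m * n].reshape(m, n).mean(axis=1)
              obs.append(np.sqrt(np.mean(binned**2)))
              exp.append(sigma / np.sqrt(n) * np.sqrt(m / (m - 1.0)))
          return np.array(obs), np.array(exp)

      rng = np.random.default_rng(1)
      white = rng.normal(0.0, 1e-3, 4096)               # white-noise example
      obs, exp = rms_vs_binsize(white, np.arange(2, 65))
      beta = np.median(obs / exp)                       # ~1 for uncorrelated noise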

  19. What Happens to bone health during and after spaceflight?

    NASA Technical Reports Server (NTRS)

    Sibonga, Jean D.; Evans, Harlan J.; Spector, Elisabeth R.; Maddocks, Mary J.; Smith, Scott A.; Shackelford, Linda C.; LeBlanc, Adrian D.

    2006-01-01

    Weightless conditions of space flight accelerate bone loss. There are no reports to date that address whether the bone that is lost during spaceflight could ever be recovered. Space-induced bone loss in astronauts is evaluated at the Johnson Space Center (JSC) by measuring bone mineral density (BMD) with dual-energy x-ray absorptiometry (DXA) scans. Astronauts are routinely scanned preflight and at various time points postflight (greater than or equal to Return+2 days). Two sets of BMD data were used to model spaceflight-induced loss and skeletal recovery in crewmembers following long-duration spaceflight missions (4-6 months). Group I was from astronauts (n=7) who were systematically scanned at multiple time points during the postflight period as part of a research protocol to investigate skeletal recovery. Group II came from a total of 49 sets of preflight and postflight data obtained by different protocols. These data were from 39 different crewmembers, some of whom served on multiple flights. Changes in BMD (between pre- and postflight BMD) were plotted as a function of time (days-after-landing); plotted data were fitted to an exponential equation, which enabled estimations of i) BMD change at day 0 after landing and ii) the number of days by which 50% of the lost bone is recovered (half-life). These fits were performed for BMD of the lumbar spine, trochanter, pelvis, femoral neck and calcaneus. There was consistency between the models for BMD recovery. Based upon the exponential model of BMD restoration, recovery following long-duration missions appears to be substantially complete in crewmembers within 36 months following return to Earth.
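
    The recovery model in this record is a simple exponential: the BMD deficit measured after landing decays with a characteristic half-life. The sketch below fits that form with a least-squares routine; the data points are hypothetical numbers chosen only to exercise the fit and are not the JSC crewmember measurements.

      import numpy as np
      from scipy.optimize import curve_fit

      def recovery_model(t, delta0, t_half):
          """Percent BMD change t days after landing: an initial deficit delta0
          decaying exponentially with half-life t_half (days)."""
          return delta0 * np.exp(-np.log(2.0) * t / t_half)

      # Hypothetical data (not the JSC measurements): a 7% deficit at landing
      # recovering with a ~200-day half-life, plus measurement scatter.
      rng = np.random.default_rng(0)
      t_obs = np.array([5.0, 30.0, 90.0, 180.0, 360.0, 720.0, 1000.0])
      y_obs = recovery_model(t_obs, -7.0, 200.0) + rng.normal(0.0, 0.4, t_obs.size)

      (d0, t_half), _ = curve_fit(recovery_model, t_obs, y_obs, p0=(-5.0, 150.0))
      print(f"deficit at landing ~{d0:.1f}%, 50% recovered after ~{t_half:.0f} days")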

  20. Noncommutative de Rham Cohomology of Finite Groups

    NASA Astrophysics Data System (ADS)

    Castellani, L.; Catenacci, R.; Debernardi, M.; Pagani, C.

    We study de Rham cohomology for various differential calculi on finite groups G up to order 8. These include the permutation group S3, the dihedral group D4 and the quaternion group Q. Poincaré duality holds in every case, and under some assumptions (essentially the existence of a top form) we find that it must hold in general. A short review of the bicovariant (noncommutative) differential calculus on finite G is given for self-consistency. Exterior derivative, exterior product, metric, Hodge dual, connections, torsion, curvature, and bi-invariant integration can be defined algebraically. A projector decomposition of the braiding operator is found, and used in constructing the projector on the space of two-forms. By means of the braiding operator and the metric a knot invariant is defined for any finite group.
