Sample records for cladistic analysis based

  1. Cladistical Analysis of the Jovian and Saturnian Satellite Systems

    NASA Astrophysics Data System (ADS)

    Holt, Timothy R.; Brown, Adrian J.; Nesvorný, David; Horner, Jonathan; Carter, Brad

    2018-06-01

    Jupiter and Saturn each have complex systems of satellites and rings. These satellites can be classified into dynamical groups, implying similar formation scenarios. Recently, a large number of additional irregular satellites, which have yet to be classified, have been discovered around both gas giants. The aim of this paper is to examine the relationships between the satellites and rings of the gas giants, using an analytical technique called cladistics. Cladistics is traditionally used to examine relationships between living organisms, the “tree of life.” In this work, we perform the first cladistical study of objects in a planetary science context. Our method uses the orbital, physical, and compositional characteristics of satellites to classify the objects in the Jovian and Saturnian systems. We find that the major relationships between the satellites in the two systems, such as families, as presented in previous studies, are broadly preserved. In addition, based on our analysis of the Jovian system, we identify a new retrograde irregular family, the Iocaste family, and suggest that the Phoebe family of the Saturnian system can be further divided into two subfamilies. We also propose that the Saturnian irregular families be renamed, to be consistent with the convention used in Jovian families. Using cladistics, we are also able to assign the new unclassified irregular satellites into families. Taken together, the results of this study demonstrate the potential use of the cladistical technique in the investigation of relationships between orbital bodies.
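
    For orientation, the sketch below illustrates the core cladistic step this abstract describes: discretised characteristics are tabulated per object and a candidate tree is scored by parsimony. The satellites, character states, tree topology, and the use of the Fitch algorithm are illustrative assumptions only, not the authors' actual matrix or software.

      # Minimal sketch: score a candidate tree for a tiny, hypothetical character
      # matrix of satellites under Fitch parsimony (fewer steps = more parsimonious).
      characters = {            # taxon -> tuple of discretised character states
          "Himalia": (1, 0, 2),
          "Leda":    (1, 0, 2),
          "Ananke":  (0, 1, 1),
          "Carme":   (0, 1, 0),
          "Phoebe":  (0, 1, 0),
      }

      # Candidate tree as nested tuples; leaves are taxon names.
      tree = ((("Himalia", "Leda"), "Ananke"), ("Carme", "Phoebe"))

      def fitch(node, i):
          """Return (state set, parsimony steps) for character i on a subtree."""
          if isinstance(node, str):                       # leaf
              return {characters[node][i]}, 0
          left_states, left_steps = fitch(node[0], i)
          right_states, right_steps = fitch(node[1], i)
          shared = left_states & right_states
          if shared:                                      # agreement: no extra step
              return shared, left_steps + right_steps
          return left_states | right_states, left_steps + right_steps + 1

      tree_length = sum(fitch(tree, i)[1] for i in range(3))
      print("parsimony length of candidate tree:", tree_length)   # -> 4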

  2. Evolution of amino acid metabolism inferred through cladistic analysis.

    PubMed

    Cunchillos, Chomin; Lecointre, Guillaume

    2003-11-28

    Because free amino acids were most probably available in primitive abiotic environments, their metabolism is likely to have provided some of the very first metabolic pathways of life. What were the first enzymatic reactions to emerge? A cladistic analysis of metabolic pathways of the 16 aliphatic amino acids and 2 portions of the Krebs cycle was performed using four criteria of homology. The analysis is not based on sequence comparisons but, rather, on coding similarities in enzyme properties. The properties used are shared specific enzymatic activity, shared enzymatic function without substrate specificity, shared coenzymes, and shared functional family. The tree shows that the earliest pathways to emerge are not portions of the Krebs cycle but metabolisms of aspartate, asparagine, glutamate, and glutamine. The views of Horowitz (Horowitz, N. H. (1945) Proc. Natl. Acad. Sci. U. S. A. 31, 153-157) and Cordón (Cordón, F. (1990) Tratado Evolucionista de Biologia, Aguilar, Madrid, Spain), according to which the upstream reactions in the catabolic pathways and the downstream reactions in the anabolic pathways are the earliest in evolution, are globally corroborated, albeit with some exceptions. These are due to later opportunistic connections of pathways (actually already suggested by these authors). The earliest enzymatic functions are mostly catabolic; they were deaminations, transaminations, and decarboxylations. From the consensus tree we extracted four time spans for amino acid metabolism development. For some amino acids, catabolism and biosynthesis occurred at the same time (Asp, Glu, Lys, Leu, Ala, Val, Ile, Pro, Arg). For others, the ultimate reactions that use amino acids as a substrate or as a product are distinct in time, with catabolism preceding anabolism for Asn, Gln, and Cys and anabolism preceding catabolism for Ser, Met, and Thr. Cladistic analysis of the structure of biochemical pathways makes hypotheses in biochemical evolution explicit and parsimonious.

  3. A cladistic analysis of Aristotle's animal groups in the Historia animalium.

    PubMed

    von Lieven, Alexander Fürst; Humar, Marcel

    2008-01-01

    The Historia animalium (HA) of Aristotle contains an extraordinarily rich compilation of descriptions of animal anatomy, development, and behaviour. It is believed that Aristotle's aim in HA was to describe the correlations of characters rather than to classify or define animal groups. In order to assess whether Aristotle, while organising his character correlations, referred to a pre-existing classification that underlies the descriptions in HA, we carried out a cladistic analysis according to the following procedure: by disentangling 147 species and 40 higher taxon designations from 157 predicates in the texts, we transcribed Aristotle's descriptions on anatomy and development of animals in books I-V of HA into a character matrix for a cladistic analysis. By analysing the distribution of characters as described in his books, we obtained a non-phylogenetic dendrogram displaying 58 monophyletic groups, 29 of which have equivalents among Aristotle's group designations. Eleven Aristotelian groupings turned out to be non-monophyletic, and six of them are inconsistent with the monophyletic groups. Twelve of the 29 taxa without equivalents in Aristotle's works have equivalents in modern classifications. With this analysis we demonstrate that a fairly consistent underlying classification exists in the zoological works of Aristotle. The peculiarities of Aristotle's character basis are discussed and the dendrogram is compared with a current phylogenetic tree.

  4. Investigating the origins of the Irregular satellites using Cladistics

    NASA Astrophysics Data System (ADS)

    Holt, Timothy; Horner, Jonti; Tylor, Christopher; Nesvorny, David; Brown, Adrian; Carter, Brad

    2017-10-01

    The irregular satellites of Jupiter and Saturn are thought to be objects captured during a period of instability in the early solar system. However, the precise origins of these small bodies remain elusive. We use cladistics, a technique traditionally used by biologists, to help constrain the origins of these bodies. Our research contributes to a growing body of work that uses cladistics in astronomy, collectively called astrocladistics. We present one of the first instances of cladistics being used in a planetary science context. The analysis uses physical and compositional characteristics of three prograde Jovian irregular satellites (Themisto, Leda & Himalia), five retrograde Jovian irregular satellites (Ananke, Carme, Pasiphae, Sinope & Callirrhoe), along with Phoebe, a retrograde irregular satellite of Saturn, and several other regular Jovian and Saturnian satellites. Each of these members is a representative of its respective taxonomic group. The irregular satellites are compared with other well-studied solar system bodies, including satellites, terrestrial planets, main belt asteroids, comets, and minor planets. We find that the Jovian irregular satellites cluster with asteroids and Ceres. The Saturnian satellites studied here are found to form an association with the comets, adding to the narrative of exchange between the outer solar system and Saturnian orbital space. Both of these results demonstrate the utility of cladistics as an analysis tool for the planetary sciences.

  5. dCITE: Measuring Necessary Cladistic Information Can Help You Reduce Polytomy Artefacts in Trees.

    PubMed

    Wise, Michael J

    2016-01-01

    Biologists regularly create phylogenetic trees to better understand the evolutionary origins of their species of interest, and often use genomes as their data source. However, as more and more incomplete genomes are published, in many cases it may not be possible to compute genome-based phylogenetic trees due to large gaps in the assembled sequences. In addition, comparison of complete genomes may not even be desirable due to the presence of horizontally acquired and homologous genes. A decision must therefore be made about which gene, or gene combinations, should be used to compute a tree. Deflated Cladistic Information based on Total Entropy (dCITE) is proposed as an easily computed metric for measuring the cladistic information in multiple sequence alignments representing a range of taxa, without the need to first compute the corresponding trees. dCITE scores can be used to rank candidate genes or decide whether input sequences provide insufficient cladistic information, making artefactual polytomies more likely. The dCITE method can be applied to protein, nucleotide or encoded phenotypic data, so can be used to select which data-type is most appropriate, given the choice. In a series of experiments the dCITE method was compared with related measures. Then, as a practical demonstration, the ideas developed in the paper were applied to a dataset representing species from the order Campylobacterales; trees based on sequence combinations, selected on the basis of their dCITE scores, were compared with a tree constructed to mimic Multi-Locus Sequence Typing (MLST) combinations of fragments. We see that the greater the dCITE score the more likely it is that the computed phylogenetic tree will be free of artefactual polytomies. Secondly, cladistic information saturates, beyond which little additional cladistic information can be obtained by adding additional sequences. Finally, sequences with high cladistic information produce more consistent trees for the same taxa.
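
    As a rough intuition for the kind of quantity dCITE builds on, the sketch below sums per-column Shannon entropy over a toy alignment ("total entropy"). This is only an illustration under that assumption; the published dCITE metric adds a deflation step and other details given in the paper, and the sequences here are made up.

      # Illustrative only: per-column Shannon entropy summed over an alignment.
      from collections import Counter
      from math import log2

      alignment = [        # hypothetical aligned sequences, one per taxon
          "ACGTAC",
          "ACGTAT",
          "ACGAAT",
          "TCGAAT",
      ]

      def column_entropy(column):
          counts = Counter(column)
          n = len(column)
          return -sum((c / n) * log2(c / n) for c in counts.values())

      total_entropy = sum(
          column_entropy([seq[i] for seq in alignment])
          for i in range(len(alignment[0]))
      )
      print(f"total column entropy: {total_entropy:.3f} bits")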

  6. dCITE: Measuring Necessary Cladistic Information Can Help You Reduce Polytomy Artefacts in Trees

    PubMed Central

    2016-01-01

    Biologists regularly create phylogenetic trees to better understand the evolutionary origins of their species of interest, and often use genomes as their data source. However, as more and more incomplete genomes are published, in many cases it may not be possible to compute genome-based phylogenetic trees due to large gaps in the assembled sequences. In addition, comparison of complete genomes may not even be desirable due to the presence of horizontally acquired and homologous genes. A decision must therefore be made about which gene, or gene combinations, should be used to compute a tree. Deflated Cladistic Information based on Total Entropy (dCITE) is proposed as an easily computed metric for measuring the cladistic information in multiple sequence alignments representing a range of taxa, without the need to first compute the corresponding trees. dCITE scores can be used to rank candidate genes or decide whether input sequences provide insufficient cladistic information, making artefactual polytomies more likely. The dCITE method can be applied to protein, nucleotide or encoded phenotypic data, so can be used to select which data-type is most appropriate, given the choice. In a series of experiments the dCITE method was compared with related measures. Then, as a practical demonstration, the ideas developed in the paper were applied to a dataset representing species from the order Campylobacterales; trees based on sequence combinations, selected on the basis of their dCITE scores, were compared with a tree constructed to mimic Multi-Locus Sequence Typing (MLST) combinations of fragments. We see that the greater the dCITE score the more likely it is that the computed phylogenetic tree will be free of artefactual polytomies. Secondly, cladistic information saturates, beyond which little additional cladistic information can be obtained by adding additional sequences. Finally, sequences with high cladistic information produce more consistent trees for the same taxa

  7. Pattern cladistics and the 'realism-antirealism debate' in the philosophy of biology.

    PubMed

    Vergara-Silva, Francisco

    2009-06-01

    Despite the amount of work that has been produced on the subject over the years, the 'transformation of cladistics' is still a misunderstood episode in the history of comparative biology. Here, I analyze two outstanding, highly contrasting historiographic accounts on the matter, under the perspective of an influential dichotomy in the philosophy of science: the opposition between Scientific Realism and Empiricism. Placing special emphasis on the notion of 'causal grounding' of morphological characters (sensu Olivier Rieppel) in modern developmental biology's (mechanistic) theories, I arrive at the conclusion that a 'new transformation of cladistics' is philosophically plausible. This 'reformed' understanding of 'pattern cladistics' entails retaining the interpretation of cladograms as 'schemes of synapomorphies', but in association with construing cladogram nodes as 'developmental-genetic taxic homologies', instead of 'standard Darwinian ancestors'. The reinterpretation of pattern cladistics presented here additionally proposes to take Bas van Fraassen's 'constructive empiricism' as a philosophical stance that could properly support such analysis of developmental-genetic data for systematic purposes. The latter suggestion is justified through a reappraisal of previous ideas developed by prominent pattern cladists (mainly, Colin Patterson), which concerned a scientifically efficient 'observable/non-observable distinction' linked to the conceptual pair 'ontogeny and phylogeny'. Finally, I argue that a robust articulation of Antirealist alternatives in systematics may provide a rational basis for its disciplinary separation from evolutionary biology, as well as for a critical reconsideration of the proper role of certain Scientific Realist positions, currently popular in comparative biology.

  8. Cladistic analysis of Bantu languages: a new tree based on combined lexical and grammatical data

    NASA Astrophysics Data System (ADS)

    Rexová, Kateřina; Bastin, Yvonne; Frynta, Daniel

    2006-04-01

    The phylogeny of the Bantu languages is reconstructed by application of the cladistic methodology to the combined lexical and grammatical data (87 languages, 144 characters). A maximum parsimony tree and Bayesian analysis supported some previously recognized clades, e.g., that of eastern and southern Bantu languages. Moreover, the results revealed that Bantu languages south and east of the equatorial forest are probably monophyletic. It suggests an unorthodox scenario of Bantu expansion including (after initial radiation in their homelands and neighboring territories) just a single passage through rainforest areas followed by a subsequent divergence into major clades. The likely localization of this divergence is in the area west of the Great Lakes. It conforms to the view that demographic expansion and dispersal throughout the dry-forests and savanna regions of subequatorial Africa was associated with the acquisition of new technologies (iron metallurgy and grain cultivation).

  9. A Cladist is a systematist who seeks a natural classification: some comments on Quinn (2017).

    PubMed

    Williams, David M; Ebach, Malte C

    2018-01-01

    In response to Quinn (Biol Philos, 2017. 10.1007/s10539-017-9577-z) we identify cladistics to be about natural classifications and their discovery and thereby propose to add an eighth cladistic definition to Quinn's list, namely the systematist who seeks to discover natural classifications, regardless of their affiliation, theoretical or methodological justifications.

  10. The evolution of the centrifugal visual system of vertebrates. A cladistic analysis and new hypotheses.

    PubMed

    Repérant, J; Médina, M; Ward, R; Miceli, D; Kenigfest, N B; Rio, J P; Vesselkin, N P

    2007-01-01

    In a recent review of the available data concerning the centrifugal visual system (CVS) of vertebrates [Repérant, J., Ward, R., Miceli, D., Rio, J.P., Médina, M., Kenigfest, N.B., Vesselkin, N.P., 2006. The centrifugal visual system of vertebrates: a comparative analysis of its functional anatomical organization, Brain Res. Rev. 52, 1-57], we have shown that this feature of the visual system is not a particularity of birds, but is a permanent component of the vertebrate central nervous system which nevertheless shows considerable morphological and functional variation from one taxonomic group to another. Given these findings, the primary objective of the present article is an attempt to specify the evolutionary significance of this phylogenetic diversity. We begin by drawing up an inventory of this variation under several headings: the intracerebral location of the retinopetal neurons; the mode of intra-retinal arborizations of the centrifugal fibres and the nature of their targets; their neurochemical properties; and the afferent supplies of these neurons. We subsequently discuss these variations, particularly that of the intracerebral location of the retinopetal neurons during development and in adult forms, using the neuromeric terminology and in the framework of cladistic analysis, and seek to interpret them in a phylogenetic context. From this analysis, it becomes evident that the CVS is not a homogeneous entity formed by neurons with a common embryological origin, but rather a collection of at least eight distinct subsystems arising in very different regions of the neuraxis. These are the olfacto-retinal, dorsal thalamo-retinal, ventral thalamo-retinal, pretecto-retinal, tecto-retinal, tegmento-mesencephalo-retinal, dorsal isthmo-retinal and ventral isthmo-retinal systems. The olfacto-retinal system, which is probably absent in Agnatha, appears to be a plesiomorphic characteristic of all Gnathostomata, while on the other hand the tegmento

  11. Morphological cladistic study of coregonine fishes

    USGS Publications Warehouse

    Smith, G.R.; Todd, T.N.

    1992-01-01

    A cladistic analysis of 50 characters from 26 taxa of coregonine fishes and two outgroup taxa yields a phylogenetic tree with two major branches, best summarized as two genera - Prosopium and Coregonus. Presence of teeth on the palatine, long maxillae, and long supramaxillae are primitive, whereas loss of teeth, short or notched maxillae, and short supramaxillae are derived traits. P. coulteri and C. huntsmani are morphologically and phylogenetically primitive members of their groups. The widespread species P. cylindraceum and P. williamsoni are morphologically advanced in parallel with the subgenus Coregonus (whitefishes): they share subterminal mouths, short jaws, and reduced teeth. Prosopium gemmifer parallels the ciscoes, subgenus Leucichthys. The whitefishes, C. ussuriensis, C. lavaretus, C. clupeaformis, and C. nasus are a monophyletic group, the subgenus Coregonus. The subgenus Leucichthys is a diverse, relatively plesiomorphic assemblage, widespread in the Holarctic region. This assemblage includes the inconnu, Stenodus.

  12. Multivariate and Cladistic Analyses of Isolated Teeth Reveal Sympatry of Theropod Dinosaurs in the Late Jurassic of Northern Germany.

    PubMed

    Gerke, Oliver; Wings, Oliver

    2016-01-01

    Remains of theropod dinosaurs are very rare in Northern Germany because the area was repeatedly submerged by a shallow epicontinental sea during the Mesozoic. Here, 80 Late Jurassic theropod teeth are described, the majority of which were collected over decades from marine carbonates in now-abandoned and backfilled 19th-century quarries. Eighteen different morphotypes (A-R) could be distinguished, and 3D models based on micro-CT scans of the best examples of all morphotypes are included as supplements. The teeth were identified with the assistance of discriminant function analysis and cladistic analysis based on updated datamatrices. The results show that a large variety of theropod groups were present in the Late Jurassic of northern Germany. Identified specimens comprise basal Tyrannosauroidea, as well as Allosauroidea, Megalosauroidea cf. Marshosaurus, Megalosauridae cf. Torvosaurus and probably Ceratosauria. The formerly reported presence of Dromaeosauridae in the Late Jurassic of northern Germany could not be confirmed. Some teeth of this study resemble specimens described as pertaining to Carcharodontosauria (morphotype A) and Abelisauridae (morphotype K). This interpretation is, however, not supported by discriminant function analysis and cladistic analysis. Two smaller morphotypes (N and Q) differ only in some probably size-related characteristics from larger morphotypes (B and C) and could well represent juveniles of adult specimens. The similarity of the northern German theropods with groups from contemporaneous localities suggests faunal exchange via land-connections in the Late Jurassic between Germany, Portugal and North America.
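
    The discriminant step mentioned above can be pictured with a small worked example: reference teeth with known affinities train a linear discriminant function, and an isolated tooth is assigned to the closest group. The measurements, group labels, and unknown tooth below are hypothetical placeholders, not the paper's datamatrix.

      # Hedged sketch of discriminant function analysis on tooth measurements.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # columns: crown height (mm), fore-aft basal length (mm), denticles per mm
      X = np.array([
          [42.0, 18.5, 2.1],   # Allosauroidea reference teeth
          [44.5, 19.0, 2.0],
          [61.0, 27.5, 1.2],   # Megalosauridae reference teeth
          [58.0, 26.0, 1.3],
          [23.0,  9.5, 3.4],   # basal Tyrannosauroidea reference teeth
          [25.0, 10.0, 3.1],
      ])
      y = ["Allosauroidea"] * 2 + ["Megalosauridae"] * 2 + ["Tyrannosauroidea"] * 2

      lda = LinearDiscriminantAnalysis().fit(X, y)
      unknown_tooth = np.array([[43.0, 18.0, 2.2]])
      print(lda.predict(unknown_tooth))        # most similar reference group
      print(lda.predict_proba(unknown_tooth))  # posterior membership probabilities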

  13. A Cladistic Analysis of Phenotypic Associations with Haplotypes Inferred from Restriction Endonuclease Mapping. IV. Nested Analyses with Cladogram Uncertainty and Recombination

    PubMed Central

    Templeton, A. R.; Sing, C. F.

    1993-01-01

    We previously developed an analytical strategy based on cladistic theory to identify subsets of haplotypes that are associated with significant phenotypic deviations. Our initial approach was limited to segments of DNA in which little recombination occurs. In such cases, a cladogram can be constructed from the restriction site data to estimate the evolutionary steps that interrelate the observed haplotypes to one another. The cladogram is then used to define a nested statistical design for identifying mutational steps associated with significant phenotypic deviations. The central assumption behind this strategy is that a mutation responsible for a particular phenotypic effect is embedded within the evolutionary history that is represented by the cladogram. The power of this approach depends on the accuracy of the cladogram in portraying the evolutionary history of the DNA region. This accuracy can be diminished both by recombination and by uncertainty in the estimated cladogram topology. In a previous paper, we presented an algorithm for estimating the set of likely cladograms and recombination events. In this paper we present an algorithm for defining a nested statistical design under cladogram uncertainty and recombination. Given the nested design, phenotypic associations can be examined using either a nested analysis of variance (for haploids or homozygous strains) or permutation testing (for outcrossed, diploid gene regions). In this paper we also extend this analytical strategy to include categorical phenotypes in addition to quantitative phenotypes. Some worked examples are presented using Drosophila data sets. These examples illustrate that having some recombination may actually enhance the biological inferences that may be derived from a cladistic analysis. In particular, recombination can be used to assign a physical localization to a given subregion for mutations responsible for significant phenotypic effects. PMID:8100789
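
    The permutation testing the authors mention for outcrossed diploid data can be sketched as follows: phenotypes are shuffled across individuals and the observed between-clade difference is compared with the shuffled distribution. The clade labels, phenotype values, and simple two-group design below are illustrative assumptions; the paper's nested design and test statistics are more elaborate.

      # Minimal permutation test for a phenotype difference between two haplotype clades.
      import random

      phenotype = [5.1, 4.8, 6.2, 6.0, 5.9, 4.5, 6.4, 5.0]
      clade     = ["A", "A", "B", "B", "B", "A", "B", "A"]   # illustrative clade labels

      def mean_diff(values, labels):
          a = [v for v, c in zip(values, labels) if c == "A"]
          b = [v for v, c in zip(values, labels) if c == "B"]
          return abs(sum(a) / len(a) - sum(b) / len(b))

      observed = mean_diff(phenotype, clade)
      random.seed(0)
      n_perm = 10_000
      hits = sum(
          mean_diff(random.sample(phenotype, len(phenotype)), clade) >= observed
          for _ in range(n_perm)
      )
      print(f"observed difference = {observed:.2f}, permutation p ≈ {hits / n_perm:.4f}")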

  14. Mitochondrial DNA haplogroup phylogeny of the dog: Proposal for a cladistic nomenclature.

    PubMed

    Fregel, Rosa; Suárez, Nicolás M; Betancor, Eva; González, Ana M; Cabrera, Vicente M; Pestano, José

    2015-05-01

    Canis lupus familiaris mitochondrial DNA analysis has increased in recent years, not only for the purpose of deciphering dog domestication but also for forensic genetic studies or breed characterization. The resultant accumulation of data has increased the need for a normalized and phylogenetic-based nomenclature like those provided for human maternal lineages. Although a standardized classification has been proposed, haplotype names within clades have been assigned gradually without considering the evolutionary history of dog mtDNA. Moreover, this classification is based only on the D-loop region, proven to be insufficient for phylogenetic purposes due to its high number of recurrent mutations and the lack of relevant information present in the coding region. In this study, we design 1) a refined mtDNA cladistic nomenclature from a phylogenetic tree based on complete sequences, classifying dog maternal lineages into haplogroups defined by specific diagnostic mutations, and 2) a coding region SNP analysis that allows a more accurate classification into haplogroups when combined with D-loop sequencing, thus improving the phylogenetic information obtained in dog mitochondrial DNA studies. Copyright © 2015 Elsevier B.V. All rights reserved.
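
    The classification step described above, combining diagnostic coding-region positions with D-loop information, can be pictured with a toy assignment routine. The haplogroup names, positions, and bases below are hypothetical placeholders, not the diagnostic mutations defined in the paper.

      # Illustrative haplogroup assignment from diagnostic SNP positions.
      DIAGNOSTIC_SNPS = {          # haplogroup -> {position: expected base}
          "A": {1234: "T", 5678: "C"},
          "B": {1234: "C", 9012: "A"},
          "C": {3456: "G"},
      }

      def assign_haplogroup(genotypes):
          """genotypes: dict of position -> observed base for one sample."""
          for haplogroup, snps in DIAGNOSTIC_SNPS.items():
              if all(genotypes.get(pos) == base for pos, base in snps.items()):
                  return haplogroup
          return "unassigned"

      sample = {1234: "T", 5678: "C", 9012: "G"}
      print(assign_haplogroup(sample))   # -> "A" for this made-up sample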

  15. Evo-SETI: A Mathematical Tool for Cladistics, Evolution, and SETI

    PubMed Central

    Maccone, Claudio

    2017-01-01

    The discovery of new exoplanets makes us wonder where each new exoplanet stands along its way to develop life as we know it on Earth. Our Evo-SETI Theory is a mathematical way to face this problem. We describe cladistics and evolution by virtue of a few statistical equations based on lognormal probability density functions (pdf) in the time. We call b-lognormal a lognormal pdf starting at instant b (birth). Then, the lifetime of any living being becomes a suitable b-lognormal in the time. Next, our “Peak-Locus Theorem” translates cladistics: each species created by evolution is a b-lognormal whose peak lies on the exponentially growing number of living species. This exponential is the mean value of a stochastic process called “Geometric Brownian Motion” (GBM). Past mass extinctions were all-lows of this GBM. In addition, the Shannon Entropy (with a reversed sign) of each b-lognormal is the measure of how evolved that species is, and we call it EvoEntropy. The “molecular clock” is re-interpreted as the EvoEntropy straight line in the time whenever the mean value is exactly the GBM exponential. We were also able to extend the Peak-Locus Theorem to any mean value other than the exponential. For example, we derive in this paper for the first time the EvoEntropy corresponding to the Markov-Korotayev (2007) “cubic” evolution: a curve of logarithmic increase. PMID:28383497
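
    The statistical machinery the abstract refers to rests on a few standard forms, written out below for orientation; the exact normalisation of EvoEntropy used in the paper may differ from the plain lognormal entropy quoted here.

      % b-lognormal pdf, a lognormal starting at the birth instant b:
      f(t;\mu,\sigma,b) = \frac{1}{(t-b)\,\sigma\sqrt{2\pi}}
        \exp\!\left[-\frac{\bigl(\ln(t-b)-\mu\bigr)^{2}}{2\sigma^{2}}\right],
        \qquad t > b .

      % Mean value of Geometric Brownian Motion, the exponential on which the
      % peaks of the species b-lognormals are constrained to lie:
      m(t) = m_{0}\, e^{B t}.

      % Differential (Shannon) entropy of a b-lognormal; its sign-reversed form
      % is the basis of EvoEntropy:
      H = \mu + \ln\!\bigl(\sigma\sqrt{2\pi}\bigr) + \tfrac{1}{2}.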

  16. Evo-SETI: A Mathematical Tool for Cladistics, Evolution, and SETI.

    PubMed

    Maccone, Claudio

    2017-04-06

    The discovery of new exoplanets makes us wonder where each new exoplanet stands along its way to develop life as we know it on Earth. Our Evo-SETI Theory is a mathematical way to face this problem. We describe cladistics and evolution by virtue of a few statistical equations based on lognormal probability density functions (pdf) in the time. We call b-lognormal a lognormal pdf starting at instant b (birth). Then, the lifetime of any living being becomes a suitable b-lognormal in the time. Next, our "Peak-Locus Theorem" translates cladistics: each species created by evolution is a b-lognormal whose peak lies on the exponentially growing number of living species. This exponential is the mean value of a stochastic process called "Geometric Brownian Motion" (GBM). Past mass extinctions were all-lows of this GBM. In addition, the Shannon Entropy (with a reversed sign) of each b-lognormal is the measure of how evolved that species is, and we call it EvoEntropy. The "molecular clock" is re-interpreted as the EvoEntropy straight line in the time whenever the mean value is exactly the GBM exponential. We were also able to extend the Peak-Locus Theorem to any mean value other than the exponential. For example, we derive in this paper for the first time the EvoEntropy corresponding to the Markov-Korotayev (2007) "cubic" evolution: a curve of logarithmic increase.

  17. Brain, calvarium, cladistics: A new approach to an old question, who are modern humans and Neandertals?

    PubMed

    Mounier, Aurélien; Balzeau, Antoine; Caparros, Miguel; Grimaud-Hervé, Dominique

    2016-03-01

    The evolutionary history of the genus Homo is the focus of major research efforts in palaeoanthropology. However, the use of palaeoneurology to infer phylogenies of our genus is rare. Here we use cladistics to test the importance of the brain in differentiating and defining Neandertals and modern humans. The analysis is based on morphological data from the calvarium and endocast of Pleistocene fossils and results in a single most parsimonious cladogram. We demonstrate that the joint use of endocranial and calvarial features with cladistics provides a unique means to understand the evolution of the genus Homo. The main results of this study indicate that: (i) the endocranial features are more phylogenetically informative than the characters from the calvarium; (ii) the specific differentiation of Neandertals and modern humans is mostly supported by well-known calvarial autapomorphies; (iii) the endocranial anatomy of modern humans and Neandertals show strong similarities, which appeared in the fossil record with the last common ancestor of both species; and (iv) apart from encephalisation, human endocranial anatomy changed tremendously during the end of the Middle Pleistocene. This may be linked to major cultural and technological novelties that had happened by the end of the Middle Pleistocene (e.g., expansion of the Middle Stone Age (MSA) in Africa and Mousterian in Europe). The combined study of endocranial and exocranial anatomy offers opportunities to further understand human evolution and the implication for the phylogeny of our genus. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Taxonomic revision and cladistic analysis of the Neotropical genus Acrochaeta Wiedemann, 1830 (Diptera: Stratiomyidae: Sarginae).

    PubMed

    Fachin, Diego Aguilar; Amorim, Dalton De Souza

    2015-11-30

    The Neotropical genus Acrochaeta Wiedemann is revised and a cladistic analysis of the genus based on morphological characters is presented. This paper raises the total number of extant Acrochaeta species from 10 to 14 with the description of nine new species, the synonymy of one species, the transfer of five species to other genera and the transfer of one species of Merosargus to Acrochaeta. The new species described (of which eight are from Brazil and one from Bolivia and Peru) are Acrochaeta asapha nov. sp., A. balbii nov. sp., A. dichrostyla nov. sp., A. polychaeta nov. sp., A. pseudofasciata nov. sp., A. pseudopolychaeta nov. sp., A. rhombostyla nov. sp., A. ruschii nov. sp. and A. stigmata nov. sp. The primary types of all Acrochaeta species were studied at least from photos, when possible with the study of dissected male or female terminalia. A. mexicana Lindner is proposed as a junior synonym of A. flaveola Bigot. M. chalconota (Brauer) comb. nov., M. degenerata (Lindner) comb. nov., M. longiventris (Enderlein) comb. nov. and M. picta (Brauer) comb. nov. are herein transferred from Acrochaeta to Merosargus Loew, and Chrysochlorina elegans (Perty) comb. nov. is transferred from Acrochaeta to Chrysochlorina James. A. convexifrons (McFadden) comb. nov. is transferred from Merosargus to Acrochaeta. The limits of the genus and its insertion in the Sarginae are considered, and an updated generic diagnosis is provided. All species of the genus are redescribed and diagnosed, and illustrated with photos of the habitus, thorax, wing, and drawings of the antenna and male and female terminalia. Distribution maps are provided for the species, along with an identification key for adults of all species. Parsimony analyses were carried out under equal and implied weights. Our matrix includes 43 terminal taxa--of which 26 are outgroup species from four different sargine genera--and 59 adult morphological characters. The phylogenetic analysis supports the monophyly of
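
    "Implied weights" here presumably refers to Goloboff-style implied weighting, in which each character is down-weighted according to the homoplasy it shows on a given tree; the formula below is the standard form of that criterion and is included only for orientation.

      % Implied weighting: fit of character i on a tree
      f_i = \frac{k}{k + e_i}, \qquad e_i = s_i - m_i ,
      % where s_i is the observed number of steps of character i on the tree,
      % m_i its minimum possible number of steps, and k the concavity constant;
      % the preferred tree maximises \sum_i f_i.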

  19. The evolution of the dorsal thalamus of jawed vertebrates, including mammals: cladistic analysis and a new hypothesis.

    PubMed

    Butler, A B

    1994-01-01

    The evolution of the dorsal thalamus in various vertebrate lineages of jawed vertebrates has been an enigma, partly due to two prevalent misconceptions: the belief that the multitude of nuclei in the dorsal thalamus of mammals could be meaningfully compared neither with the relatively few nuclei in the dorsal thalamus of anamniotes nor with the intermediate number of dorsal thalamic nuclei of other amniotes and a definition of the dorsal thalamus that too narrowly focused on the features of the dorsal thalamus of mammals. The cladistic analysis carried out here allows us to recognize which features are plesiomorphic and which apomorphic for the dorsal thalamus of jawed vertebrates and to then reconstruct the major changes that have occurred in the dorsal thalamus over evolution. Embryological data examined in the context of Von Baerian theory (embryos of later-descendant species resemble the embryos of earlier-descendant species to the point of their divergence) supports a new 'Dual Elaboration Hypothesis' of dorsal thalamic evolution generated from this cladistic analysis. From the morphotype for an early stage in the embryological development of the dorsal thalamus of jawed vertebrates, the divergent, sequential stages of the development of the dorsal thalamus are derived for each major radiation and compared. The new hypothesis holds that the dorsal thalamus comprises two basic divisions--the collothalamus and the lemnothalamus--that receive their predominant input from the midbrain roof and (plesiomorphically) from lemniscal pathways, including the optic tract, respectively. Where present, the collothalamic, midbrain-sensory relay nuclei are homologous to each other in all vertebrate radiations as discrete nuclei. Within the lemnothalamus, the dorsal lateral geniculate nucleus of mammals and the dorsal lateral optic nucleus of non-synapsid amniotes (diapsid reptiles, birds and turtles) are homologous as discrete nuclei; most or all of the ventral nuclear group

  20. Cognitive cladistics and cultural override in Hominid spatial cognition

    PubMed Central

    Haun, Daniel B. M.; Rapold, Christian J.; Call, Josep; Janzen, Gabriele; Levinson, Stephen C.

    2006-01-01

    Current approaches to human cognition often take a strong nativist stance based on Western adult performance, backed up where possible by neonate and infant research and almost never by comparative research across the Hominidae. Recent research suggests considerable cross-cultural differences in cognitive strategies, including relational thinking, a domain where infant research is impossible because of lack of cognitive maturation. Here, we apply the same paradigm across children and adults of different cultures and across all nonhuman great ape genera. We find that both child and adult spatial cognition systematically varies with language and culture but that, nevertheless, there is a clear inherited bias for one spatial strategy in the great apes. It is reasonable to conclude, we argue, that language and culture mask the native tendencies in our species. This cladistic approach suggests that the correct perspective on human cognition is neither nativist uniformitarian nor “blank slate” but recognizes the powerful impact that language and culture can have on our shared primate cognitive biases. PMID:17079489

  1. Cladistic analysis of extant and fossil African papionins using craniodental data.

    PubMed

    Gilbert, Christopher C

    2013-05-01

    This study examines African papionin phylogenetic history through a comprehensive cladistic analysis of extant and fossil craniodental morphology using both quantitative and qualitative characters. To account for the well-documented influence of allometry on the papionin skull, the general allometric coding method was applied to characters determined to be significantly affected by allometry. Results of the analyses suggest that Parapapio, Pliopapio, and Papio izodi are stem African papionin taxa. Crown Plio-Pleistocene African papionin taxa include Gorgopithecus, Lophocebus cf. albigena, Procercocebus, Soromandrillus (new genus defined herein) quadratirostris, and, most likely, Dinopithecus. Furthermore, S. quadratirostris is a member of a clade also containing Mandrillus, Cercocebus, and Procercocebus; ?Theropithecus baringensis is strongly supported as a primitive member of the genus Theropithecus; Gorgopithecus is closely related to Papio and Lophocebus; and Theropithecus is possibly the most primitive crown African papionin taxon. Finally, character transformation analyses identify a series of morphological transformations during the course of papionin evolution. The origin of crown African papionins is diagnosed, at least in part, by the appearance of definitive and well-developed male maxillary ridges and maxillary fossae. Among crown African papionins, Papio, Lophocebus, and Gorgopithecus are further united by the most extensive development of the maxillary fossae. The Soromandrillus/Mandrillus/Cercocebus/Procercocebus clade is diagnosed by upturned nuchal crests (especially in males), widely divergent temporal lines (especially in males), medially oriented maxillary ridges in males, medially oriented inferior petrous processes, and a tendency to enlarge the premolars as an adaptation for hard-object food processing. The adaptive origins of the genus Theropithecus appear associated with a diet requiring an increase in size of the temporalis, the optimal
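
    One common way to factor out allometry before coding quantitative characters, broadly in the spirit of the approach described above, is to regress log-transformed measurements on a log size proxy and code the residuals. The sketch below illustrates that idea with made-up numbers and a simple threshold coding; it is not the paper's "general allometric coding method" itself.

      # Hedged sketch: size-adjust a quantitative character via regression residuals,
      # then code the residuals into discrete states. All values are hypothetical.
      import numpy as np

      log_size = np.log([120.0, 150.0, 180.0, 210.0, 240.0])   # e.g. cranial length
      log_char = np.log([18.0, 23.0, 26.0, 33.0, 36.0])        # e.g. muzzle depth

      slope, intercept = np.polyfit(log_size, log_char, 1)     # allometric line
      residuals = log_char - (slope * log_size + intercept)    # size-adjusted signal

      # Threshold coding of residuals into three character states.
      states = np.digitize(residuals, bins=[-0.02, 0.02])      # 0 below, 1 near, 2 above
      print("slope:", round(slope, 2), "states:", states.tolist())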

  2. A cladistically based reinterpretation of the taxonomy of two Afrotropical tenebrionid genera Ectateus Koch, 1956 and Selinus Mulsant & Rey, 1853 (Coleoptera, Tenebrionidae, Platynotina).

    PubMed

    Kamiński, Marcin Jan

    2014-01-01

    On the basis of a newly performed cladistic analysis, a new classification of the representatives of two Afrotropical tenebrionid genera, Ectateus Koch, 1956 and Selinus Mulsant & Rey, 1853 sensu Iwan 2002a, is provided. Eleoselinus is described as a new genus. The genus Monodius, previously synonymized with Selinus by Iwan (2002), is redescribed and considered a separate genus. The following new combinations are proposed: Ectateus calcaripes (Gebien, 1904), Monodius laevistriatus (Fairmaire, 1897), Monodius lamottei (Gridelli, 1954), Monodius plicicollis (Fairmaire, 1897), Eleoselinus villiersi (Ardoin, 1965) and Eleoselinus ursynowiensis (Kamiński, 2011). A neotype for Ectateus calcaripes and lectotypes for E. crenatus (Fairmaire, 1897), E. ghesquierei Koch, 1956 and Monodius malaisei malaisei Koch, 1956 are designated to fix the taxonomic status of these taxa. The following synonymies are proposed: Selinus monardi Kaszab, 1951 and Ectateus latipennis Koch, 1956 with E. crenatus (Fairmaire, 1897). Identification keys are provided to all known species of Ectateus sensu novum, Eleoselinus, Monodius and Selinus sensu novum.

  3. Integrating restriction site-associated DNA sequencing (RAD-seq) with morphological cladistic analysis clarifies evolutionary relationships among major species groups of bee orchids

    PubMed Central

    Sramkó, Gábor; Paun, Ovidiu

    2018-01-01

    Background and Aims: Bee orchids (Ophrys) have become the most popular model system for studying reproduction via insect-mediated pseudo-copulation and for exploring the consequent, putatively adaptive, evolutionary radiations. However, despite intensive past research, both the phylogenetic structure and species diversity within the genus remain highly contentious. Here, we integrate next-generation sequencing and morphological cladistic techniques to clarify the phylogeny of the genus. Methods: At least two accessions of each of the ten species groups previously circumscribed from large-scale cloned nuclear ribosomal internal transcribed spacer (nrITS) sequencing were subjected to restriction site-associated sequencing (RAD-seq). The resulting matrix of 4159 single nucleotide polymorphisms (SNPs) for 34 accessions was used to construct an unrooted network and a rooted maximum likelihood phylogeny. A parallel morphological cladistic matrix of 43 characters generated both polymorphic and non-polymorphic sets of parsimony trees before being mapped across the RAD-seq topology. Key Results: RAD-seq data strongly support the monophyly of nine out of ten groups previously circumscribed using nrITS and resolve three major clades; in contrast, supposed microspecies are barely distinguishable. Strong incongruence separated the RAD-seq trees from both the morphological trees and traditional classifications; mapping of the morphological characters across the RAD-seq topology rendered them far more homoplastic. Conclusions: The comparatively high level of morphological homoplasy reflects extensive convergence, whereas the derived placement of the fusca group is attributed to paedomorphic simplification. The phenotype of the most recent common ancestor of the extant lineages is inferred, but it post-dates the majority of the character-state changes that typify the genus. RAD-seq may represent the high-water mark of the contribution of molecular phylogenetics to
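
    The statement that the morphological characters become far more homoplastic when mapped on the RAD-seq topology is usually quantified with the standard consistency and retention indices, reproduced below for orientation; the paper may report additional measures.

      % Consistency index (CI) and retention index (RI) of a character on a tree:
      \mathrm{CI} = \frac{m}{s}, \qquad \mathrm{RI} = \frac{g - s}{g - m},
      % where m is the minimum conceivable number of state changes for the character,
      % s its observed number of changes on the tree, and g the number of changes
      % required on a completely unresolved (star) tree; lower values mean more homoplasy.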

  4. A phylogenetic analysis of the megadiverse Chalcidoidea (Hymenoptera)

    USDA-ARS?s Scientific Manuscript database

    Chalcidoidea (Hymenoptera) are extremely diverse with an estimated 500,000 species. We present the first phylogenetic analysis of the superfamily based on a cladistic analysis of both morphological and molecular data. A total of 233 morphological characters were scored for 300 taxa and 265 genera, a...

  5. Integrating restriction site-associated DNA sequencing (RAD-seq) with morphological cladistic analysis clarifies evolutionary relationships among major species groups of bee orchids.

    PubMed

    Bateman, Richard M; Sramkó, Gábor; Paun, Ovidiu

    2018-01-25

    Bee orchids (Ophrys) have become the most popular model system for studying reproduction via insect-mediated pseudo-copulation and for exploring the consequent, putatively adaptive, evolutionary radiations. However, despite intensive past research, both the phylogenetic structure and species diversity within the genus remain highly contentious. Here, we integrate next-generation sequencing and morphological cladistic techniques to clarify the phylogeny of the genus. At least two accessions of each of the ten species groups previously circumscribed from large-scale cloned nuclear ribosomal internal transcribed spacer (nrITS) sequencing were subjected to restriction site-associated sequencing (RAD-seq). The resulting matrix of 4159 single nucleotide polymorphisms (SNPs) for 34 accessions was used to construct an unrooted network and a rooted maximum likelihood phylogeny. A parallel morphological cladistic matrix of 43 characters generated both polymorphic and non-polymorphic sets of parsimony trees before being mapped across the RAD-seq topology. RAD-seq data strongly support the monophyly of nine out of ten groups previously circumscribed using nrITS and resolve three major clades; in contrast, supposed microspecies are barely distinguishable. Strong incongruence separated the RAD-seq trees from both the morphological trees and traditional classifications; mapping of the morphological characters across the RAD-seq topology rendered them far more homoplastic. The comparatively high level of morphological homoplasy reflects extensive convergence, whereas the derived placement of the fusca group is attributed to paedomorphic simplification. The phenotype of the most recent common ancestor of the extant lineages is inferred, but it post-dates the majority of the character-state changes that typify the genus. RAD-seq may represent the high-water mark of the contribution of molecular phylogenetics to understanding evolution within Ophrys; further progress will require

  6. Revision of the African pollen beetle genera Tarchonanthogethes and Xenostrongylogethes, with insect-host plant relationships, identification key, and cladistic analysis of the Anthystrix genus-complex (Coleoptera: Nitidulidae: Meligethinae).

    PubMed

    Audisio, Paolo; Cline, Andrew R; Trizzino, Marco; Mancini, Emiliano; Antonini, Gloria; Sabatelli, Simone; Cerretti, Pierfilippo

    2015-02-19

    The Afrotropical endemic pollen beetle genera Tarchonanthogethes Audisio & Cline and Xenostrongylogethes Audisio & Cline, of the Anthystrix genus-complex, are revised. Eleven new species of Tarchonanthogethes (T. autumnalis, sp. nov., T. bisignatus, sp. nov., T. fasciatus, sp. nov., T. gratiellae, sp. nov., T. hermani, sp. nov., T. hystrix, sp. nov., T. lilliputianus, sp. nov., T. maasai, sp. nov., T. manconiae, sp. nov., T. pectinipes, sp. nov., T. thalycriformis, sp. nov.) and one new Xenostrongylogethes (X. cychramoides, sp. nov.) are described, illustrated and compared with related taxa. Tarchonanthogethes hirtus Kirejtshuk & Easton, 1988 is synonymized with T. martini (syn. nov.). Meligethes assutus Easton, 1960 from Kenya is transferred from Afrogethes Audisio & Cline to Tarchonanthogethes (comb. nov.). Meligethes singularis Grouvelle, 1919 from southern Africa is transferred from Tarchonanthogethes to Meligethinus Grouvelle, 1906 (comb. nov.). Larval host-plants for Tarchonanthogethes and Xenostrongylogethes include dioecious bushes and trees of Tarchonantheae Asteraceae (genera Brachylaena R.Br. and Tarchonanthus L.). All species currently attributed to the genera Anthystrix Kirejtshuk, Sebastiangethes Audisio, Kirk-Spriggs & Cline, Tarchonanthogethes and Xenostrongylogethes (Anthystrix genus-complex) are included in a morphology-based cladistic analysis to provide a rigorous hypothesis of phylogenetic relationships. An identification key to all 25 known species in the Anthystrix genus-complex, including all available data on insect host plant relationships, is presented.

  7. Revision, cladistic analysis and biogeography of Typhochlaena C. L. Koch, 1850, Pachistopelma Pocock, 1901 and Iridopelma Pocock, 1901 (Araneae, Theraphosidae, Aviculariinae).

    PubMed

    Bertani, Rogério

    2012-01-01

    Three aviculariine genera endemic to Brazil are revised. Typhochlaena C. L. Koch, 1850 is resurrected, including five species; Pachistopelma Pocock, 1901 includes two species; and Iridopelma Pocock, 1901, six species. Nine species are newly described: Typhochlaena amma sp. n., Typhochlaena costae sp. n., Typhochlaena curumim sp. n., Typhochlaena paschoali sp. n., Pachistopelma bromelicola sp. n., Iridopelma katiae sp. n., Iridopelma marcoi sp. n., Iridopelma oliveirai sp. n. and Iridopelma vanini sp. n. Three new synonymies are established: Avicularia pulchra Mello-Leitão, 1933 and Avicularia recifiensis Struchen & Brändle, 1996 are junior synonyms of Pachistopelma rufonigrum Pocock, 1901 syn. n., and Avicularia palmicola Mello-Leitão, 1945 is a junior synonym of Iridopelma hirsutum Pocock, 1901 syn. n. Pachistopelma concolor Caporiacco, 1947 is transferred to Tapinauchenius Ausserer, 1871, making the new combination Tapinauchenius concolor (Caporiacco, 1947) comb. n. Lectotypes are newly designated for Pachistopelma rufonigrum Pocock, 1901, Iridopelma hirsutum Pocock, 1901 and Pachistopelma concolor Caporiacco, 1947. Cladistic analyses using both equal and implied weights were carried out with a matrix comprising 62 characters and 38 terminal taxa. The chosen cladogram found with X-Pee-Wee and concavity 6 suggests they are monophyletic. All species are keyed and mapped and information on species habitat and area cladograms are presented. Discussion on biogeography and conservation is provided.

  8. Revision, cladistic analysis and biogeography of Typhochlaena C. L. Koch, 1850, Pachistopelma Pocock, 1901 and Iridopelma Pocock, 1901 (Araneae, Theraphosidae, Aviculariinae)

    PubMed Central

    Bertani, Rogério

    2012-01-01

    Three aviculariine genera endemic to Brazil are revised. Typhochlaena C. L. Koch, 1850 is resurrected, including five species; Pachistopelma Pocock, 1901 includes two species; and Iridopelma Pocock, 1901, six species. Nine species are newly described: Typhochlaena amma sp. n., Typhochlaena costae sp. n., Typhochlaena curumim sp. n., Typhochlaena paschoali sp. n., Pachistopelma bromelicola sp. n., Iridopelma katiae sp. n., Iridopelma marcoi sp. n., Iridopelma oliveirai sp. n. and Iridopelma vanini sp. n. Three new synonymies are established: Avicularia pulchra Mello-Leitão, 1933 and Avicularia recifiensis Struchen & Brändle, 1996 are junior synonyms of Pachistopelma rufonigrum Pocock, 1901 syn. n., and Avicularia palmicola Mello-Leitão, 1945 is a junior synonym of Iridopelma hirsutum Pocock, 1901 syn. n. Pachistopelma concolor Caporiacco, 1947 is transferred to Tapinauchenius Ausserer, 1871, making the new combination Tapinauchenius concolor (Caporiacco, 1947) comb. n. Lectotypes are newly designated for Pachistopelma rufonigrum Pocock, 1901, Iridopelma hirsutum Pocock, 1901 and Pachistopelma concolor Caporiacco, 1947. Cladistic analyses using both equal and implied weights were carried out with a matrix comprising 62 characters and 38 terminal taxa. The chosen cladogram found with X-Pee-Wee and concavity 6 suggests they are monophyletic. All species are keyed and mapped and information on species habitat and area cladograms are presented. Discussion on biogeography and conservation is provided. PMID:23166476

  9. Homeopathy and systematics: a systematic analysis of the therapeutic effects of the plant species used in homeopathy.

    PubMed

    Bharatan, V

    2008-07-01

    The therapeutic effects of the plant species used in homeopathy have never been subjected to systematic analysis. A survey of the various Materiae Medicae shows that over 800 plant species are the source of medicines in homeopathy. As these medicines are considered related to one another with respect to their therapeutic effects for treating similar symptoms, the aim is to classify and map them using the concept of homology. This involves placing the discipline of homeopathy into a comparative framework using these plant medicines as taxa, therapeutic effects as characters, and contemporary cladistic techniques to analyse these relationships. The results are compared using cladograms based on different data sets used in biology (e.g. morphological characters and DNA sequences) to test whether similar cladistic patterns exist among these medicines. By classifying the therapeutic actions, genuine homologies can be distinguished from homoplasies. As this is a comparative study it has been necessary first to update the existing nomenclature of the plant species in the homeopathic literature in line with the current International Code of Botanical Nomenclature.

  10. Cladistic analyses of behavioural variation in wild Pan troglodytes: exploring the chimpanzee culture hypothesis.

    PubMed

    Lycett, Stephen J; Collard, Mark; McGrew, William C

    2009-10-01

    Long-term field studies have revealed considerable behavioural differences among groups of wild Pan troglodytes. Here, we report three sets of cladistic analyses that were designed to shed light on issues relating to this interpopulation variation that are of particular relevance to palaeoanthropology. In the first set of analyses, we focused on the proximate cause of the variation. Some researchers have argued that it is cultural, while others have suggested that it is the result of genetic differences. Because the eastern and western subspecies of P. troglodytes are well differentiated genetically while groups within the subspecies are not, we reasoned that if the genetic hypothesis is correct, the phylogenetic signal should be stronger when data from the eastern and western subspecies are analysed together compared to when data from only the eastern subspecies are analysed. Using randomisation procedures, we found that the phylogenetic signal was substantially stronger within a single subspecies than with two. The results of the first set of analyses, therefore, were inconsistent with the predictions of the genetic hypothesis. The other two sets of analyses built on the results of the first and assumed that the intergroup behavioural variation is cultural in nature. Recent work has shown that, contrary to what anthropologists and archaeologists have long believed, vertical intergroup transmission is often more important than horizontal intergroup transmission in human cultural evolution. In the second set of analyses, we sought to determine how important vertical transmission has been in the evolution of chimpanzee cultural diversity. The first analysis we carried out indicated that the intergroup similarities and differences in behaviour are consistent with the divergence of the western and eastern subspecies, which is what would be expected if vertical intergroup transmission has been the dominant process. In the second analysis, we found that the

  11. The Cladistic Basis for the Phylogenetic Diversity (PD) Measure Links Evolutionary Features to Environmental Gradients and Supports Broad Applications of Microbial Ecology’s “Phylogenetic Beta Diversity” Framework

    PubMed Central

    Faith, Daniel P.; Lozupone, Catherine A.; Nipperess, David; Knight, Rob

    2009-01-01

    The PD measure of phylogenetic diversity interprets branch lengths cladistically to make inferences about feature diversity. PD calculations extend conventional species-level ecological indices to the features level. The “phylogenetic beta diversity” framework developed by microbial ecologists calculates PD-dissimilarities between community localities. Interpretation of these PD-dissimilarities at the feature level explains the framework’s success in producing ordinations revealing environmental gradients. An example gradients space using PD-dissimilarities illustrates how evolutionary features form unimodal response patterns to gradients. This features model supports new application of existing species-level methods that are robust to unimodal responses, plus novel applications relating to climate change, commercial products discovery, and community assembly. PMID:20087461
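
    For orientation, the sketch below computes Faith's PD for two toy communities on a small rooted tree (here summing the branch lengths on the paths from the sampled taxa to the root, one common convention) and then a simple unweighted, UniFrac-style PD-dissimilarity between them. The tree, branch lengths, and communities are made up, and the framework described above includes further variants.

      # Faith's PD and a simple PD-dissimilarity on a toy rooted tree.
      parent = {"A": "n1", "B": "n1", "C": "n2", "D": "n2",
                "n1": "root", "n2": "root", "root": None}
      brlen  = {"A": 1.0, "B": 1.5, "C": 0.5, "D": 0.7,
                "n1": 2.0, "n2": 1.0, "root": 0.0}

      def branch_set(taxa):
          """All branches on the paths from the sampled taxa up to the root."""
          used = set()
          for taxon in taxa:
              node = taxon
              while node is not None:
                  used.add(node)
                  node = parent[node]
          return used

      def pd(taxa):
          return sum(brlen[b] for b in branch_set(taxa))

      c1, c2 = {"A", "B"}, {"A", "C", "D"}
      unshared = branch_set(c1) ^ branch_set(c2)   # branches unique to either community
      dissimilarity = sum(brlen[b] for b in unshared) / pd(c1 | c2)
      print(f"PD(c1) = {pd(c1)}, PD(c2) = {pd(c2)}, PD-dissimilarity = {dissimilarity:.2f}")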

  12. Analysis of Toxic and Non-Toxic Alexandrium (Dinophyceae) Species Using Ribosomal RNA Gene Sequences

    DTIC Science & Technology

    1993-02-01

    Matched reference excerpts: Therriault, J.-C. (1988). Cladistic analysis of electrophoretic variants within the toxic dinoflagellate genus Protogonyaulax. Botanica Marina 31: 39-51; Hallegraeff, G. M., and Bolch, C. J. (1992). Transport of toxic dinoflagellate cysts via ship's ballast water: implications... Botanica Marina 34: 575-587; Curran, J., Baillie, D.L

  13. Higher-order phylogeny of modern birds (Theropoda, Aves: Neornithes) based on comparative anatomy. II. Analysis and discussion

    PubMed Central

    LIVEZEY, BRADLEY C; ZUSI, RICHARD L

    2007-01-01

    In recent years, avian systematics has been characterized by a diminished reliance on morphological cladistics of modern taxa, intensive palaeornithological research stimulated by new discoveries and an inundation by analyses based on DNA sequences. Unfortunately, in contrast to significant insights into basal origins, the broad picture of neornithine phylogeny remains largely unresolved. Morphological studies have emphasized characters of use in palaeontological contexts. Molecular studies, following disillusionment with the pioneering, but non-cladistic, work of Sibley and Ahlquist, have differed markedly from each other and from morphological works in both methods and findings. Consequently, at the turn of the millennium, points of robust agreement among schools concerning higher-order neornithine phylogeny have been limited to the two basalmost and several mid-level, primary groups. This paper describes a phylogenetic (cladistic) analysis of 150 taxa of Neornithes, including exemplars from all non-passeriform families, and subordinal representatives of Passeriformes. Thirty-five outgroup taxa encompassing Crocodylia, predominantly theropod Dinosauria, and selected Mesozoic birds were used to root the trees. Based on study of specimens and the literature, 2954 morphological characters were defined; these characters have been described in a companion work; approximately one-third of them were multistate (i.e. comprised at least three states), and states within more than one-half of these multistate characters were ordered for analysis. Complete heuristic searches using 10 000 random-addition replicates recovered a total solution set of 97 well-resolved, most-parsimonious trees (MPTs). The set of MPTs was confirmed by an expanded heuristic search based on 10 000 random-addition replicates and a full ratchet-augmented exploration to ascertain global optima. A strict consensus tree of MPTs included only six trichotomies, i.e. nodes differing topologically among MPTs

  14. Testing Evolutionary Hypotheses in the Classroom with MacClade Software.

    ERIC Educational Resources Information Center

    Codella, Sylvio G.

    2002-01-01

    Introduces MacClade, a Macintosh-based software package that uses the techniques of cladistic analysis to explore evolutionary patterns. Describes a novel and effective exercise that allows undergraduate biology majors to test a hypothesis about behavioral evolution in insects. (Contains 13 references.) (Author/YDS)

  15. Cladistic Analysis of Olfactory and Vomeronasal Systems

    PubMed Central

    Ubeda-Bañon, Isabel; Pro-Sistiaga, Palma; Mohedano-Moriano, Alicia; Saiz-Sanchez, Daniel; de la Rosa-Prieto, Carlos; Gutierrez-Castellanos, Nicolás; Lanuza, Enrique; Martinez-Garcia, Fernando; Martinez-Marcos, Alino

    2010-01-01

    Most tetrapods possess two nasal organs for detecting chemicals in their environment, which are the sensory detectors of the olfactory and vomeronasal systems. The seventies’ view that the olfactory system was only devoted to sensing volatiles, whereas the vomeronasal system was exclusively specialized for pheromone detection, was challenged by accumulating data showing deep anatomical and functional interrelationships between both systems. In addition, the assumption that the vomeronasal system appeared as an adaptation to terrestrial life is being questioned as well. The aim of the present work is to use a comparative strategy to gain insight into our understanding of the evolution of chemical “cortex.” We have analyzed the organization of the olfactory and vomeronasal cortices of reptiles, marsupials, and placental mammals and we have compared our findings with data from other taxa in order to better understand the evolutionary history of the nasal sensory systems in vertebrates. The olfactory and vomeronasal cortices have been re-investigated in garter snakes (Thamnophis sirtalis), short-tailed opossums (Monodelphis domestica), and rats (Rattus norvegicus) by tracing the efferents of the main and accessory olfactory bulbs using injections of neuroanatomical anterograde tracers (dextran-amines). In snakes, the medial olfactory tract is quite evident, whereas the main vomeronasal-recipient structure, the nucleus sphaericus, is a folded cortical-like structure, located at the caudal edge of the amygdala. In marsupials, which are acallosal mammals, the rhinal fissure is relatively dorsal and the olfactory and vomeronasal cortices relatively expanded. Placental mammals, like marsupials, show partially overlapping olfactory and vomeronasal projections in the rostral basal telencephalon. These data raise the interesting question of how the telencephalon has been re-organized in different groups according to the biological relevance of chemical senses. PMID:21290004

  16. Cladistic analysis of olfactory and vomeronasal systems.

    PubMed

    Ubeda-Bañon, Isabel; Pro-Sistiaga, Palma; Mohedano-Moriano, Alicia; Saiz-Sanchez, Daniel; de la Rosa-Prieto, Carlos; Gutierrez-Castellanos, Nicolás; Lanuza, Enrique; Martinez-Garcia, Fernando; Martinez-Marcos, Alino

    2011-01-01

    Most tetrapods possess two nasal organs for detecting chemicals in their environment, which are the sensory detectors of the olfactory and vomeronasal systems. The seventies' view that the olfactory system was only devoted to sensing volatiles, whereas the vomeronasal system was exclusively specialized for pheromone detection, was challenged by accumulating data showing deep anatomical and functional interrelationships between both systems. In addition, the assumption that the vomeronasal system appeared as an adaptation to terrestrial life is being questioned as well. The aim of the present work is to use a comparative strategy to gain insight into our understanding of the evolution of chemical "cortex." We have analyzed the organization of the olfactory and vomeronasal cortices of reptiles, marsupials, and placental mammals and we have compared our findings with data from other taxa in order to better understand the evolutionary history of the nasal sensory systems in vertebrates. The olfactory and vomeronasal cortices have been re-investigated in garter snakes (Thamnophis sirtalis), short-tailed opossums (Monodelphis domestica), and rats (Rattus norvegicus) by tracing the efferents of the main and accessory olfactory bulbs using injections of neuroanatomical anterograde tracers (dextran-amines). In snakes, the medial olfactory tract is quite evident, whereas the main vomeronasal-recipient structure, the nucleus sphaericus, is a folded cortical-like structure, located at the caudal edge of the amygdala. In marsupials, which are acallosal mammals, the rhinal fissure is relatively dorsal and the olfactory and vomeronasal cortices relatively expanded. Placental mammals, like marsupials, show partially overlapping olfactory and vomeronasal projections in the rostral basal telencephalon. These data raise the interesting question of how the telencephalon has been re-organized in different groups according to the biological relevance of chemical senses.

  17. On the use of haplotype phylogeny to detect disease susceptibility loci

    PubMed Central

    Bardel, Claire; Danjean, Vincent; Hugot, Jean-Pierre; Darlu, Pierre; Génin, Emmanuelle

    2005-01-01

    Background The cladistic approach proposed by Templeton has been presented as promising for the study of the genetic factors involved in common diseases. This approach allows the joint study of multiple markers within a gene by considering haplotypes and grouping them in nested clades. The idea is to search for clades with an excess of cases as compared to the whole sample and to identify the mutations defining these clades as potential candidate disease susceptibility sites. However, the performance of this approach for the study of the genetic factors involved in complex diseases has never been studied. Results In this paper, we propose a new method to perform such a cladistic analysis and we estimate its power through simulations. We show that under models where the susceptibility to the disease is caused by a single genetic variant, the cladistic test is neither really more powerful at detecting an association nor really more efficient at localizing the susceptibility site than individual SNP testing. However, when two interacting sites are responsible for the disease, the cladistic analysis greatly improves the probability of finding the two susceptibility sites. The impacts of linkage disequilibrium and of the tree characteristics on the efficiency of the cladistic analysis are also discussed. An application on a real data set concerning the CARD15 gene and Crohn disease shows that the method can successfully identify the three variant sites that are involved in the disease susceptibility. Conclusion The use of phylogenies to group haplotypes is especially interesting to pinpoint the sites that are likely to be involved in disease susceptibility among the different markers identified within a gene. PMID:15904492
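
    The core of the clade-based test, searching for clades with an excess of cases relative to the whole sample, can be sketched as a simple contingency-table comparison. The counts below are invented, and this is not the authors' exact test statistic, only an illustration of the idea.

    ```python
    # A minimal sketch of testing one haplotype clade for an excess of cases
    # relative to the rest of the sample; the counts are hypothetical.

    from scipy.stats import chi2_contingency

    # rows: inside clade / outside clade; columns: cases / controls
    table = [[60, 40],     # haplotypes falling in the candidate clade
             [140, 260]]   # all other haplotypes in the sample

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
    ```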

  18. Larvae of the genus Eleodes (Coleoptera, Tenebrionidae): matrix-based descriptions, cladistic analysis, and key to late instars

    PubMed Central

    Smith, Aaron D.; Dornburg, Rebecca; Wheeler, Quentin D.

    2014-01-01

    Abstract Darkling beetle larvae (Coleoptera, Tenebrionidae) are collectively referred to as false wireworms. Larvae from several species in the genus Eleodes are considered to be agricultural pests, though relatively little work has been done to associate larvae with adults of the same species and only a handful of species have been characterized in their larval state. Morphological characters from late instar larvae were examined and coded to produce a matrix in the server-based content management system mx. The resulting morphology matrix was used to produce larval species descriptions, reconstruct a phylogeny, and build a key to the species included in the matrix. Larvae are described for the first time for the following 12 species: Eleodes anthracinus Blaisdell, Eleodes carbonarius (Say), Eleodes caudiferus LeConte, Eleodes extricatus (Say), Eleodes goryi Solier, Eleodes hispilabris (Say), Eleodes nigropilosus LeConte, Eleodes pilosus Horn, Eleodes subnitens LeConte, Eleodes tenuipes Casey, Eleodes tribulus Thomas, and Eleodes wheeleri Aalbu, Smith & Triplehorn. The larval stage of Eleodes armatus LeConte is redescribed with additional characters to differentiate it from the newly described congeneric larvae. PMID:25009429

  19. Re-writing Popper's philosophy of science for systematics.

    PubMed

    Rieppel, Olivier

    2008-01-01

    This paper explores the use of Popper's philosophy of science by cladists in their battle against evolutionary and numerical taxonomy. Three schools of biological systematics fiercely debated each other from the late 1960s: evolutionary taxonomy, phenetics or numerical taxonomy, and phylogenetic systematics or cladistics. The outcome of that debate was the victory of phylogenetic systematics/cladistics over the competing schools of thought. To bring about this "cladistic turn" in systematics, the cladists drew heavily on the philosopher K.R. Popper in order to dress up phylogenetic systematics as a hypothetico-deductivist, indeed falsificationist, research program that would put an end to authoritarianism. As the case of the "cladistic revolution" demonstrates, scientists who turn to philosophy in defense of a research program read philosophers with an agenda in mind. That agenda is likely to distort the philosophical picture, as happened to Popper's philosophy of science at the hands of cladists.

  20. Phylogenetic Analyses of Quasars and Galaxies

    NASA Astrophysics Data System (ADS)

    Fraix-Burnet, Didier; D'Onofrio, Mauro; Marziani, Paola

    2017-10-01

    Phylogenetic approaches have proven to be useful in astrophysics. We have recently published a Maximum Parsimony (or cladistics) analysis on two samples of 215 and 85 low-z quasars (z < 0.7) which offer a satisfactory coverage of the Eigenvector 1-derived main sequence. Cladistics is not only able to group sources radiating at higher Eddington ratios, to separate radio-quiet (RQ) and radio-loud (RL) quasars and to properly distinguish core-dominated and lobe-dominated quasars, but it also suggests a black hole mass threshold for powerful radio emission, as already proposed elsewhere. An interesting interpretation from this work is that the phylogeny of quasars may be represented by the ontogeny of their central black hole, i.e. the increase of the black hole mass. However, these exciting results are based on a small sample of low-z quasars, so that the work must be extended. We are here faced with two difficulties. The first one is the current lack of a larger sample with similar observables. The second one is the prohibitive computation time to perform a cladistic analysis on more than about one thousand objects. We show in this paper an experimental strategy on about 1500 galaxies to get around this difficulty. Even though it is not related to the quasar study, it is interesting in itself and opens new pathways to generalize the quasar findings.

  1. A phylogenetic analysis of the Gruiformes (Aves) based on morphological characters, with an emphasis on the rails (Rallidae)

    PubMed Central

    Livezey, B. C.

    1998-01-01

    The order Gruiformes, for which even familial composition remains controversial, is perhaps the least well understood avian order from a phylogenetic perspective. The history of the systematics of the order is presented, and the ecological and biogeographic characteristics of its members are summarized. Using cladistic techniques, phylogenetic relationships among fossil and modern genera of the Gruiformes were estimated based on 381 primarily osteological characters; relationships among modern species of Grues (Psophiidae, Aramidae, Gruidae, Heliornithidae and Rallidae) were assessed based on these characters augmented by 189 characters of the definitive integument. A strict consensus tree for 20,000 shortest trees compiled for the matrix of gruiform genera (length = 967, CI = 0.517) revealed a number of nodes common to the solution set, many of which were robust to bootstrapping and had substantial support (Bremer) indices. Robust nodes included those supporting: a sister relationship between the Pedionomidae and Turnicidae; monophyly of the Gruiformes exclusive of the Pedionomidae and Turnicidae; a sister relationship between the Cariamidae and Phorusrhacoidea; a sister relationship between a clade comprising Eurypyga and Messelornis and one comprising Rhynochetos and Aptornis; monophyly of the Grues (Psophiidae, Aramidae, Gruidae, Heliornithidae and Rallidae); monophyly of a clade (Gruoidea) comprising (in order of increasingly close relationship) Psophia, Aramus, Balearica and other Gruidae, with monophyly of each member in this series confirmed; a sister relationship between the Heliornithidae and Rallidae; and monophyly of the Rallidae exclusive of Himantornis. Autapomorphic divergence was comparatively high for Pedionomus, Eurypyga, Psophia, Himantornis and Fulica; extreme autapomorphy, much of which is unique for the order, characterized the extinct, flightless Aptornis. In the species-level analysis of modern Grues, special efforts were made to limit the
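
    The bootstrap support values mentioned above come from resampling characters with replacement and re-estimating the tree for each pseudoreplicate. A minimal, self-contained sketch on a toy four-taxon matrix is given below; it reuses a Fitch step count like the one sketched earlier, and the taxa, characters and number of replicates are all invented.

    ```python
    # A minimal sketch of nonparametric bootstrap support on a toy four-taxon
    # problem: character columns are resampled with replacement, the most
    # parsimonious of the three quartet topologies is found for each replicate,
    # and the frequency with which each topology wins is its bootstrap value.

    import random

    TAXA = ["out", "A", "B", "C"]
    MATRIX = {                       # taxon -> binary character states (invented)
        "out": [0, 0, 0, 0, 0, 0],
        "A":   [1, 1, 1, 0, 0, 1],
        "B":   [1, 1, 0, 0, 1, 1],
        "C":   [0, 0, 0, 1, 1, 0],
    }
    TOPOLOGIES = {                   # the three possible quartets, as nested tuples
        "AB|C,out": ("out", ("C", ("A", "B"))),
        "AC|B,out": ("out", ("B", ("A", "C"))),
        "BC|A,out": ("out", ("A", ("B", "C"))),
    }

    def fitch(node, states):
        """Fitch downpass for one character; returns (state set, step count)."""
        if isinstance(node, str):
            return {states[node]}, 0
        left, ls = fitch(node[0], states)
        right, rs = fitch(node[1], states)
        common = left & right
        return (common, ls + rs) if common else (left | right, ls + rs + 1)

    def tree_length(tree, columns):
        """Parsimony length of a tree over the chosen character columns."""
        return sum(fitch(tree, {t: MATRIX[t][c] for t in TAXA})[1] for c in columns)

    random.seed(1)
    n_chars, replicates = len(MATRIX["A"]), 200
    wins = {name: 0 for name in TOPOLOGIES}
    for _ in range(replicates):
        cols = [random.randrange(n_chars) for _ in range(n_chars)]   # resample characters
        best = min(TOPOLOGIES, key=lambda name: tree_length(TOPOLOGIES[name], cols))
        wins[best] += 1

    for name, count in wins.items():
        print(f"{name}: {100 * count / replicates:.0f}% bootstrap support")
    ```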

  2. Phylogeny, host-parasite relationship and zoogeography

    PubMed Central

    1999-01-01

    Phylogeny is the evolutionary history of a group or the lineage of organisms and is reconstructed based on morphological, molecular and other characteristics. The genealogical relationship of a group of taxa is often expressed as a phylogenetic tree. The difficulty in categorizing the phylogeny is mainly due to the existence of frequent homoplasies that deceive observers. At the present time, cladistic analysis is believed to be one of the most effective methods of reconstructing a phylogenetic tree. Excellent computer software for phylogenetic analysis is available. As an example, cladistic analysis was applied for nematode genera of the family Acuariidae, and the phylogenetic tree formed was compared with the system used currently. Nematodes in the genera Nippostrongylus and Heligmonoides were also analyzed, and the validity of the reconstructed phylogenetic trees was examined from a zoogeographical point of view. Some of the theories of parasite evolution were briefly reviewed as well. Coevolution of parasites and humans was discussed with special reference to the evolutionary relationship between Enterobius and primates. PMID:10634036

  3. Use of Parsimony Analysis to Identify Areas of Endemism of Chinese Birds: Implications for Conservation and Biogeography

    PubMed Central

    Huang, Xiao-Lei; Qiao, Ge-Xia; Lei, Fu-Min

    2010-01-01

    Parsimony analysis of endemicity (PAE) was used to identify areas of endemism (AOEs) for Chinese birds at the subregional level. Four AOEs were identified based on a distribution database of 105 endemic species and using 18 avifaunal subregions as the operating geographical units (OGUs). The four AOEs are the Qinghai-Zangnan Subregion, the Southwest Mountainous Subregion, the Hainan Subregion and the Taiwan Subregion. Cladistic analysis of subregions generally supports the division of China’s avifauna into Palaearctic and Oriental realms. Two PAE area trees were produced from two different distribution datasets (year 1976 and 2007). The 1976 topology has four distinct subregional branches; however, the 2007 topology has three distinct branches. Moreover, three Palaearctic subregions in the 1976 tree clustered together with the Oriental subregions in the 2007 tree. Such topological differences may reflect changes in the distribution of bird species through circa three decades. PMID:20559504
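
    Parsimony analysis of endemicity treats areas as terminals and species presences as binary characters, rooting the analysis on a hypothetical area with no species. A minimal sketch of setting up such a matrix is shown below; the subregion names follow the four AOEs named above, while the species codes are invented.

    ```python
    # A minimal sketch of building a PAE matrix: operating geographical units
    # (OGUs) as rows, species as 0/1 presence/absence characters, plus an
    # all-zero hypothetical area used to root the analysis. Species codes are
    # hypothetical; the parsimony search itself is done in a separate program.

    species = ["sp1", "sp2", "sp3", "sp4"]
    presence = {
        "Qinghai-Zangnan": {"sp1", "sp3"},
        "SW-Mountainous":  {"sp1", "sp2", "sp3"},
        "Hainan":          {"sp2", "sp4"},
        "Taiwan":          {"sp4"},
    }

    def pae_matrix(presence, species):
        """Return rows of 0/1 scores, with a hypothetical all-zero root area first."""
        rows = {"ROOT": [0] * len(species)}
        for area, spp in presence.items():
            rows[area] = [1 if s in spp else 0 for s in species]
        return rows

    for area, row in pae_matrix(presence, species).items():
        print(f"{area:<16} {''.join(map(str, row))}")
    ```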

  4. Two new species of Chaco Tullgren from the Atlantic coast of Uruguay (Araneae, Mygalomorphae, Nemesiidae).

    PubMed

    de Oca, Laura Montes; Pérez-Miles, Fernando

    2013-01-01

    We describe two new species of the nemesiid spider genus Chaco from Rocha Province, Uruguay. These new species are diagnosed based on genital morphology, male tibial apophysis spination, and burrow entrance. We test cospecificity of one species, Chaco costai, via laboratory mating experiments. The new species are diagnosed and illustrated, and their habitat characteristics and capture behavior are described. We conduct a cladistic analysis based on a previously published morphological character matrix that now includes the newly described species.

  5. A revision and phylogenetic analysis of the spider genus Oxysoma Nicolet (Araneae: Anyphaenidae, Amaurobioidinae).

    PubMed

    Aisen, Santiago; Ramírez, Martín J

    2015-08-06

    We review the spider genus Oxysoma Nicolet, with most of its species endemic to the southern temperate forests in Chile and Argentina, and present a phylogenetic analysis including seven species, of which three are newly described in this study (O. macrocuspis new species, O. kuni new species, and O. losruiles new species, all from Chile), together with 107 other representatives of Anyphaenidae. New geographical records and distribution maps are provided for all species, with illustrations and reviewed diagnoses for the genus and the four previously known species (O. punctatum Nicolet, O. saccatum (Tullgren), O. longiventre (Nicolet) and O. itambezinho Ramírez). The phylogenetic analysis using cladistic methods is based on 264 previously defined characters plus one character that arises from this study. The three new species are closely related to Oxysoma longiventre, and these four species compose what we define as the Oxysoma longiventre species group. The phylogenetic analysis did not recover the monophyly of Oxysoma, which should be reevaluated in the future, together with the genus Tasata.

  6. The phylogeny of quasars and the ontogeny of their central black holes

    NASA Astrophysics Data System (ADS)

    Fraix-Burnet, Didier; Marziani, Paola; D'Onofrio, Mauro; Dultzin, Deborah

    2017-02-01

    The connection between multifrequency quasar observational and physical parameters related to accretion processes is still open to debate. In the last 20 years, Eigenvector 1-based approaches, developed since the early papers by Boroson and Green (1992) and Sulentic et al. (2000b), have proved to be a remarkably powerful tool to investigate this issue, and have led to the definition of a quasar "main sequence". In this paper we perform a cladistic analysis on two samples of 215 and 85 low-z quasars (z < 0.7) which were studied in several previous works and which offer a satisfactory coverage of the Eigenvector 1-derived main sequence. The data encompass accurate measurements of observational parameters which represent key aspects associated with the structural diversity of quasars. Cladistics is able to group sources radiating at higher Eddington ratios, as well as to separate radio-quiet (RQ) and radio-loud (RL) quasars. The analysis suggests a black hole mass threshold for powerful radio emission and also properly distinguishes core-dominated and lobe-dominated quasars, in accordance with the basic tenet of RL unification schemes. Considering that black hole mass provides a sort of "arrow of time" of nuclear activity, a phylogenetic interpretation becomes possible if cladistic trees are rooted on black hole mass: the ontogeny of black holes is represented by their monotonic increase in mass. More massive radio-quiet Population B sources at low-z become a more evolved counterpart of Population A, i.e. wind-dominated sources to which the "local" Narrow-Line Seyfert 1s belong.
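
    Cladistic (maximum parsimony) analyses of this kind operate on discrete character states, so continuous quasar observables are typically binned into a small number of ordered states before the analysis. The sketch below shows one plausible discretization step; the parameter name, values and bin count are invented, and this is not the authors' actual pipeline.

    ```python
    # A minimal sketch of astrocladistics-style pre-processing: a continuous
    # observable is binned into equal-width ordered character states 0..n-1.

    import numpy as np

    def to_states(values, n_states=8):
        """Discretize a continuous observable into ordered states 0..n_states-1."""
        values = np.asarray(values, dtype=float)
        edges = np.linspace(values.min(), values.max(), n_states + 1)
        # np.digitize puts the maximum into bin n_states, so clip back into range
        return np.clip(np.digitize(values, edges) - 1, 0, n_states - 1)

    eddington_ratio = [0.02, 0.10, 0.35, 0.80, 1.20]   # hypothetical measurements
    print(to_states(eddington_ratio))                  # e.g. [0 0 2 5 7]
    ```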

  7. Two new species of Chaco Tullgren from the Atlantic coast of Uruguay (Araneae, Mygalomorphae, Nemesiidae)

    PubMed Central

    de Oca, Laura Montes; Pérez-Miles, Fernando

    2013-01-01

    Abstract We describe two new species of the nemesiid spider genus Chaco from Rocha Province, Uruguay. These new species are diagnosed based on genital morphology, male tibial apophysis spination, and burrow entrance. We test cospecificity of one species, Chaco costai, via laboratory mating experiments. The new species are diagnosed and illustrated, and their habitat characteristics and capture behavior are described. We conduct a cladistic analysis based on a previously published morphological character matrix that now includes the newly described species. PMID:24146579

  8. Review of amphipods of the Melita group (Amphipoda: Melitidae) from the coastal waters of Sakhalin Island (Far East of Russia). II. Genera Quasimelita Jarrett & Bousfield, 1996 and Melitoides Gurjanova, 1934.

    PubMed

    Labay, Vjacheslav S

    2014-10-01

    Based on new material, three new species of the genus Quasimelita are described: Q. tolyza sp. nov., Q. jarettii sp. nov. and Q. serraticoxae sp. nov. from the northern shelf of Sakhalin Island and contiguous areas. A new species of the genus Melitoides, M. kawaii sp. nov., is described from the north-east shelf of Sakhalin Island. Keys to the world species of the genera Quasimelita and Melitoides are provided. Cladistic analyses of morphological relationships within the genera Quasimelita and Melitoides are implemented.

  9. A New Paleozoic Symmoriiformes (Chondrichthyes) from the Late Carboniferous of Kansas (USA) and Cladistic Analysis of Early Chondrichthyans

    PubMed Central

    Pradel, Alan; Tafforeau, Paul; Maisey, John G.; Janvier, Philippe

    2011-01-01

    Background The relationships of cartilaginous fishes are discussed in the light of well preserved three-dimensional Paleozoic specimens. There is no consensus to date on the interrelationship of Paleozoic chondrichthyans, although three main phylogenetic hypotheses exist in the current literature: 1. the Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are grouped along with the modern sharks (neoselachians) into a clade which is sister group of holocephalans; 2. the Symmoriiformes are related to holocephalans, whereas the other Paleozoic shark-like chondrichthyans are related to neoselachians; 3. many Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are stem chondrichthyans, whereas stem and crown holocephalans are sister group to the stem and crown neoselachians in a crown-chondrichthyan clade. This third hypothesis was proposed recently, based mainly on dental characters. Methodology/Principal Findings On the basis of two well preserved chondrichthyan neurocrania from the Late Carboniferous of Kansas, USA, we describe here a new species of Symmoriiformes, Kawichthys moodiei gen. et sp. nov., which was investigated by means of computerized X-ray synchrotron microtomography. We present a new phylogenetic analysis based on neurocranial characters, which supports the third hypothesis and corroborates the hypothesis that crown-group chondrichthyans (Holocephali+Neoselachii) form a tightly-knit group within the chondrichthyan total group, by providing additional, non dental characters. Conclusions/Significance Our results highlight the importance of new well preserved Paleozoic fossils and new techniques of observation, and suggest that a new look at the synapomorphies of the crown-group chondrichthyans would be worthwhile in terms of understanding the adaptive significance of phylogenetically important characters. PMID:21980367

  10. A new paleozoic Symmoriiformes (Chondrichthyes) from the late Carboniferous of Kansas (USA) and cladistic analysis of early chondrichthyans.

    PubMed

    Pradel, Alan; Tafforeau, Paul; Maisey, John G; Janvier, Philippe

    2011-01-01

    The relationships of cartilaginous fishes are discussed in the light of well preserved three-dimensional Paleozoic specimens. There is no consensus to date on the interrelationship of Paleozoic chondrichthyans, although three main phylogenetic hypotheses exist in the current literature: 1. the Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are grouped along with the modern sharks (neoselachians) into a clade which is sister group of holocephalans; 2. the Symmoriiformes are related to holocephalans, whereas the other Paleozoic shark-like chondrichthyans are related to neoselachians; 3. many Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are stem chondrichthyans, whereas stem and crown holocephalans are sister group to the stem and crown neoselachians in a crown-chondrichthyan clade. This third hypothesis was proposed recently, based mainly on dental characters. On the basis of two well preserved chondrichthyan neurocrania from the Late Carboniferous of Kansas, USA, we describe here a new species of Symmoriiformes, Kawichthys moodiei gen. et sp. nov., which was investigated by means of computerized X-ray synchrotron microtomography. We present a new phylogenetic analysis based on neurocranial characters, which supports the third hypothesis and corroborates the hypothesis that crown-group chondrichthyans (Holocephali+Neoselachii) form a tightly-knit group within the chondrichthyan total group, by providing additional, non dental characters. Our results highlight the importance of new well preserved Paleozoic fossils and new techniques of observation, and suggest that a new look at the synapomorphies of the crown-group chondrichthyans would be worthwhile in terms of understanding the adaptive significance of phylogenetically important characters.

  11. Taxonomic revision and cladistic analysis of Avicularia Lamarck, 1818 (Araneae, Theraphosidae, Aviculariinae) with description of three new aviculariine genera

    PubMed Central

    Fukushima, Caroline Sayuri; Bertani, Rogério

    2017-01-01

    Abstract The genus Avicularia Lamarck, 1818 is revised and all species are rediagnosed. The type species, described as Aranea avicularia Linnaeus, 1758, is the oldest mygalomorph species described and its taxonomic history is extensive and confusing. Cladistic analyses using both equal and implied weights were carried out with a matrix of 46 taxa from seven theraphosid subfamilies, and 71 morphological and ecological characters. The optimal cladogram found with Piwe and concavity = 6 suggests Avicularia and Aviculariinae are monophyletic. Subfamily Aviculariinae includes Avicularia Lamarck, 1818, Typhochlaena C. L. Koch, 1850, Tapinauchenius Ausserer, 1871, Stromatopelma Karsch, 1881, Ephebopus Simon, 1892, Psalmopoeus Pocock, 1895, Heteroscodra Pocock, 1899, Iridopelma Pocock, 1901, Pachistopelma Pocock, 1901, Ybyrapora gen. n., Caribena gen. n., and Antillena gen. n. The clade is supported by well-developed scopulae on tarsi and metatarsi, greatly extended laterally. Avicularia synapomorphies are juveniles bearing black tarsi contrasting with other lighter articles; spermathecae with an accentuated outwards curvature medially, and male palpal bulb with embolus medial portion and tegulum’s margin forming an acute angle in retrolateral view. Avicularia is composed of twelve species, including three new species: Avicularia avicularia (Linnaeus, 1758), Avicularia glauca Simon, 1891, Avicularia variegata (F. O. Pickard-Cambridge, 1896) stat. n., Avicularia minatrix Pocock, 1903, Avicularia taunayi (Mello-Leitão, 1920), Avicularia juruensis Mello-Leitão, 1923, Avicularia rufa Schiapelli & Gerschman, 1945, Avicularia purpurea Kirk, 1990, Avicularia hirschii Bullmer et al. 2006, Avicularia merianae sp. n., Avicularia lynnae sp. n., and Avicularia caei sp. n. Avicularia species are distributed throughout Mexico, Costa Rica, Panama, Trinidad and Tobago, Venezuela, Guyana, Suriname, French Guiana, Colombia, Ecuador, Peru, Bolivia, and Brazil. Three new genera are erected
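
    The implied-weighting analysis mentioned above (concavity k = 6) down-weights characters in proportion to their homoplasy. A minimal sketch of the standard fit function is given below; the per-character extra-step counts are invented for illustration.

    ```python
    # A minimal sketch of Goloboff's implied-weighting fit: each character's fit
    # decreases with its extra (homoplastic) steps, and trees maximizing total
    # fit are preferred. Step counts here are hypothetical.

    def character_fit(extra_steps, k=6):
        """Implied-weighting fit f = k / (k + extra_steps) for one character."""
        return k / (k + extra_steps)

    def total_fit(extra_steps_per_character, k=6):
        """Total fit of a tree: sum of per-character fits."""
        return sum(character_fit(es, k) for es in extra_steps_per_character)

    print(total_fit([0, 0, 1, 3, 5]))   # characters with 0 extra steps contribute 1.0 each
    ```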

  12. Soft-tissue anatomy of the primates: phylogenetic analyses based on the muscles of the head, neck, pectoral region and upper limb, with notes on the evolution of these muscles

    PubMed Central

    Diogo, R; Wood, B

    2011-01-01

    Apart from molecular data, nearly all the evidence used to study primate relationships comes from hard tissues. Here, we provide details of the first parsimony and Bayesian cladistic analyses of the order Primates based exclusively on muscle data. The most parsimonious tree obtained from the cladistic analysis of 166 characters taken from the head, neck, pectoral and upper limb musculature is fully congruent with the most recent evolutionary molecular tree of Primates. That is, this tree recovers not only the relationships among the major groups of primates, i.e. Strepsirrhini {Tarsiiformes [Platyrrhini (Cercopithecidae, Hominoidea)]}, but it also recovers the relationships within each of these inclusive groups. Of the 301 character state changes occurring in this tree, ca. 30% are non-homoplasic evolutionary transitions; within the 220 changes that are unambiguously optimized in the tree, ca. 15% are reversions. The trees obtained by using characters derived from the muscles of the head and neck are more similar to the most recent evolutionary molecular tree than are the trees obtained by using characters derived from the pectoral and upper limb muscles. It was recently argued that since the Pan/Homo split, chimpanzees accumulated more phenotypic adaptations than humans, but our results indicate that modern humans accumulated more muscle character state changes than chimpanzees, and that both these taxa accumulated more changes than gorillas. This overview of the evolution of the primate head, neck, pectoral and upper limb musculature suggests that the only muscle groups for which modern humans have more muscles than most other extant primates are the muscles of the face, larynx and forearm. PMID:21689100

  13. Soft-tissue anatomy of the primates: phylogenetic analyses based on the muscles of the head, neck, pectoral region and upper limb, with notes on the evolution of these muscles.

    PubMed

    Diogo, R; Wood, B

    2011-09-01

    Apart from molecular data, nearly all the evidence used to study primate relationships comes from hard tissues. Here, we provide details of the first parsimony and Bayesian cladistic analyses of the order Primates based exclusively on muscle data. The most parsimonious tree obtained from the cladistic analysis of 166 characters taken from the head, neck, pectoral and upper limb musculature is fully congruent with the most recent evolutionary molecular tree of Primates. That is, this tree recovers not only the relationships among the major groups of primates, i.e. Strepsirrhini {Tarsiiformes [Platyrrhini (Cercopithecidae, Hominoidea)]}, but it also recovers the relationships within each of these inclusive groups. Of the 301 character state changes occurring in this tree, ca. 30% are non-homoplasic evolutionary transitions; within the 220 changes that are unambiguously optimized in the tree, ca. 15% are reversions. The trees obtained by using characters derived from the muscles of the head and neck are more similar to the most recent evolutionary molecular tree than are the trees obtained by using characters derived from the pectoral and upper limb muscles. It was recently argued that since the Pan/Homo split, chimpanzees accumulated more phenotypic adaptations than humans, but our results indicate that modern humans accumulated more muscle character state changes than chimpanzees, and that both these taxa accumulated more changes than gorillas. This overview of the evolution of the primate head, neck, pectoral and upper limb musculature suggests that the only muscle groups for which modern humans have more muscles than most other extant primates are the muscles of the face, larynx and forearm. © 2011 The Authors. Journal of Anatomy © 2011 Anatomical Society of Great Britain and Ireland.

  14. Oldest near-complete acanthodian: the first vertebrate from the Silurian Bertie Formation Konservat-Lagerstätte, Ontario.

    PubMed

    Burrow, Carole J; Rudkin, David

    2014-01-01

    The relationships between early jawed vertebrates have been much debated, with cladistic analyses yielding little consensus on the position (or positions) of acanthodians with respect to other groups. Whereas one recent analysis showed various acanthodians (classically known as 'spiny sharks') as stem osteichthyans (bony fishes) and others as stem chondrichthyans, another shows the acanthodians as a paraphyletic group of stem chondrichthyans, and the latest analysis shows acanthodians as the monophyletic sister group of the Chondrichthyes. A small specimen of the ischnacanthiform acanthodian Nerepisacanthus denisoni is the first vertebrate fossil collected from the Late Silurian Bertie Formation Konservat-Lagerstätte of southern Ontario, Canada, a deposit well-known for its spectacular eurypterid fossils. The fish is the only near-complete acanthodian from pre-Devonian strata worldwide, and confirms that Nerepisacanthus has dentigerous jaw bones, body scales with superposed crown growth zones formed of odontocytic mesodentine, and a patch of chondrichthyan-like scales posterior to the jaw joint. The combination of features found in Nerepisacanthus supports the hypothesis that acanthodians could be a group, or even a clade, on the chondrichthyan stem. Cladistic analyses of early jawed vertebrates incorporating Nerepisacanthus, and updated data on other acanthodians based on publications in press, should help clarify their relationships.

  15. On the Semilattice of Weak Orders of a Set.

    DTIC Science & Technology

    1984-03-01

    the analysis of cladistic character compatibility, ibid., 29 (1976), pp. 181-187. ... An algebraic analysis of cladistic characters, Discrete Math., 16 (1976), pp. 141-147. ESTABROOK, G. F., and McMORRIS, F. R., When are two qualitative taxonomic characters compatible? J. Math
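
    The character-compatibility literature cited in this record asks when two qualitative characters can be placed on the same tree without homoplasy. For two binary characters this reduces to checking that not all four state combinations occur across the taxa; a minimal sketch follows, with invented character scores.

    ```python
    # A minimal sketch of the pairwise compatibility test for two binary
    # characters: they can both fit on some tree without homoplasy if and only
    # if the combinations (0,0), (0,1), (1,0), (1,1) do not all occur.

    def compatible(char_a, char_b):
        """True if the two binary characters are pairwise compatible."""
        combos = set(zip(char_a, char_b))
        return combos != {(0, 0), (0, 1), (1, 0), (1, 1)}

    a = [0, 0, 1, 1, 1]
    b = [0, 1, 0, 1, 1]
    print(compatible(a, b))   # False: all four combinations present

    c = [0, 0, 1, 1, 1]
    d = [0, 0, 0, 1, 1]
    print(compatible(c, d))   # True
    ```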

  16. Root Character Evolution and Systematics in Cranichidinae, Prescottiinae and Spiranthinae (Orchidaceae, Cranichideae)

    PubMed Central

    Figueroa, Coyolxauhqui; Salazar, Gerardo A.; Zavaleta, H. Araceli; Engleman, E. Mark

    2008-01-01

    Background and Aims Previous studies have suggested that velamen characteristics are useful as taxonomic markers in Orchidaceae. Members of tribe Cranichideae have been assigned to two velamen types constructed based on combinations of characters such as the presence of secondary cell-wall thickenings and pores. However, such characters have not been analysed on an individual basis in explicit cladistic analyses. Methods The micromorphology of roots of 26 species of Cranichideae was examined through scanning electron microscopy and light microscopy, scoring the variation and distribution of four characters: number of velamen cell layers, velamen cell-wall thickenings, presence and type of tilosomes, and supraendodermal spaces. The last three characters were analysed cladistically in combination with DNA sequence data of plastid trnK/matK and nuclear ribosomal internal transcribed spacer (ITS) regions and optimized on the resulting phylogenetic tree. Key Results Thickenings of velamen cell walls group Prescottiinae with Spiranthinae, whereas tilosomes, documented here for the first time in Cranichideae, provide an unambiguous synapomorphy for subtribe Spiranthinae. Supraendodermal spaces occur mostly in species dwelling in seasonally dry habitats and appear to have evolved three times. Conclusions Three of the four structural characters assessed are phylogenetically informative, marking monophyletic groups recovered in the combined molecular–morphological analysis. This study highlights the need for conducting character-based structural studies to overcome analytical shortcomings of the typological approach. PMID:18263628

  17. Building a Twig Phylogeny

    ERIC Educational Resources Information Center

    Flinn, Kathryn M.

    2015-01-01

    In this classroom activity, students build a phylogeny for woody plant species based on the morphology of their twigs. Using any available twigs, students can practice the process of cladistics to test evolutionary hypotheses for real organisms. They identify homologous characters, determine polarity through outgroup comparison, and construct a…

  18. Skeletons of terrestrial cetaceans and the relationship of whales to artiodactyls.

    PubMed

    Thewissen, J G; Williams, E M; Roe, L J; Hussain, S T

    2001-09-20

    Modern members of the mammalian order Cetacea (whales, dolphins and porpoises) are obligate aquatic swimmers that are highly distinctive in morphology, lacking hair and hind limbs, and having flippers, flukes, and a streamlined body. Eocene fossils document much of cetaceans' land-to-water transition, but, until now, the most primitive representative for which a skeleton was known was clearly amphibious and lived in coastal environments. Here we report on the skeletons of two early Eocene pakicetid cetaceans, the fox-sized Ichthyolestes pinfoldi, and the wolf-sized Pakicetus attocki. Their skeletons also elucidate the relationships of cetaceans to other mammals. Morphological cladistic analyses have shown cetaceans to be most closely related to one or more mesonychians, a group of extinct, archaic ungulates, but molecular analyses have indicated that they are the sister group to hippopotamids. Our cladistic analysis indicates that cetaceans are more closely related to artiodactyls than to any mesonychian. Cetaceans are not the sister group to (any) mesonychians, nor to hippopotamids. Our analysis stops short of identifying any particular artiodactyl family as the cetacean sister group and supports monophyly of artiodactyls.

  19. Molecular phylogenetics, diversification, and systematics of Tibicen Latreille 1825 and allied cicadas of the tribe Cryptotympanini, with three new genera and emphasis on species from the USA and Canada
    (Hemiptera: Auchenorrhyncha: Cicadidae).

    PubMed

    Hill, Kathy B R; Marshall, David C; Moulds, Maxwell S; Simon, Chris

    2015-07-10

    North America has a diverse cicada fauna with multiple genera from all three Cicadidae subfamilies, yet molecular phylogenetic analyses have been completed only for the well-studied periodical cicadas (Magicicada Davis). The genus Tibicen Latreille, a large group of charismatic species, is in need of such work because morphological patterns suggest multiple groups with complicated relationships to other genera in the tribe Cryptotympanini. In this paper we present a molecular phylogenetic analysis, based on mitochondrial and nuclear DNA, of 35 of the 38 extant USA species and subspecies of the genus Tibicen together with their North American tribal allies (Cornuplura Davis, Cacama Davis), selected Tibicen species from Eurasia, and representatives of other Eurasian and Pacific cryptotympanine genera. This tree shows that Tibicen contains several well-supported clades, one predominating in eastern and central North America and related to Cryptotympana Stål and Raiateana Boulard, another in western North America related to Cacama and Cornuplura, and at least two clades in Eurasia. We also present a morphological cladistic analysis of Tibicen and its close allies based on 27 characters. Character states identified in the cladistic analysis define three new genera, two for North American taxa (Hadoa gen. n. and Neotibicen gen. n.) including several Mexican species, and one for Asian species (Subsolanus gen. n.). Using relaxed molecular clocks and literature-derived mtDNA rate estimates, we estimate the timeframe of diversification of Tibicen clades and find that intergeneric divergence has occurred since the late Eocene, with most extant species within the former Tibicen originating after the mid-Miocene. We review patterns of ecology, behavior, and geography among Tibicen clades in light of the phylogenetic results and note that the study of these insects is still in its early stages. Some Mexican species formerly placed in Tibicen are here transferred to Diceroprocta
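
    The divergence-time estimates above rest on converting genetic distances into ages with a substitution rate. As a simplified strict-clock illustration (not the relaxed-clock models actually used in the study), the relation is:

    ```latex
    % With pairwise sequence divergence d and a per-lineage rate r, the time
    % since the split of two lineages is
    t \;=\; \frac{d}{2r},
    \qquad\text{e.g. } d = 0.04,\ r = 0.01\ \text{substitutions/site/Myr}
    \;\Rightarrow\; t = 2\ \text{Myr}.
    ```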

  20. Diplocephalus komposchi n. sp., a new species of erigonine spider (Araneae, Linyphiidae) from Austria.

    PubMed

    Milasowszky, Norbert; Bauder, Julia; Hepner, Martin

    2017-05-16

    The erigonine cladistic analyses of Hormiga (2000) and Miller & Hormiga (2004) demonstrated unambiguous support for a sister-taxon relationship between the genera Diplocephalus and Savignia. These genera, in addition to others, are commonly placed in the Savignia-group. Although the Savignia-group is not monophyletic as it was originally circumscribed by Millidge (1977), it contains a monophyletic core of genera that has been supported in various cladistic analyses, starting with Hormiga (2000). According to the most recent phylogenetic study (Frick et al. 2010), a clade within the Savignia-group included Diplocephalus along with Araeoncus, Dicymbium, Erigonella, Glyphesis and Savignia. Frick et al. (2010) included three Diplocephalus species - cristatus, latifrons and picinus - in their cladistic analyses. While D. latifrons and D. picinus were found to be the most basal species of the Savignia-group, D. cristatus was the most distal one.

  1. Cladistic biogeography of Juglans (Juglandaceae) based on chloroplast DNA intergenic spacer sequences

    USDA-ARS?s Scientific Manuscript database

    The phylogenetic utility of sequence variation from five chloroplast DNA intergenic spacer (IGS) regions: trnT-trnF, psbA-trnH, atpB-rbcL, trnV-16S rRNA, and trnS-trnfM was examined in the genus Juglans. A total of seventeen taxa representing the four sections within Juglans and an outgroup taxon, ...

  2. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    PubMed

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy.
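
    The branch slide distance itself is not defined in this record, but the "other tree comparison methods" it is contrasted with include the Robinson-Foulds distance, the count of bipartitions present in one unrooted tree but not the other. A minimal from-scratch sketch on toy nested-tuple trees is given below; the taxa are hypothetical and only fully resolved trees are handled.

    ```python
    # A minimal sketch of the Robinson-Foulds (symmetric-difference) distance
    # between two fully resolved trees given as nested tuples of leaf names.

    def leaves(node):
        """Set of leaf names under a nested-tuple node."""
        return {node} if isinstance(node, str) else leaves(node[0]) | leaves(node[1])

    def splits(tree):
        """Non-trivial bipartitions implied by the internal edges of the tree."""
        all_taxa = frozenset(leaves(tree))
        found = set()
        def walk(node):
            if isinstance(node, str):
                return
            clade = frozenset(leaves(node))
            if 1 < len(clade) < len(all_taxa) - 1:      # both sides have >= 2 taxa
                found.add(frozenset({clade, all_taxa - clade}))
            walk(node[0])
            walk(node[1])
        walk(tree)
        return found

    def robinson_foulds(tree_a, tree_b):
        """Number of bipartitions found in one tree but not the other."""
        return len(splits(tree_a) ^ splits(tree_b))

    t1 = ((("A", "B"), "C"), ("D", "E"))
    t2 = ((("A", "C"), "B"), ("D", "E"))
    print(robinson_foulds(t1, t2))   # 2
    ```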

  3. Primates and their pinworm parasites: the cameron hypothesis revisited.

    PubMed

    Hugot, J P

    1999-09-01

    A morphologically based cladistic analysis of the Enterobiinae, which includes most of the Oxyuridae parasitic in Primates, allows a reevaluation of Cameron's hypothesis of close coevolution with cospeciation between hosts and parasites. Each of the three genera separated in the Enterobiinae fits with one of the suborders defined in Primates: Lemuricola with the Strepsirhini, Trypanoxyuris with the Platyrrhini, and Enterobius with the Catarrhini. Inside each of the three main groups, the subdivisions observed in the parasite tree also fit with many of the subdivisions generally accepted within the Primate order. These results confirm the subgroups previously described in the subfamily and support Cameron's hypothesis in its aspect of association by descent. Although the classification of the Enterobiinae generally closely parallels the classification of Primates, several discordances also are observed. These are discussed case by case, with use of computed reconstruction scenarios. Given that the occurrence of the same pinworm species as a parasite of several congeneric host species is not the general pattern, and given that several occurrences also are observed in which the speciations of the parasites describe a more complex network, Cameron's hypothesis of a slower rhythm of speciation in the parasites can be considered partly refuted. The presence of two genera parasitic on squirrels in a family that contains primarily primate parasites also is discussed. The cladistic analysis does not support close relationships between the squirrel parasites and suggests an early separation from the Enterobiinae for the first (Xeroxyuris), and a later host-switching from the Platyrrhini to the squirrels for the second (Rodentoxyuris).

  4. Towards a molecular taxonomic key of the Aurantioideae subfamily using chloroplastic SNP diagnostic markers of the main clades genotyped by competitive allele-specific PCR.

    PubMed

    Oueslati, Amel; Ollitrault, Frederique; Baraket, Ghada; Salhi-Hannachi, Amel; Navarro, Luis; Ollitrault, Patrick

    2016-08-18

    Chloroplast DNA is a primary source of molecular variation for phylogenetic analysis of photosynthetic eukaryotes. However, the sequencing and analysis of multiple chloroplastic regions is difficult to apply to large collections or large samples of natural populations. The objective of our work was to demonstrate that a molecular taxonomic key based on an easy, scalable and low-cost genotyping method could be developed from a set of Single Nucleotide Polymorphisms (SNPs) diagnostic of well-established clades. It was applied to the Aurantioideae subfamily, the largest group of the Rutaceae family that includes the cultivated citrus species. The publicly available nucleotide sequences of eight plastid genomic regions were compared for 79 accessions of the Aurantioideae subfamily to search for SNPs revealing taxonomic differentiation at the inter-tribe, inter-subtribe, inter-genus and interspecific levels. Diagnostic SNPs (DSNPs) were found for 46 of the 54 clade levels analysed. Forty DSNPs were selected to develop KASPar markers and their taxonomic value was tested by genotyping 108 accessions of the Aurantioideae subfamily. Twenty-seven markers diagnostic of 24 clades were validated and they displayed a very high rate of transferability in the Aurantioideae subfamily (only 1.2 % of missing data on average). The UPGMA from the validated markers produced a cladistic organisation that was highly coherent with the previous phylogenetic analysis based on the sequence data of the eight plastid regions. In particular, the monophyletic origin of the "true citrus" genera plus Oxanthera was validated. However, some clarification remains necessary regarding the organisation of the other wild species of the Citreae tribe. We validated the concept that with well-established clades, DSNPs can be selected and efficiently transformed into competitive allele-specific PCR markers (KASPar method) allowing cost-effective highly efficient cladistic analysis in large collections at
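
    The UPGMA step described above can be sketched with standard clustering tools: a distance matrix over SNP genotypes followed by average-linkage clustering. The accession names and 0/1 genotype calls below are invented, not the validated KASPar markers of the study.

    ```python
    # A minimal sketch of building a UPGMA tree from diagnostic SNP genotypes:
    # Hamming distances between accessions, then average-linkage clustering.

    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, dendrogram

    accessions = ["Citrus_1", "Citrus_2", "Fortunella_1", "Poncirus_1"]  # hypothetical
    genotypes = np.array([          # one row per accession, one column per SNP marker
        [0, 0, 1, 1, 0, 1],
        [0, 0, 1, 1, 1, 1],
        [1, 0, 0, 1, 0, 0],
        [1, 1, 0, 0, 0, 0],
    ])

    dist = pdist(genotypes, metric="hamming")   # fraction of mismatching markers
    tree = linkage(dist, method="average")      # UPGMA = average linkage
    print(tree)
    # dendrogram(tree, labels=accessions)       # uncomment to draw with matplotlib
    ```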

  5. Three-dimensional reconstruction and the phylogeny of extinct chelicerate orders

    PubMed Central

    Dunlop, Jason

    2014-01-01

    Arachnids are an important group of arthropods. They are: diverse and abundant; a major constituent of many terrestrial ecosystems; and possess a deep and extensive fossil record. In recent years a number of exceptionally preserved arachnid fossils have been investigated using tomography and associated techniques, providing valuable insights into their morphology. Here we use X-ray microtomography to reconstruct members of two extinct arachnid orders. In the Haptopoda, we demonstrate the presence of ‘clasp-knife’ chelicerae, and our novel redescription of a member of the Phalangiotarbida highlights leg details, but fails to resolve chelicerae in the group due to their small size. As a result of these reconstructions, tomographic studies of three-dimensionally preserved fossils now exist for three of the four extinct orders, and for fossil representatives of several extant ones. Such studies constitute a valuable source of high fidelity data for constructing phylogenies. To illustrate this, here we present a cladistic analysis of the chelicerates to accompany these reconstructions. This is based on a previously published matrix, expanded to include fossil taxa and relevant characters, and allows us to: cladistically place the extinct arachnid orders; explicitly test some earlier hypotheses from the literature; and demonstrate that the addition of fossils to phylogenetic analyses can have broad implications. Phylogenies based on chelicerate morphology—in contrast to molecular studies—have achieved elements of consensus in recent years. Our work suggests that these results are not robust to the addition of novel characters or fossil taxa. Hypotheses surrounding chelicerate phylogeny remain in a state of flux. PMID:25405073

  6. Well, what about intraspecific variation? Taxonomic and phylogenetic characters in the genus Synoeca de Saussure (Hymenoptera, Vespidae).

    PubMed

    Carpenter, James M; Andena, Sergio R; Noll, Fernando B; Wenzel, John W

    2013-01-01

    Cely and Sarmiento (2011) took issue with the cladistic analysis of relationships among species of the genus Synoeca by Andena et al. (2009a), and presented a reanalysis. They claimed that intraspecific variation in the genus is meaningful, and proper consideration yields a conclusion different from that of Andena et al. Both their critique and reanalysis are vitiated by numerous errors, as is shown in the present paper.

  7. EWET: Data collection and interface for the genetic analysis of Echinococcus multilocularis based on EmsB microsatellite.

    PubMed

    Knapp, Jenny; Damy, Sylvie; Brillaud, Jonathan; Tissot, Jean-Daniel; Navion, Jérémy; Mélior, Raphael; Afonso, Eve; Hormaz, Vanessa; Gottstein, Bruno; Umhang, Gérald; Casulli, Adriano; Dadeau, Frédéric; Millon, Laurence; Raoul, Francis

    2017-01-01

    Evolution and dispersion history on Earth of organisms can best be studied through biological markers in molecular epidemiological studies. The biological diversity of the cestode Echinococcus multilocularis was investigated in different cladistic approaches. First the morphological aspects were explored in connection with its ecology. More recently, molecular aspects were investigated to better understand the nature of the variations observed among isolates. The study of the tandemly repeated multilocus microsatellite EmsB allowed us to attain a high genetic diversity level where other classic markers have failed. Since 2006, EmsB data have been collected on specimens from various endemic foci of the parasite in Europe (in historic and newly endemic areas), Asia (China, Japan and Kyrgyzstan), and North America (Canada and Alaska). Biological data on the isolates and metadata were also recorded (e.g. host, geographical location, EmsB analysis, citation in the literature). In order to make available the data set of 1,166 isolates from classic and aberrant domestic and wild animal hosts (larval lesions and adult worms) and from human origin, an open web access interface, written in PHP and connected to a PostgreSQL database, was developed in the EmsB Website for the Echinococcus Typing (EWET) project. It allows researchers to access data collection, perform genetic analyses online (e.g. defining the genetic distance between their own samples and the samples in the database), consult distribution maps of EmsB profiles, and record and share their new EmsB genotyping data. In order to standardize the EmsB analyses performed in the different laboratories throughout the world, a calibrator was developed. The final aim of this project was to gather and arrange available data to permit a better understanding of the dispersion and transmission patterns of the parasite among definitive and intermediate hosts, in order to organize control strategies on the ground.
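
    The online genetic-distance computation mentioned above presumably compares EmsB fragment profiles between samples. A minimal sketch of one plausible comparison, Euclidean distance between normalized peak-height vectors, is given below; the allele sizes, peak heights and sample names are invented, and this is not the EWET pipeline itself.

    ```python
    # A minimal sketch of comparing EmsB-style fragment profiles: each sample is
    # reduced to a vector of normalized peak heights over shared size bins, and
    # samples are compared by Euclidean distance. All values are hypothetical.

    import numpy as np

    allele_sizes = [209, 211, 213, 215, 217]          # shared fragment-size bins

    def normalize(profile):
        """Scale raw peak heights so they sum to 1."""
        v = np.array([profile.get(size, 0.0) for size in allele_sizes], dtype=float)
        return v / v.sum()

    sample_a = {209: 1200.0, 211: 800.0, 215: 400.0}
    sample_b = {209: 1000.0, 213: 900.0, 215: 500.0}

    distance = np.linalg.norm(normalize(sample_a) - normalize(sample_b))
    print(f"Euclidean distance between profiles: {distance:.3f}")
    ```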

  8. EWET: Data collection and interface for the genetic analysis of Echinococcus multilocularis based on EmsB microsatellite

    PubMed Central

    Damy, Sylvie; Brillaud, Jonathan; Tissot, Jean-Daniel; Navion, Jérémy; Mélior, Raphael; Afonso, Eve; Hormaz, Vanessa; Gottstein, Bruno; Umhang, Gérald; Casulli, Adriano; Dadeau, Frédéric; Millon, Laurence; Raoul, Francis

    2017-01-01

    Evolution and dispersion history on Earth of organisms can best be studied through biological markers in molecular epidemiological studies. The biological diversity of the cestode Echinococcus multilocularis was investigated in different cladistic approaches. First the morphological aspects were explored in connection with its ecology. More recently, molecular aspects were investigated to better understand the nature of the variations observed among isolates. The study of the tandemly repeated multilocus microsatellite EmsB allowed us to attain a high genetic diversity level where other classic markers have failed. Since 2006, EmsB data have been collected on specimens from various endemic foci of the parasite in Europe (in historic and newly endemic areas), Asia (China, Japan and Kyrgyzstan), and North America (Canada and Alaska). Biological data on the isolates and metadata were also recorded (e.g. host, geographical location, EmsB analysis, citation in the literature). In order to make available the data set of 1,166 isolates from classic and aberrant domestic and wild animal hosts (larval lesions and adult worms) and from human origin, an open web access interface, written in PHP and connected to a PostgreSQL database, was developed in the EmsB Website for the Echinococcus Typing (EWET) project. It allows researchers to access data collection, perform genetic analyses online (e.g. defining the genetic distance between their own samples and the samples in the database), consult distribution maps of EmsB profiles, and record and share their new EmsB genotyping data. In order to standardize the EmsB analyses performed in the different laboratories throughout the world, a calibrator was developed. The final aim of this project was to gather and arrange available data to permit a better understanding of the dispersion and transmission patterns of the parasite among definitive and intermediate hosts, in order to organize control strategies on the ground. PMID:28972978

  9. Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1981-04-01

    Knowledge-Based Image Analysis. George C. Stockman, Barbara A. Lambird, David Lavine, Laveen N. Kanal. Report ETL-0258. Keywords: extraction, verification, region classification, pattern recognition, image analysis.

  10. Peptidomic analysis of the extensive array of host-defense peptides in skin secretions of the dodecaploid frog Xenopus ruwenzoriensis (Pipidae).

    PubMed

    Coquet, Laurent; Kolodziejek, Jolanta; Jouenne, Thierry; Nowotny, Norbert; King, Jay D; Conlon, J Michael

    2016-09-01

    The Uganda clawed frog Xenopus ruwenzoriensis with a karyotype of 2n=108 is one of the very few vertebrates with dodecaploid status. Peptidomic analysis of norepinephrine-stimulated skin secretions from this species led to the isolation and structural characterization of 23 host-defense peptides belonging to the following families: magainin (3 peptides), peptide glycine-leucine-amide (PGLa; 6 peptides), xenopsin precursor fragment (XPF; 3 peptides), caerulein precursor fragment (CPF; 8 peptides), and caerulein precursor fragment-related peptide (CPF-RP; 3 peptides). In addition, the secretions contained caerulein, identical to the peptide from Xenopus laevis, and two peptides that were identified as members of the trefoil factor family (TFF). The data indicate that silencing of the host-defense peptide genes following polyploidization has been appreciable and non-uniform. Consistent with data derived from comparison of nucleotide sequences of mitochondrial and nuclear genes, cladistic analyses based upon the primary structures of the host-defense peptides provide support for an evolutionary scenario in which X. ruwenzoriensis arose from an allopolyploidization event involving an octoploid ancestor of the present-day frogs belonging to the Xenopus amieti species group and a tetraploid ancestor of Xenopus pygmaeus. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  12. Sequence analysis of serum albumins reveals the molecular evolution of ligand recognition properties.

    PubMed

    Fanali, Gabriella; Ascenzi, Paolo; Bernardi, Giorgio; Fasano, Mauro

    2012-01-01

    Serum albumin (SA) is a circulating protein providing a depot and carrier for many endogenous and exogenous compounds. At least seven major binding sites have been identified by structural and functional investigations, mainly in human SA. SA is conserved in vertebrates, with at least 49 entries in protein sequence databases. The multiple sequence analysis of this set of entries leads to the definition of a cladistic tree for the molecular evolution of SA orthologs in vertebrates, thus showing the clustering of the considered species, with lamprey SAs (Lethenteron japonicum and Petromyzon marinus) in a separate outgroup. Sequence analysis aimed at searching for conserved domains revealed that most SA sequences are made up of three repeated domains (about 600 residues), as extensively characterized for human SA. In contrast, lamprey SAs are giant proteins (about 1400 residues) comprising seven repeated domains. The phylogenetic analysis of the SA family reveals a stringent correlation with the taxonomic classification of the species available in sequence databases. A focused inspection of the sequences of ligand binding sites in SA revealed that in all sites most residues involved in ligand binding are conserved, although the versatility towards different ligands could be peculiar to higher organisms. Moreover, the analysis of molecular links between the different sites suggests that allosteric modulation mechanisms could be restricted to higher vertebrates.

  13. Phylogeny of the Acanthocephala based on morphological characters.

    PubMed

    Monks, S

    2001-02-01

    Only four previous studies of relationships among acanthocephalans have included cladistic analyses, and knowledge of the phylogeny of the group has not kept pace with that of other taxa. The purpose of this study is to provide a more comprehensive analysis of the phylogenetic relationships among members of the phylum Acanthocephala using morphological characters. The most appropriate outgroups are those that share a common early cell-cleavage pattern (polar placement of centrioles), such as the Rotifera, rather than the Priapulida (meridional placement of centrioles) to provide character polarity based on common ancestry rather than a general similarity likely due to convergence of body shapes. The phylogeny of 22 species of the Acanthocephala was evaluated based on 138 binary and multistate characters derived from comparative morphological and ontogenetic studies. Three assumptions of cement gland structure were tested: (i) the plesiomorphic type of cement glands in the Rotifera, as the sister group, is undetermined; (ii) non-syncytial cement glands are plesiomorphic; and (iii) syncytial cement glands are plesiomorphic. The results were used to test an early move of Tegorhynchus pectinarius to Koronacantha and to evaluate the relationship between Tegorhynchus and Illiosentis. Analysis of the data-set for each of these assumptions of cement gland structure produced the same single most parsimonious tree topology. Using Assumptions i and ii for the cement glands, the trees were the same length (length = 404 steps, CI = 0.545, CIX = 0.517, HI = 0.455, HIX = 0.483, RI = 0.670, RC = 0.365). Using Assumption iii, the tree was three steps longer (length = 408 steps, CI = 0.539, CIX = 0.512, HI = 0.461, HIX = 0.488, RI = 0.665, RC = 0.359). The tree indicates that the Palaeacanthocephala and Eoacanthocephala both are monophyletic and are sister taxa. The members of the Archiacanthocephala are basal to the other two clades, but do not themselves form a clade. The results

  14. Statistical parsimony networks and species assemblages in Cephalotrichid nemerteans (nemertea).

    PubMed

    Chen, Haixia; Strand, Malin; Norenburg, Jon L; Sun, Shichun; Kajihara, Hiroshi; Chernyshev, Alexey V; Maslakova, Svetlana A; Sundberg, Per

    2010-09-21

    It has been suggested that statistical parsimony network analysis could be used to get an indication of the species represented in a set of nucleotide data, and the approach has been used to discuss species boundaries in some taxa. Based on 635 base pairs of the mitochondrial protein-coding gene cytochrome c oxidase I (COI), we analyzed 152 nemertean specimens using statistical parsimony network analysis with the connection probability set to 95%. The analysis revealed 15 distinct networks together with seven singletons. Statistical parsimony yielded three networks supporting the species status of Cephalothrix rufifrons, C. major and C. spiralis as they have currently been delineated by morphological characters and geographical location. Many other networks contained haplotypes from nearby geographical locations. The cladistic structure recovered by maximum likelihood analysis overall supported the network analysis, but indicated a false positive result where subnetworks should have been connected into one network/species. This is probably caused by undersampling of the intraspecific haplotype diversity. Statistical parsimony network analysis provides a rapid and useful tool for detecting possible undescribed/cryptic species among cephalotrichid nemerteans based on the COI gene. It should be combined with phylogenetic analysis to get indications of false positive results, i.e., subnetworks that would have been connected with more extensive haplotype sampling.
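
    As an illustration only: the full statistical parsimony (TCS) procedure derives the connection limit from the 95% probability criterion, which is not reproduced here. The sketch below shows just the simpler downstream step, linking haplotypes whose pairwise differences fall within a fixed connection limit and reading the resulting subnetworks off as connected components; the sequences and the limit value are placeholders.

      # Minimal sketch: group COI haplotypes into subnetworks by linking pairs
      # whose pairwise differences fall at or below a fixed connection limit.
      # True statistical parsimony derives that limit from the 95% criterion;
      # here it is a hand-picked placeholder.
      import itertools
      import networkx as nx

      def hamming(seq1, seq2):
          return sum(c1 != c2 for c1, c2 in zip(seq1, seq2))

      def parsimony_networks(haplotypes, connection_limit):
          graph = nx.Graph()
          graph.add_nodes_from(haplotypes)
          for h1, h2 in itertools.combinations(haplotypes, 2):
              if hamming(haplotypes[h1], haplotypes[h2]) <= connection_limit:
                  graph.add_edge(h1, h2)
          return [sorted(component) for component in nx.connected_components(graph)]

      # Hypothetical 10-bp haplotypes; a real analysis would use 635 bp of COI.
      haps = {"A": "ACGTACGTAC", "B": "ACGTACGTAT", "C": "TTGTACGAAC"}
      print(parsimony_networks(haps, connection_limit=2))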

  15. A phylogenetic and biogeographic analysis of the genera of Spirorchinae (Digenea: Spirorchidae) parasitic in freshwater turtles.

    PubMed

    Platt, T R

    1992-08-01

    Cladistic analysis of the freshwater genera of Spirorchinae (Schistosomatoidea: Spirorchidae sensu Yamaguti, 1971) plus Haematotrema Stunkard, 1923, and Aphanospirorchis Platt, 1990, was completed. The Spirorchinae were considered monophyletic based on synapomorphies of the esophagus. Three lineages, Spirhapalum (Europe/Asia), Plasmiorchis + Hemiorchis (India), and Spirorchis + Henotosoma + Haematotrema + Aphanospirorchis (North America), were identified. Nelsen consensus analysis was used as the basis for recognizing 3 valid monophyletic genera: Spirhapalum, Plasmiorchis, and Spirorchis. Hapalotrematinae sensu Smith, 1972 (e.g., Hapalorhynchus/Coeuritrema), is considered the most plesiomorphic group of spirorchids. Freshwater representatives of the hapalotrematines have been reported from 7 of 12 extant turtle families, including the relatively primitive Pelomedusidae (Pleurodira), and exhibit a worldwide distribution. It is hypothesized that this group arose in the early Triassic period, prior to the breakup of Pangea. Thus, it represents a primitive lineage that was present during the diversification of turtle lineages in the mid-Mesozoic era. Spirorchinae arose later (late Cretaceous period) as a Laurasian component parasitic in the more recent pond turtles (Emydidae + Bataguridae). Species of Spirhapalum retained a relatively plesiomorphic distribution, and they are found in emydids (Europe) and batagurids (Asia). Species of Spirorchis arose and diversified with North American emydids following the separation of North America and Europe in the late Cretaceous or early Tertiary periods. Species of Plasmiorchis are hypothesized to be derived from Asian ancestors that accompanied the colonization of India by Asian batagurids during the early Tertiary period. The presence of Spirorchis species in snapping turtles (Chelydridae/North America) and of Plasmiorchis species in Indian soft-shelled turtles (Trionychidae) are considered independent colonization events.

  16. Morphological characters are compatible with mitogenomic data in resolving the phylogeny of nymphalid butterflies (lepidoptera: papilionoidea: nymphalidae).

    PubMed

    Shi, Qing-Hui; Sun, Xiao-Yan; Wang, Yun-Liang; Hao, Jia-Sheng; Yang, Qun

    2015-01-01

    Nymphalidae is the largest family of butterflies, yet its phylogenetic relationships have not been adequately resolved to date. The mitochondrial genomes (mitogenomes) of 11 new nymphalid species were reported and a comparative mitogenomic analysis was conducted together with the other 22 available nymphalid mitogenomes. A phylogenetic analysis of the 33 species from all 13 currently recognized nymphalid subfamilies was performed based on the mitogenomic data set, with three Lycaenidae species as the outgroups. The mitogenome comparison showed that the eleven new mitogenomes were similar to those of other butterflies in gene content and order. The reconstructed phylogenetic trees reveal that the nymphalids are made up of five major clades (the nymphaline, heliconiine, satyrine, danaine and libytheine clades), with a sister relationship between the subfamilies Cyrestinae and Biblidinae, and most likely between the subfamilies Morphinae and Satyrinae. This whole-mitogenome-based phylogeny is generally congruent with those of former studies based on nuclear-gene and mitogenomic analyses, but differs considerably from the result of morphological cladistic analysis; for example, the basal position of Libytheinae in the morphological phylogeny is not confirmed in molecular studies. However, we found that the mitogenomic phylogeny established herein is compatible with selected morphological characters (including developmental and adult morpho-characters).

  17. Team-Based Care: A Concept Analysis.

    PubMed

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of a common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified for three aspects: (a) the patient, (b) the healthcare professional, and (c) the healthcare organization. This concept analysis helps to better understand the characteristics of team-based care in clinical practice as well as to promote the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  18. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George

    2013-01-01

    Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
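
    The record describes Zernike moment descriptors computed per hand segment and matched against enrollment templates, without giving implementation details. The sketch below is not the patented pipeline; it only illustrates the descriptor-and-match idea using the Zernike moment implementation from the mahotas library on a synthetic binary segment, with a made-up distance threshold.

      # Minimal sketch, assuming mahotas' Zernike moments stand in for the
      # descriptor stage described above; the segment, radius, and matching
      # threshold are all made up.
      import numpy as np
      from mahotas.features import zernike_moments

      def zernike_descriptor(binary_segment, radius, degree=8):
          """Rotation-invariant Zernike moment magnitudes of one hand segment."""
          return zernike_moments(binary_segment, radius, degree=degree)

      def matches(probe, template, threshold=0.1):
          """Accept if the descriptors are close enough (toy decision rule)."""
          return float(np.linalg.norm(probe - template)) < threshold

      # Synthetic "finger" segment: a filled disc instead of a real segmented image.
      yy, xx = np.mgrid[:128, :128]
      segment = ((yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2).astype(float)

      probe = zernike_descriptor(segment, radius=60)
      template = zernike_descriptor(segment, radius=60)
      print(matches(probe, template))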

  19. A New Morphological Phylogeny of the Ophiuroidea (Echinodermata) Accords with Molecular Evidence and Renders Microfossils Accessible for Cladistics

    PubMed Central

    Thuy, Ben; Stöhr, Sabine

    2016-01-01

    Ophiuroid systematics is currently in a state of upheaval, with recent molecular estimates fundamentally clashing with traditional, morphology-based classifications. Here, we attempt a long overdue recast of a morphological phylogeny estimate of the Ophiuroidea taking into account latest insights on microstructural features of the arm skeleton. Our final estimate is based on a total of 45 ingroup taxa, including 41 recent species covering the full range of extant ophiuroid higher taxon diversity and 4 fossil species known from exceptionally preserved material, and the Lower Carboniferous Aganaster gregarius as the outgroup. A total of 130 characters were scored directly on specimens. The tree resulting from the Bayesian inference analysis of the full data matrix is reasonably well resolved and well supported, and refutes all previous classifications, with most traditional families discredited as poly- or paraphyletic. In contrast, our tree agrees remarkably well with the latest molecular estimate, thus paving the way towards an integrated new classification of the Ophiuroidea. Among the characters which were qualitatively found to accord best with our tree topology, we selected a list of potential synapomorphies for future formal clade definitions. Furthermore, an analysis with 13 of the ingroup taxa reduced to the lateral arm plate characters produced a tree which was essentially similar to the full dataset tree. This suggests that dissociated lateral arm plates can be analysed in combination with fully known taxa and thus effectively unlocks the extensive record of fossil lateral arm plates for phylogenetic estimates. Finally, the age and position within our tree implies that the ophiuroid crown-group had started to diversify by the Early Triassic. PMID:27227685

  20. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  1. Hybridization in the section Mentha (Lamiaceae) inferred from AFLP markers.

    PubMed

    Gobert, V; Moja, S; Colson, M; Taberlet, P

    2002-12-01

    The amplified fragment length polymorphism (AFLP) method was used to evaluate genetic diversity and to assess genetic relationships within the section Mentha in order to clarify the taxonomy of several interspecific mint hybrids with molecular markers. To this end, the genetic diversity of 62 Mentha accessions from different geographic origins, representing five species and three hybrids, was assessed. Three EcoRI/MseI AFLP primer combinations generated an average of 40 AFLP markers per primer combination, ranging in size from 50 to 500 base pairs (bp). The percentage of polymorphic markers ranged from 50% to 60% across all accessions studied. According to phenetic and cladistic analysis, the 62 mint accessions were grouped into two major clusters. Principal coordinates analysis separated species into well-defined groups, and clear relationships between species and hybrids could be described. Our AFLP analysis supports the taxonomic classification established among Mentha species by conventional (morphological, cytological, and chemical) methods. It allows the assessment of phenetic relationships between the species and the hybrids M. spicata and M. × piperita, widely cultivated throughout the world as a source of menthol, and provides new insights into the subdivision of M. spicata, based for the first time on molecular markers.

  2. Technological variability at Sibudu Cave: The end of Howiesons Poort and reduced mobility strategies after 62,000 years ago

    PubMed Central

    Wadley, Lyn

    2017-01-01

    We evaluate the cultural variation between the youngest Howiesons Poort layer (GR) and the oldest post-Howiesons Poort layers (RB-YA) of Sibudu Cave (KwaZulu-Natal, South Africa). We first conducted a technological analysis, secondly we performed a cladistic study with all the technological traits and, finally, we compare the technological variability with other data from Sibudu (ochre, micromorphology, fauna and plant remains). The synapomorphies of the cladistical analysis show numerous lithic technological changes between the youngest Howiesons Poort and the oldest post-Howiesons Poort layers as previously concluded. However, some technological strategies that are present, yet uncommon, in the Howiesons Poort become abundant in the overlying layers, whereas others that were fundamental to the Howiesons Poort continue, but are poorly represented in the overlying layers. We further show that lithic technological strategies appear and disappear as pulses in the post-Howiesons Poort layers studied. Among the most notable changes in the post-Howiesons Poort layers is the importance of flake production from discoidal knapping methods, the unstandardized retouched pieces and their infrequent representation, and the higher than usual frequency of grindstones. We evaluate various hypotheses to explain the transformation of a Howiesons Poort formal industry to a more ‘expedient’ assemblage. Since no marked environmental changes are contemporary with the technological transformation, a change in residential mobility patterns seems a plausible explanation. This hypothesis is supported by the changes observed in stratigraphy, lithic technology, site management, ochre and firewood collection. PMID:28982148

  3. Technological variability at Sibudu Cave: The end of Howiesons Poort and reduced mobility strategies after 62,000 years ago.

    PubMed

    de la Peña, Paloma; Wadley, Lyn

    2017-01-01

    We evaluate the cultural variation between the youngest Howiesons Poort layer (GR) and the oldest post-Howiesons Poort layers (RB-YA) of Sibudu Cave (KwaZulu-Natal, South Africa). We first conducted a technological analysis, secondly we performed a cladistic study with all the technological traits and, finally, we compare the technological variability with other data from Sibudu (ochre, micromorphology, fauna and plant remains). The synapomorphies of the cladistical analysis show numerous lithic technological changes between the youngest Howiesons Poort and the oldest post-Howiesons Poort layers as previously concluded. However, some technological strategies that are present, yet uncommon, in the Howiesons Poort become abundant in the overlying layers, whereas others that were fundamental to the Howiesons Poort continue, but are poorly represented in the overlying layers. We further show that lithic technological strategies appear and disappear as pulses in the post-Howiesons Poort layers studied. Among the most notable changes in the post-Howiesons Poort layers is the importance of flake production from discoidal knapping methods, the unstandardized retouched pieces and their infrequent representation, and the higher than usual frequency of grindstones. We evaluate various hypotheses to explain the transformation of a Howiesons Poort formal industry to a more 'expedient' assemblage. Since no marked environmental changes are contemporary with the technological transformation, a change in residential mobility patterns seems a plausible explanation. This hypothesis is supported by the changes observed in stratigraphy, lithic technology, site management, ochre and firewood collection.

  4. Morphological and Phytochemical Diversity among Hypericum Species of the Mediterranean Basin

    PubMed Central

    Nürk, Nicolai M.; Crockett, Sara L.

    2012-01-01

    The genus Hypericum L. (St. John’s wort, Hypericaceae) includes more than 450 species that occur in temperate or tropical mountain regions of the world. Monographic work on the genus has resulted in the recognition and description of 36 taxonomic sections, delineated by specific combinations of morphological characteristics and biogeographic distribution. The Mediterranean Basin has been recognized as a hot spot of diversity for the genus Hypericum, and as such is a region in which many endemic species occur. Species belonging to sections distributed in this area of the world display considerable morphological and phytochemical diversity. Results of a cladistic analysis, based on 89 morphological characters that were considered phylogenetically informative, are given here. In addition, a brief overview of morphological characteristics and the distribution of pharmaceutically relevant secondary metabolites for species native to this region of the world are presented. PMID:22662020

  5. Evidence based practice readiness: A concept analysis.

    PubMed

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in the health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as its antecedents and consequences. A Boolean search of PubMed and the Cumulative Index for Nursing and Allied Health Literature was conducted and limited to articles published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, the ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness (nursing, training, equipping, and leadership support) are necessary to achieve evidence based practice readiness. Nurse managers are in the position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction, and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.

  6. A laid-back trip through the Hennigian Forests

    PubMed Central

    2017-01-01

    Background This paper is a comment on the idea of matrix-free Cladistics. Demonstration of this idea’s efficiency is a major goal of the study. Within the proposed framework, the ordinary (phenetic) matrix is necessary only as a “source” of Hennigian trees, not as a primary subject of the analysis. Switching from matrix-based thinking to the matrix-free Cladistic approach clearly reveals that optimizations of the character-state changes are related not to the real processes, but to the form of the data representation. Methods We focused our study on binary data. We wrote a simple Ruby-based script, FORESTER version 1.0, that helps represent a binary matrix as an array of rooted trees (a “Hennigian forest”). The binary representations of the genomic (DNA) data have been made by script 1001. The Average Consensus method as well as the standard Maximum Parsimony (MP) approach has been used to analyze the data. Principal findings The binary matrix may easily be re-written as a set of rooted trees (maximal relationships). The latter might be analyzed by the Average Consensus method. Paradoxically, this method, if applied to the Hennigian forests, can in principle help to identify clades despite the absence of direct evidence from the primary data. Our approach can handle clock-like or non-clock-like matrices, as well as hypothetical, molecular, or morphological data. Discussion Our proposal clearly differs from the numerous phenetic alignment-free techniques for constructing phylogenetic trees. Dealing with relations, not with the actual “data”, also distinguishes our approach from all optimization-based methods, if the optimization is defined as a way to reconstruct the sequences of the character-state changes on a tree, either by the standard alignment-based techniques or by the “direct” alignment-free procedure. We are not viewing our recent framework as an alternative to the three-taxon statement analysis (3TA), but there

  7. The Evolution of Reproduction within Testudinata as Evidenced by the Fossil Record

    NASA Astrophysics Data System (ADS)

    Lawver, Daniel Ryan

    Although known from every continent except Antarctica and having a fossil record ranging from the Middle Jurassic to the Pleistocene, fossil turtle eggs are relatively understudied. In this dissertation I describe four fossil specimens, interpret paleoecology and conduct cladistic analyses in order to investigate the evolution of turtle reproduction. Fossil eggshell descriptions primarily involve analysis by scanning electron and polarized light microscopy, as well as cathodoluminescence to determine the degree of diagenetic alteration. Carapace lengths and gas conductance are estimated in order to investigate the ecology of the adults that produced fossil turtle eggs and clutches, as well as their incubation environments, respectively. Cladistic analyses of turtle egg and reproductive characters permit assessment of the usefulness of these characters for determining phylogenetic relationships of fossil specimens and the evolution of reproduction in turtles. Specimens described here include 1) Testudoolithus oosp. from the Late Cretaceous of Madagascar, 2) a clutch of eggs (some containing late stage embryos and at least one exhibiting multilayer eggshell) from the Late Cretaceous Judith River Formation of Montana and named Testudoolithus zelenitskyae oosp. nov., 3) an egg contained within an adult Basilemys nobilis from the Late Cretaceous Kaiparowits Formation of Utah, and 4) a clutch of Meiolania platyceps eggs from the Pleistocene of Lord Howe Island, Australia. Meiolania platyceps eggs are named Testudoolithus lordhowensis oosp. nov. and provide valuable information on the origin of aragonite eggshell composition and nesting behaviors. Cladistic analyses utilizing egg and reproductive characters are rarely performed on taxa outside of Dinosauria. My analyses demonstrate that morphological data produces poorly resolved trees in which only the clades Adocia and Trionychia are resolved and all other turtles form a large polytomy. However, when combined with

  8. Engineering Analysis Using a Web-based Protocol

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.; Claus, Russell W.

    2002-01-01

    This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
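
    The abstract describes encapsulating engineering analysis data in XML for storage and retrieval from a browser, without showing the schema. The following minimal sketch, using only the Python standard library, illustrates round-tripping a set of analysis inputs through XML; the element and attribute names are hypothetical and are not the LAPIN schema.

      # Minimal sketch of encapsulating analysis inputs as XML; element and
      # attribute names are made up, not the schema used by the framework above.
      import xml.etree.ElementTree as ET

      def to_xml(case_name, parameters):
          root = ET.Element("analysisCase", name=case_name)
          for key, value in parameters.items():
              ET.SubElement(root, "parameter", name=key).text = str(value)
          return ET.tostring(root, encoding="unicode")

      def from_xml(text):
          root = ET.fromstring(text)
          return {p.get("name"): float(p.text) for p in root.findall("parameter")}

      doc = to_xml("inlet_run_01", {"mach": 5.0, "altitude_m": 20000.0})
      print(doc)
      print(from_xml(doc))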

  9. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE PAGES

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    2016-09-12

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
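
    The DSA index function described above extends ordinary variance-based sensitivity analysis; the abstract does not give its implementation. As a baseline illustration only, the sketch below computes standard first-order and total Sobol indices with the SALib package (module paths may vary slightly across SALib versions); the toy model and bounds are placeholders, not the jetliner design problem.

      # Minimal sketch of ordinary variance-based (Sobol) sensitivity indices
      # with SALib; DSA additionally treats the amount of input-uncertainty
      # reduction as a variable, which is not shown here. Model and bounds are toy.
      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      problem = {
          "num_vars": 3,
          "names": ["span", "sweep", "thickness"],
          "bounds": [[30.0, 40.0], [20.0, 35.0], [0.08, 0.14]],
      }

      def toy_model(x):
          # Stand-in for an expensive design code.
          return x[:, 0] * np.cos(np.radians(x[:, 1])) + 100.0 * x[:, 2] ** 2

      X = saltelli.sample(problem, 1024)
      Y = toy_model(X)
      Si = sobol.analyze(problem, Y)
      for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
          print(f"{name}: first-order {s1:.3f}, total {st:.3f}")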

  10. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.

  11. A sort of revolution: Systematics and physical anthropology in the 20th century.

    PubMed

    Cartmill, Matt

    2018-04-01

    During the first four decades of the 20th century, a system of ideas about the evolution and systematics of humans and other primates coalesced around the work of George Gaylord Simpson and W. E. Le Gros Clark. Buttressed by the "new physical anthropology" of the 1950s, that system provided an authoritative model, a disciplinary matrix or paradigm, for the practice of that aspect of biological anthropology. The Simpson-Le Gros Clark synthesis began to unravel in the 1960s and collapsed in the 1970s under the onslaught of cladistic systematics. The cladistic "revolution" resembles a paradigm shift of the sort proposed by Thomas Kuhn because it was driven, not by new biological discoveries or theories, but by a change in aesthetics. © 2017 Wiley Periodicals, Inc.

  12. Wavelet-based polarimetry analysis

    NASA Astrophysics Data System (ADS)

    Ezekiel, Soundararajan; Harrity, Kyle; Farag, Waleed; Alford, Mark; Ferris, David; Blasch, Erik

    2014-06-01

    Wavelet transformation has become a cutting-edge and promising approach in the field of image and signal processing. A wavelet is a waveform of effectively limited duration that has an average value of zero. Wavelet analysis is done by breaking up the signal into shifted and scaled versions of the original signal. The key advantage of a wavelet is that it is capable of revealing smaller changes, trends, and breakdown points that are not revealed by other techniques such as Fourier analysis. The phenomenon of polarization has been studied for quite some time and is a very useful tool for target detection and tracking. Long Wave Infrared (LWIR) polarization is beneficial for detecting camouflaged objects and is a useful approach when identifying and distinguishing manmade objects from natural clutter. In addition, the Stokes Polarization Parameters, which are calculated from 0°, 45°, 90°, 135°, right-circular, and left-circular intensity measurements, provide spatial orientations of target features and suppress natural features. In this paper, we propose a wavelet-based polarimetry analysis (WPA) method to analyze Long Wave Infrared polarimetry imagery to discriminate targets such as dismounts and vehicles from background clutter. These parameters can be used for image thresholding and segmentation. Experimental results show the wavelet-based polarimetry analysis is efficient and can be used in a wide range of applications such as change detection, shape extraction, target recognition, and feature-aided tracking.
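
    The Stokes parameters named above follow directly from the six intensity measurements (S0 = I0 + I90, S1 = I0 - I90, S2 = I45 - I135, S3 = Irc - Ilc). The sketch below computes them per pixel and then takes a 2-D wavelet decomposition of the degree-of-linear-polarization image with PyWavelets; the synthetic frames and the choice of wavelet are placeholders, and the fusion/thresholding steps of the WPA method itself are not shown.

      # Minimal sketch: per-pixel Stokes parameters from six intensity images,
      # followed by a 2-D wavelet decomposition of the DoLP image.
      import numpy as np
      import pywt

      def stokes(i0, i45, i90, i135, i_rc, i_lc):
          s0 = i0 + i90
          s1 = i0 - i90
          s2 = i45 - i135
          s3 = i_rc - i_lc
          return s0, s1, s2, s3

      def degree_of_linear_polarization(s0, s1, s2, eps=1e-9):
          return np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)

      # Synthetic frames stand in for real LWIR polarimetric imagery.
      rng = np.random.default_rng(0)
      frames = [rng.random((64, 64)) for _ in range(6)]
      s0, s1, s2, s3 = stokes(*frames)
      dolp = degree_of_linear_polarization(s0, s1, s2)
      coeffs = pywt.wavedec2(dolp, wavelet="db2", level=2)
      print(len(coeffs), coeffs[0].shape)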

  13. The State of Phylogenetic Analysis: Narrow Visions and Simple Answers-Examples from the Diptera (flies).

    PubMed

    Borkent, Art

    2018-01-17

    The order Diptera is remarkably diverse, not only in species but in morphological variation in every life stage, making them excellent candidates for phylogenetic analysis. Such analysis has been hampered by methods that have severely restricted character state interpretation. Morphological-based phylogenies should be based on a deep understanding of the morphology, development and function of character states, and have extensive outgroup comparisons made to determine their polarity. Character states clearly vary in their value for determining phylogenetic relationships and this needs to be studied and utilized. Characters themselves need more explicit discussion, including how some may be developmentally or functionally related to other characters (and potentially not independent indicators of genealogical relationship). The current practice by many, of filling a matrix with poorly understood character states and highly limited outgroup comparisons, is unacceptable if the results are to be a valid reflection of the actual history of the group.Parsimony analysis is not an objective interpretation of phylogenetic relationships when all characters are treated as equal in value. Exact mathematical values applied to characters are entirely arbitrary and are generally used to produce a phylogeny that the author considers as reasonable. Mathematical appraisal of a given node is similarly inconsequential because characters do not have an intrinsic mathematical value. Bremer support, for example, provides values that have no biological reality but provide the pretence of objectivity. Cladists need to focus their attention on testing the validity of each synapomorphy proposed, as the basis for all further phylogenetic interpretation, rather than the testing of differing phylogenies through various comparative programs.Current phylogenetic analyses have come to increasingly depend on DNA sequence-based characters, in spite of their tumultuous history of inconsistent results

  14. Wavelet-based image analysis system for soil texture analysis

    NASA Astrophysics Data System (ADS)

    Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John

    2003-05-01

    Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.

  15. Content-based analysis of news video

    NASA Astrophysics Data System (ADS)

    Yu, Junqing; Zhou, Dongru; Liu, Huayong; Cai, Bo

    2001-09-01

    In this paper, we present a schema for content-based analysis of broadcast news video. First, we separate commercials from news using audiovisual features. Then, we automatically organize news programs into a content hierarchy at various levels of abstraction via effective integration of video, audio, and text data available from the news programs. Based on these news video structure and content analysis technologies, a TV news video library is generated, from which users can retrieve specific news stories according to their demands.

  16. YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.

    PubMed

    Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh

    2015-01-16

    Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, Yersinia pseudotuberculosis, and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of these species can cause occasional infections using distinct mechanisms from the more pathogenic species. With the advances in sequencing technologies, many genomes of Yersinia have been sequenced. However, there is currently no specialized platform to hold the rapidly-growing Yersinia genomic data and to provide analysis tools, particularly for comparative analyses, which are required to provide improved insights into their biology, evolution and pathogenicity. To facilitate the ongoing and future research of Yersinia, especially the generally considered non-pathogenic species, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the Yersinia research community. Hence, we have developed YersiniaBase, a robust and user-friendly Yersinia resource and analysis platform for the analysis of Yersinia genomic data. YersiniaBase has a total of twelve species and 232 genome sequences, of which the majority are Yersinia pestis. In order to smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time searching system in YersiniaBase. Besides incorporating existing tools, which include the JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also has in-house developed tools: (1) Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; (3) YersiniaTree for constructing phylogenetic trees of Yersinia. We ran analyses based on the tools and genomic data in YersiniaBase and the

  17. Event-based analysis of free-living behaviour.

    PubMed

    Granat, Malcolm H

    2012-11-01

    The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.
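
    The abstract describes collapsing free-living activity data into events arranged in a flexible hierarchy, without specifying the implementation. The sketch below is a hypothetical illustration only: it turns a stream of per-epoch activity labels into (label, duration) events and summarizes one branch of a made-up hierarchy; the labels, hierarchy, and thresholds are not from the paper.

      # Minimal sketch of an event-based summary: collapse per-second activity
      # labels into events (runs of the same label) and aggregate at a chosen
      # level of a hypothetical hierarchy.
      from itertools import groupby

      HIERARCHY = {"sitting": "sedentary", "lying": "sedentary",
                   "standing": "upright", "walking": "upright"}

      def to_events(samples, epoch_s=1):
          """Turn per-epoch labels into (label, duration_seconds) events."""
          return [(label, sum(epoch_s for _ in run)) for label, run in groupby(samples)]

      def summarize(events, level):
          """Event count and total time for one branch of the hierarchy."""
          selected = [d for lab, d in events if HIERARCHY.get(lab) == level]
          return {"events": len(selected), "total_s": sum(selected)}

      stream = ["sitting"] * 120 + ["walking"] * 45 + ["standing"] * 30 + ["sitting"] * 60
      events = to_events(stream)
      print(summarize(events, level="sedentary"))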

  18. New craniodental material of Pronothodectes gaoi Fox (Mammalia, "Plesiadapiformes") and relationships among members of Plesiadapidae.

    PubMed

    Boyer, Doug M; Scott, Craig S; Fox, Richard C

    2012-04-01

    Plesiadapidae are a family of Paleogene mammals thought to have phylogenetic affinities with modern Primates. We describe previously unpublished dentitions and the first skull and isolated petrosals of the plesiadapid Pronothodectes gaoi, collected from middle Tiffanian localities of the Paskapoo Formation in Alberta. Other species of Pronothodectes, traditionally considered the most basal members of the Plesiadapidae, occur at earlier, Torrejonian horizons in Montana, Wyoming, and Alberta. Classification of P. gaoi as a species of Pronothodectes has proved controversial; accordingly, we use the newly available samples and the more extensively preserved specimens to re-evaluate the generic affinities of this species. Included in our study are comparisons with craniodental material known for other plesiadapids and plesiadapiforms. Cladistic analysis of craniodental characters is used to assess the hypothesis that P. gaoi and other species in this genus are basal members of the Plesiadapidae. The new dental evidence confirms that P. gaoi lacks derived character states of other plesiadapids except for a variably present fissuring of the m3 hypoconulid. Moreover, several aspects of the cranium seem to be more primitive in P. gaoi (i.e., more like nonplesiadapid plesiadapiforms) than in later occurring plesiadapids, such as Plesiadapis tricuspidens and Plesiadapis cookei. Cladistic analysis of craniodental morphology supports a basal position of P. gaoi among species of Plesiadapidae, with the exception of other species of Pronothodectes. The basicranium of P. gaoi preserves a laterally placed bony canal for the internal carotid neurovascular system, suggesting that this was the ancestral condition for the family. Copyright © 2012 Wiley Periodicals, Inc.

  19. CognitionMaster: an object-based image analysis framework

    PubMed Central

    2013-01-01

    Background Automated image analysis methods are becoming more and more important to extract and quantify image features in microscopy-based biomedical studies, and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and if user-interactivity on the object level is desired. Results In this paper we present an open-source software that facilitates the analysis of content features and object relationships by using objects as the basic processing unit instead of individual pixels. Our approach also enables users without programming knowledge to compose "analysis pipelines" that exploit the object-level approach. We demonstrate the design and use of example pipelines for the immunohistochemistry-based cell proliferation quantification in breast cancer and two-photon fluorescence microscopy data about bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open source software system that offers object-based image analysis. The object-based concept allows for a straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542

  20. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    DTIC Science & Technology

    2001-10-25

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for

  1. Space shuttle booster multi-engine base flow analysis

    NASA Technical Reports Server (NTRS)

    Tang, H. H.; Gardiner, C. R.; Anderson, W. A.; Navickas, J.

    1972-01-01

    A comprehensive review of currently available techniques pertinent to several prominent aspects of the base thermal problem of the space shuttle booster is given along with a brief review of experimental results. A tractable engineering analysis, capable of predicting the power-on base pressure, base heating, and other base thermal environmental conditions, such as base gas temperature, is presented and used for an analysis of various space shuttle booster configurations. The analysis consists of a rational combination of theoretical treatments of the prominent flow interaction phenomena in the base region. These theories consider jet mixing, plume flow, axisymmetric flow effects, base injection, recirculating flow dynamics, and various modes of heat transfer. Such effects as initial boundary layer expansion at the nozzle lip, reattachment, recompression, choked vent flow, and nonisoenergetic mixing processes are included in the analysis. A unified method was developed and programmed to numerically obtain compatible solutions for the various flow field components in both flight and ground test conditions. A preliminary prediction for a 12-engine space shuttle booster base thermal environment was obtained for a typical trajectory history. Theoretical predictions were also obtained for some clustered-engine experimental conditions. Results indicate good agreement between the data and theoretical predictions.

  2. Base compaction specification feasibility analysis.

    DOT National Transportation Integrated Search

    2012-12-01

    The objective of this research is to establish the technical engineering and cost : analysis concepts that will enable WisDOT management to objectively evaluate the : feasibility of switching construction specification philosophies for aggregate base...

  3. Comprehension-Based versus Production-Based Grammar Instruction: A Meta-Analysis of Comparative Studies

    ERIC Educational Resources Information Center

    Shintani, Natsuko; Li, Shaofeng; Ellis, Rod

    2013-01-01

    This article reports a meta-analysis of studies that investigated the relative effectiveness of comprehension-based instruction (CBI) and production-based instruction (PBI). The meta-analysis only included studies that featured a direct comparison of CBI and PBI in order to ensure methodological and statistical robustness. A total of 35 research…

  4. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time-consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
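
    To make the surrogate loop concrete, the sketch below samples an "expensive" function, fits a Gaussian-process surrogate with scikit-learn, and optimizes the cheap surrogate with scipy. The objective, bounds, and sample size are toy stand-ins, not the rocket-injector problem or the specific SBAO choices discussed in the report.

      # Minimal sketch of the surrogate loop: sample, fit a surrogate, optimize it.
      import numpy as np
      from scipy.optimize import minimize
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import Matern

      def expensive_model(x):
          # Stand-in for a high-fidelity simulation.
          return np.sin(3.0 * x[:, 0]) + 0.5 * (x[:, 1] - 0.3) ** 2

      rng = np.random.default_rng(1)
      X_train = rng.uniform(0.0, 1.0, size=(30, 2))      # design of experiments
      y_train = expensive_model(X_train)                  # "expensive" evaluations

      surrogate = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
      surrogate.fit(X_train, y_train)

      result = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
                        x0=np.array([0.5, 0.5]), bounds=[(0.0, 1.0), (0.0, 1.0)])
      print(result.x, result.fun)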

  5. Chapter 11. Community analysis-based methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Y.; Wu, C.H.; Andersen, G.L.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  6. Market-Based Multirobot Coordination: A Survey and Analysis

    DTIC Science & Technology

    2005-04-01

    [Fragmentary DTIC record; recoverable details: technical report CMU-RI-TR-05-13, "Market-Based Multirobot Coordination: A Survey and Analysis," by M. Bernardine Dias, Robert Zlot, Nidhi Kalra, and Anthony Stentz, April 2005. Abstract fragment: "... observe new information about their surroundings. Market-based approaches can often seamlessly incorporate online tasks by auctioning new tasks as they ..."]

  7. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
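
    As a hypothetical illustration of the discrete-choice side of this comparison: part-worth utilities for paired choice tasks can be estimated by fitting a logistic regression on attribute differences (a simple binary-logit formulation). The attributes, sample sizes, and "true" preferences below are made up, and this is not the classification approach proposed by the authors.

      # Minimal sketch: estimate part-worth utilities from paired choice tasks
      # by fitting a logistic regression on attribute differences.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      true_partworths = np.array([1.5, -0.8, 0.4])        # hidden preferences

      # Each task shows two options described by 3 binary attribute levels.
      option_a = rng.integers(0, 2, size=(400, 3)).astype(float)
      option_b = rng.integers(0, 2, size=(400, 3)).astype(float)
      utility_diff = (option_a - option_b) @ true_partworths
      chose_a = (utility_diff + rng.logistic(size=400) > 0).astype(int)

      model = LogisticRegression(fit_intercept=False)
      model.fit(option_a - option_b, chose_a)
      print("estimated part-worths:", model.coef_.ravel())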

  8. Gender-Based Analysis On-Line Dialogue. Final Report.

    ERIC Educational Resources Information Center

    2001

    An online dialogue on gender-based analysis (GBA) was held from February 15 to March 7, 2001. Invitations and a background paper titled "Why Gender-Based Analysis?" were sent to 350 women's organizations and individuals throughout Canada. Efforts were made to ensure that aboriginal and Metis women, visible minority women, and women with…

  9. A methodological investigation of hominoid craniodental morphology and phylogenetics.

    PubMed

    Bjarnason, Alexander; Chamberlain, Andrew T; Lockwood, Charles A

    2011-01-01

    The evolutionary relationships of extant great apes and humans have been largely resolved by molecular studies, yet morphology-based phylogenetic analyses continue to provide conflicting results. In order to further investigate this discrepancy we present bootstrap clade support of morphological data based on two quantitative datasets, one dataset consisting of linear measurements of the whole skull from 5 hominoid genera and the second dataset consisting of 3D landmark data from the temporal bone of 5 hominoid genera, including 11 sub-species. Using similar protocols for both datasets, we were able to 1) compare distance-based phylogenetic methods to cladistic parsimony of quantitative data converted into discrete character states, 2) vary outgroup choice to observe its effect on phylogenetic inference, and 3) analyse male and female data separately to observe the effect of sexual dimorphism on phylogenies. Phylogenetic analysis was sensitive to methodological decisions, particularly outgroup selection, where designation of Pongo as an outgroup and removal of Hylobates resulted in greater congruence with the proposed molecular phylogeny. The performance of distance-based methods also justifies their use in phylogenetic analysis of morphological data. It is clear from our analyses that hominoid phylogenetics ought not to be used as an example of conflict between the morphological and molecular, but as an example of how outgroup and methodological choices can affect the outcome of phylogenetic analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
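
    As an illustration of the distance-based, bootstrap-supported workflow mentioned above (not the study's actual protocol): the sketch below clusters taxa by UPGMA on Euclidean distances and bootstraps characters to estimate support for a single clade, here whether Homo and Pan join first. The measurements are random placeholders, not the craniodental or temporal-bone data used in the paper.

      # Minimal sketch: UPGMA on Euclidean distances plus a character-bootstrap
      # estimate of support for one clade (Homo + Pan forming a cherry).
      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.cluster.hierarchy import linkage

      taxa = ["Homo", "Pan", "Gorilla", "Pongo", "Hylobates"]
      rng = np.random.default_rng(3)
      measurements = rng.normal(size=(5, 40))             # 40 hypothetical characters

      def homo_pan_cherry(data):
          """True if Homo (index 0) and Pan (index 1) are joined directly by UPGMA."""
          merges = linkage(pdist(data), method="average")  # UPGMA on Euclidean distances
          return any({int(a), int(b)} == {0, 1} for a, b, _, _ in merges)

      replicates = 200
      support = np.mean([
          homo_pan_cherry(measurements[:, rng.integers(0, 40, size=40)])
          for _ in range(replicates)
      ])
      print(f"bootstrap support for (Homo, Pan): {support:.2f}")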

  10. An Evidence-Based Videotaped Running Biomechanics Analysis.

    PubMed

    Souza, Richard B

    2016-02-01

    Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices and offers the best opportunity to impact the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Graph-based layout analysis for PDF documents

    NASA Astrophysics Data System (ADS)

    Xu, Canhui; Tang, Zhi; Tao, Xin; Li, Yun; Shi, Cao

    2013-03-01

    To increase the flexibility and enrich the reading experience of e-books on small portable screens, a graph-based method is proposed to perform layout analysis on Portable Document Format (PDF) documents. Digital-born documents have inherent advantages, such as representing text and fractional images in explicit form, which can be straightforwardly exploited. To integrate traditional image-based document analysis with the inherent metadata provided by the PDF parser, the page primitives, including text, image, and path elements, are processed to produce text and non-text layers for separate analysis. The graph-based method operates at the superpixel representation level: page text elements, corresponding to vertices, are used to construct an undirected graph. The Euclidean distance between adjacent vertices is applied in a top-down manner to cut the spanning tree formed by Kruskal's algorithm, and edge orientation is then used in a bottom-up manner to extract text lines from each subtree. Non-textual objects, on the other hand, are segmented by connected component analysis. For each segmented text and non-text composite, a 13-dimensional feature vector is extracted for labelling purposes. Experimental results on selected pages from PDF books are presented.
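
    The distance-based grouping step can be illustrated with a short sketch (Python, below); it is a simplified stand-in, not the authors' implementation. It assumes text-element centroids as vertices, uses scipy's minimum_spanning_tree in place of an explicit Kruskal routine, and applies an illustrative cut threshold; the bottom-up edge-orientation stage is omitted.

        # Hedged sketch: MST over text-element centroids, cut by a distance threshold.
        # Coordinates and the threshold are illustrative assumptions.
        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

        # Centroids of page text elements (toy coordinates in points).
        pts = np.array([[10, 10], [30, 12], [50, 11],    # one text line
                        [12, 200], [33, 205]])           # a separate block

        # Complete graph of Euclidean distances, reduced to a minimum spanning tree.
        dist = squareform(pdist(pts))
        mst = minimum_spanning_tree(dist).toarray()

        # Top-down cut: drop MST edges longer than the threshold; the remaining
        # connected components are candidate text blocks/lines.
        threshold = 60.0
        mst[mst > threshold] = 0.0
        n_blocks, labels = connected_components(mst, directed=False)
        print(n_blocks, labels)   # expected: 2 blocks, labels [0 0 0 1 1]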

  12. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.

  13. Environmental Assessment: General Plan-Based Environmental Impact Analysis Process, Laughlin Air Force Base

    DTIC Science & Technology

    2007-05-01

    BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS, LAUGHLIN AIR FORCE BASE, TEXAS. AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas... [Map figure: Laughlin Air Force Base environmental restoration sites, including AOC01, PS018, WP002, DP008, and WP006.]

  14. Taxonomic and systematic revisions to the North American Nimravidae (Mammalia, Carnivora)

    PubMed Central

    2016-01-01

    The Nimravidae is a family of extinct carnivores commonly referred to as “false saber-tooth cats.” Since their initial discovery, they have prompted difficulty in taxonomic assignment and in determining the number of valid species. Past revisions have examined only a handful of genera, while recent advances in cladistic and morphometric analyses have granted us additional avenues for answering questions about valid nimravid taxa and their phylogenetic relationships. To resolve issues of specific validity, the phylogenetic species concept (PSC) was utilized to maintain consistency in diagnosing valid species, while simultaneously employing character and linear morphometric analyses to confirm the validity of taxa. Determined valid species and taxonomically informative characters were then employed in two differential cladistic analyses to create competing hypotheses of interspecific relationships. The results suggest the validity of twelve species and six monophyletic genera. The first in-depth reviews of Pogonodon and Dinictis returned two valid species for the former (P. platycopis, P. davisi) and only one for the latter (D. felina). The taxonomic validity of Nanosmilus is upheld. Two main clades with substantial support, the Hoplophoneini and Nimravini, were returned in all cladistic analyses, with ambiguous positions relative to these main clades for the European taxa Eofelis, Dinailurictis bonali, and Quercylurus major and the North American taxa Dinictis and Pogonodon. Eusmilus is determined to represent a non-valid genus for North American taxa, suggesting non-validity for Old World nimravid species as well. Finally, Hoplophoneus mentalis is found to be a junior synonym of Hoplophoneus primaevus, while the validity of Hoplophoneus oharrai is reinstated. PMID:26893959

  15. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  16. Using Willie's Acid-Base Box for Blood Gas Analysis

    ERIC Educational Resources Information Center

    Dietz, John R.

    2011-01-01

    In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO[subscript…

  17. Basic gait analysis based on continuous wave radar.

    PubMed

    Zhang, Jun

    2012-09-01

    A gait analysis method based on continuous wave (CW) radar is proposed in this paper. Time-frequency analysis is used to analyze the radar micro-Doppler echo from walking humans, and the relationships between the time-frequency spectrogram and human biological gait are discussed. The methods for extracting the gait parameters from the spectrogram are studied in depth and experiments on more than twenty subjects have been performed to acquire the radar gait data. The gait parameters are calculated and compared. The gait differences between men and women are presented based on the experimental data and extracted features. Gait analysis based on CW radar will provide a new method for clinical diagnosis and therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
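
    The time-frequency step can be sketched briefly (Python, below). The signal is a toy micro-Doppler return, not real radar data, and the parameters are illustrative assumptions; in practice the gait parameters would be read from the periodic structure of the spectrogram.

        # Hedged sketch: spectrogram of a simulated micro-Doppler echo.
        # The signal model and all parameters are illustrative assumptions.
        import numpy as np
        from scipy.signal import spectrogram

        fs = 1000.0                           # sampling rate of the demodulated echo (Hz)
        t = np.arange(0, 4, 1 / fs)

        # Toy return: a torso component plus a periodically modulated limb component.
        torso = np.exp(1j * 2 * np.pi * 60 * t)
        limbs = 0.5 * np.exp(1j * 2 * np.pi * (60 + 40 * np.sin(2 * np.pi * 1.0 * t)) * t)
        echo = torso + limbs

        # Time-frequency analysis: the limb modulation shows up as a periodic
        # pattern whose repetition rate reflects the gait cycle.
        f, tt, Sxx = spectrogram(echo, fs=fs, nperseg=256, noverlap=192,
                                 return_onesided=False)
        print(Sxx.shape)                      # (frequency bins, time frames)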

  18. Evolutionary Analysis of Heterochromatin Protein Compatibility by Interspecies Complementation in Saccharomyces

    PubMed Central

    Zill, Oliver A.; Scannell, Devin R.; Kuei, Jeffrey; Sadhu, Meru; Rine, Jasper

    2012-01-01

    The genetic bases for species-specific traits are widely sought, but reliable experimental methods with which to identify functionally divergent genes are lacking. In the Saccharomyces genus, interspecies complementation tests can be used to evaluate functional conservation and divergence of biological pathways or networks. Silent information regulator (SIR) proteins in S. bayanus provide an ideal test case for this approach because they show remarkable divergence in sequence and paralog number from those found in the closely related S. cerevisiae. We identified genes required for silencing in S. bayanus using a genetic screen for silencing-defective mutants. Complementation tests in interspecies hybrids identified an evolutionarily conserved Sir-protein-based silencing machinery, as defined by two interspecies complementation groups (SIR2 and SIR3). However, recessive mutations in S. bayanus SIR4 isolated from this screen could not be complemented by S. cerevisiae SIR4, revealing species-specific functional divergence in the Sir4 protein despite conservation of the overall function of the Sir2/3/4 complex. A cladistic complementation series localized the occurrence of functional changes in SIR4 to the S. cerevisiae and S. paradoxus branches of the Saccharomyces phylogeny. Most of this functional divergence mapped to sequence changes in the Sir4 PAD. Finally, a hemizygosity modifier screen in the interspecies hybrids identified additional genes involved in S. bayanus silencing. Thus, interspecies complementation tests can be used to identify (1) mutations in genetically underexplored organisms, (2) loci that have functionally diverged between species, and (3) evolutionary events of functional consequence within a genus. PMID:22923378

  19. Typing single polymorphic nucleotides in mitochondrial DNA as a way to access Middle Pleistocene DNA

    PubMed Central

    Valdiosera, Cristina; García, Nuria; Dalén, Love; Smith, Colin; Kahlke, Ralf-Dietrich; Lidén, Kerstin; Angerbjörn, Anders; Arsuaga, Juan Luis; Götherström, Anders

    2006-01-01

    In this study, we have used a technique designed to target short fragments containing informative mitochondrial substitutions to extend the temporal limits of DNA recovery and study the molecular phylogeny of Ursus deningeri. We present a cladistic analysis using DNA recovered from 400 kyr old U. deningeri remains, which demonstrates U. deningeri's relation to Ursus spelaeus. This study extends the limits of recovery from skeletal remains by almost 300 kyr. Plant material from permafrost environments has yielded DNA of this age in earlier studies, and our data suggest that DNA in teeth from cave environments may be equally well preserved. PMID:17148299

  20. Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.

    PubMed

    Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily

    2018-05-01

    Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.

  1. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; ongoing research aims to complement existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.

  2. Teaching the Scientific Method: It's All in the Perspective

    ERIC Educational Resources Information Center

    Ayers, James M.; Ayers, Kathleen M.

    2007-01-01

    A three unit module of inquiry, including morphological comparison, cladogram construction, and data mining has been developed to teach students the nature of experimental science. Students generate angiosperm morphological data, form cladistic hypotheses, then mine taxonomic, bioinformatic and historical data from many sources to replicate and…

  3. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  4. Data depth based clustering analysis

    DOE PAGES

    Jeong, Myeong -Hun; Cai, Yaping; Sullivan, Clair J.; ...

    2016-01-01

    Here, this paper proposes a new algorithm for identifying patterns within data, based on data depth. Such a clustering analysis has enormous potential to discover previously unknown insights from existing data sets. Many clustering algorithms already exist for this purpose. However, most algorithms are not affine invariant; therefore, they must operate with different parameters after the data sets are rotated, scaled, or translated. Further, most clustering algorithms, being based on Euclidean distance, can be sensitive to noise because they have no global perspective. Parameter selection also significantly affects the clustering results of each algorithm. Unlike many existing clustering algorithms, the proposed algorithm, called data depth based clustering analysis (DBCA), is able to detect coherent clusters after the data sets are affine transformed, without changing a parameter. It is also robust to noise because data depth can measure the centrality and outlyingness of the underlying data. Further, it can generate relatively stable clusters under varying parameter values. An experimental comparison with leading state-of-the-art alternatives demonstrates that the proposed algorithm outperforms DBSCAN and HDBSCAN in terms of affine invariance, and exceeds or matches their robustness to noise. The robustness to parameter selection is also demonstrated through a case study of clustering Twitter data.
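
    The paper's DBCA algorithm is not reproduced here, but the central idea, scoring points by a statistical depth so that centrality and outlyingness can be measured in an affine-invariant way, can be sketched briefly (Python, below). Mahalanobis depth is used purely as an assumed, simple depth function; the data and the 10% noise cut-off are illustrative.

        # Hedged sketch: depth-based centrality/outlyingness scoring (not the DBCA algorithm).
        # Mahalanobis depth, the data, and the cut-off are illustrative assumptions.
        import numpy as np

        def mahalanobis_depth(X):
            """Depth of each row of X: 1 / (1 + squared Mahalanobis distance to the mean).
            Large for central points, small for outliers, and affine invariant."""
            mu = X.mean(axis=0)
            cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
            diff = X - mu
            d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
            return 1.0 / (1.0 + d2)

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, size=(200, 2)),        # one coherent cluster
                       rng.uniform(-10, 10, size=(10, 2))])    # scattered noise points

        depth = mahalanobis_depth(X)
        core = X[depth > np.quantile(depth, 0.10)]             # drop the shallowest 10% as noise

        # Affine invariance: an invertible linear map leaves the depth ranking unchanged.
        A = np.array([[3.0, 1.0], [0.0, 0.5]])
        same_ranking = np.array_equal(np.argsort(depth),
                                      np.argsort(mahalanobis_depth(X @ A.T)))
        print(core.shape, same_ranking)                        # should print (..., 2) True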

  5. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  6. Web-Based Trainer for Electrical Circuit Analysis

    ERIC Educational Resources Information Center

    Weyten, L.; Rombouts, P.; De Maeyer, J.

    2009-01-01

    A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…

  7. A dictionary based informational genome analysis

    PubMed Central

    2012-01-01

    Background In the post-genomic era several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years includes several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces a local sequence analysis. A software prototype applying the methodology outlined here carried out computations on genomic data. We computed informational indexes and built genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. The validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of particular interest in biology (for example, to identify over-represented functional sequences such as promoters), was discussed, and a method to define synthetic genetic networks was suggested. Conclusions We introduced a methodology based on dictionaries and an efficient motif-finding software application for comparative genomics. This approach could be extended along many lines of investigation, for example to other contexts of computational genomics, as a basis for discriminating genomic pathologies. PMID:22985068
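
    The dictionary idea lends itself to a short sketch (Python, below). It assumes a k-mer dictionary as the word set and the Shannon entropy of the k-mer frequency distribution as one simple informational index; the toy sequence and k are illustrative, not the genomes or the indexes used in the study.

        # Hedged sketch: a genomic dictionary of k-mers and one simple informational index.
        # The sequence, k, and the entropy index are illustrative assumptions.
        from collections import Counter
        import math

        def kmer_dictionary(genome: str, k: int) -> Counter:
            """All words of length k occurring in the genome, with their counts."""
            return Counter(genome[i:i + k] for i in range(len(genome) - k + 1))

        def empirical_entropy(counts: Counter) -> float:
            """Shannon entropy (bits) of the k-mer frequency distribution."""
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        genome = "ACGTACGTGGACGTTT"            # toy sequence
        dictionary = kmer_dictionary(genome, k=3)
        print(len(dictionary), round(empirical_entropy(dictionary), 3))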

  8. SensA: web-based sensitivity analysis of SBML models.

    PubMed

    Floettmann, Max; Uhlendorf, Jannis; Scharp, Till; Klipp, Edda; Spiesser, Thomas W

    2014-10-01

    SensA is a web-based application for sensitivity analysis of mathematical models. The sensitivity analysis is based on metabolic control analysis, computing the local, global and time-dependent properties of model components. Interactive visualization facilitates interpretation of usually complex results. SensA can contribute to the analysis, adjustment and understanding of mathematical models for dynamic systems. SensA is available at http://gofid.biologie.hu-berlin.de/ and can be used with any modern browser. The source code can be found at https://bitbucket.org/floettma/sensa/ (MIT license) © The Author 2014. Published by Oxford University Press.

  9. A Cross-Grade Study Validating the Evolutionary Pathway of Student Mental Models in Electric Circuits

    ERIC Educational Resources Information Center

    Lin, Jing-Wen

    2017-01-01

    Cross-grade studies are valuable for the development of sequential curriculum. However such studies are time and resource intensive and fail to provide a clear representation to integrate different levels of representational complexity. Lin (Lin, 2006; Lin & Chiu, 2006; Lin, Chiu, & Hsu, 2006) proposed a cladistics approach in conceptual…

  10. Specialized Community-Based Care: An Evidence-Based Analysis

    PubMed Central

    2012-01-01

    Background Specialized community-based care (SCBC) refers to services that manage chronic illness through formalized links between primary and specialized care. Objectives The objectives of this evidence-based analysis (EBA) were as follows: to summarize the literature on SCBC, also known as intermediate care; to synthesize the evidence from previous Medical Advisory Secretariat (now Health Quality Ontario) EBAs on SCBC for heart failure, diabetes, chronic obstructive pulmonary disease (COPD), and chronic wounds; and to examine the role of SCBC in family practice. Results Part 1: Systematic Review of Intermediate Care. Seven systematic reviews on intermediate care since 2008 were identified. The literature base is complex and difficult to define. There is evidence to suggest that intermediate care is effective in improving outcomes; however, the effective interventions are still uncertain. Part 2: Synthesis of Evidence in Intermediate Care.
    Mortality: heart failure, significant reduction in patients receiving SCBC; COPD, nonsignificant reduction in patients receiving SCBC.
    Hospitalization: heart failure, nonsignificant reduction in patients receiving SCBC; COPD, significant reduction in patients receiving SCBC.
    Emergency department visits: heart failure, nonsignificant reduction in patients receiving SCBC; COPD, significant reduction in patients receiving SCBC.
    Disease-specific patient outcomes: COPD, nonsignificant improvement in lung function in patients receiving SCBC; diabetes, significant reduction in hemoglobin A1c (HbA1c) and systolic blood pressure in patients receiving SCBC; chronic wounds, significant increase in the proportion of healed wounds in patients receiving SCBC.
    Quality of life: heart failure, trend toward improvement in patients receiving SCBC; COPD, significant improvement in patients receiving SCBC.
    Part 3: Intermediate Care in Family Practice—Evidence-Based Analysis. Five randomized controlled trials were identified comparing SCBC

  11. Timescale analysis of rule-based biochemical reaction networks

    PubMed Central

    Klinke, David J.; Finley, Stacey D.

    2012-01-01

    The flow of information within a cell is governed by a series of protein-protein interactions that can be described as a reaction network. Mathematical models of biochemical reaction networks can be constructed by repetitively applying specific rules that define how reactants interact and what new species are formed upon reaction. To aid in understanding the underlying biochemistry, timescale analysis is one method developed to prune the size of the reaction network. In this work, we extend the methods associated with timescale analysis to reaction rules instead of the species contained within the network. To illustrate this approach, we applied timescale analysis to a simple receptor-ligand binding model and a rule-based model of Interleukin-12 (IL-12) signaling in naïve CD4+ T cells. The IL-12 signaling pathway includes multiple protein-protein interactions that collectively transmit information; however, the level of mechanistic detail sufficient to capture the observed dynamics has not been justified based upon the available data. The analysis correctly predicted that reactions associated with JAK2 and TYK2 binding to their corresponding receptor exist at a pseudo-equilibrium. In contrast, reactions associated with ligand binding and receptor turnover regulate cellular response to IL-12. An empirical Bayesian approach was used to estimate the uncertainty in the timescales. This approach complements existing rank- and flux-based methods that can be used to interrogate complex reaction networks. Ultimately, timescale analysis of rule-based models is a computational tool that can be used to reveal the biochemical steps that regulate signaling dynamics. PMID:21954150

  12. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
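
    The MPP search itself is a small constrained optimization, sketched below (Python). The limit-state function, the use of standard normal space, and the SLSQP optimizer are illustrative assumptions, not the paper's formulation; FORM then converts the minimum distance (the reliability index) into a failure-probability estimate.

        # Hedged sketch: most probable point (MPP) search and the FORM estimate.
        # The limit-state function g and all settings are illustrative assumptions.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):
            """Toy limit-state function in standard normal space; g(u) <= 0 is failure."""
            return 3.0 - u[0] - 0.5 * u[1]

        # MPP: the point on the limit-state surface g(u) = 0 closest to the origin.
        res = minimize(lambda u: np.dot(u, u), x0=np.zeros(2),
                       constraints={'type': 'eq', 'fun': g}, method='SLSQP')

        beta = np.linalg.norm(res.x)      # reliability index = minimum distance
        pf_form = norm.cdf(-beta)         # FORM approximation of the failure probability
        print(res.x, round(beta, 3), pf_form)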

  13. The evolution of tribospheny and the antiquity of mammalian clades.

    PubMed

    Woodburne, Michael O; Rich, Thomas H; Springer, Mark S

    2003-08-01

    The evolution of tribosphenic molars is a key innovation in the history of Mammalia. Tribospheny allows for both shearing and grinding occlusal functions. Marsupials and placentals are advanced tribosphenic mammals (i.e., Theria) that show additional modifications of the tribosphenic dentition including loss of the distal metacristid and development of double-rank postvallum/prevallid shear. The recent discovery of Eomaia [Nature 416 (2002) 816], regarded as the oldest eutherian mammal, implies that the marsupial-placental split is at least 125 million years old. The conventional scenario for the evolution of tribosphenic and therian mammals hypothesizes that each group evolved once, in the northern hemisphere, and is based on a predominantly Laurasian fossil record. With the recent discovery of the oldest tribosphenic mammal (Ambondro) from the Mesozoic of Gondwana, Flynn et al. [Nature 401 (1999) 57] suggested that tribospheny evolved in Gondwana rather than in Laurasia. Luo et al. [Nature 409 (2001) 53; Acta Palaeontol. Pol. 47 (2002) 1] argued for independent origins of tribospheny in northern (Boreosphenida) and southern (Australosphenida) hemisphere clades, with the latter including Ambondro, ausktribosphenids, and monotremes. Here, we present cladistic evidence for a single origin of tribosphenic molars. Further, Ambondro may be a stem eutherian, making the split between marsupials and placentals at least 167 m.y. old. To test this hypothesis, we used the relaxed molecular clock approach of Thorne/Kishino with amino acid data sets for BRCA1 [J. Mammal. Evol. 8 (2001) 239] and the IGF2 receptor [Mammal. Genome 12 (2001) 513]. Point estimates for the marsupial-placental split were 182-190 million years based on BRCA1 and 185-187 million years based on the IGF2 receptor. These estimates are fully compatible with the results of our cladistic analyses.

  14. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  15. Data-Flow Based Model Analysis

    NASA Technical Reports Server (NTRS)

    Saad, Christian; Bauer, Bernhard

    2010-01-01

    The concept of (meta) modeling combines an intuitive way of formalizing the structure of an application domain with a high expressiveness that makes it suitable for a wide variety of use cases and has therefore become an integral part of many areas in computer science. While the definition of modeling languages through the use of meta models, e.g. in the Unified Modeling Language (UML), is a well-understood process, their validation and the extraction of behavioral information are still a challenge. In this paper we present a novel approach for dynamic model analysis along with several fields of application. Examining the propagation of information along the edges and nodes of the model graph allows one to extend and simplify the definition of semantic constraints in comparison to the capabilities offered by, e.g., the Object Constraint Language. Performing a flow-based analysis also enables the simulation of dynamic behavior, thus providing an "abstract interpretation"-like analysis method for the modeling domain.

  16. A reassessment of the referral of an isolated skull from the Late Cretaceous of Uzbekistan to the stem-testudinoid turtle Lindholmemys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danilov, Igor G.; Parham, James F.

    2005-12-01

    A fossil turtle skull (ZISP PH 1/17) from the Late Cretaceous (upper Turonian, Bissekty Formation) of Dzharakuduk (Uzbekistan, Asia) was used to score the skull characters for the genus Lindholmemys (a stem testudinoid) in a recent phylogenetic analysis. A description of ZISP PH 1/17 and a new cladistic analysis reveals no characters to support its referral to Lindholmemys elegans or to the stem-testudinoid lineage. ZISP PH 1/17 is very similar to North American Adocus, differing mainly in characters of the upper jaw. Therefore, we hypothesize that ZISP PH 1/17 is the skull of either Shachemys ancestralis or "Adocus" aksary, adocid shell taxa from Dzharakuduk. Pending additional discoveries and description of turtles from Dzharakuduk, we refer ZISP PH 1/17 to Adocidae, gen. et sp. indet.

  17. Preprocessing and Analysis of LC-MS-Based Proteomic Data

    PubMed Central

    Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W.

    2016-01-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS based proteomics. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from LC-MS based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed. PMID:26519169

  18. Spectral Entropy Based Neuronal Network Synchronization Analysis Based on Microelectrode Array Measurements

    PubMed Central

    Kapucu, Fikret E.; Välkki, Inkeri; Mikkonen, Jarno E.; Leone, Chiara; Lenk, Kerstin; Tanskanen, Jarno M. A.; Hyttinen, Jari A. K.

    2016-01-01

    Synchrony and asynchrony are essential aspects of the functioning of interconnected neuronal cells and networks. New information on neuronal synchronization can be expected to aid in understanding these systems. Synchronization provides insight in the functional connectivity and the spatial distribution of the information processing in the networks. Synchronization is generally studied with time domain analysis of neuronal events, or using direct frequency spectrum analysis, e.g., in specific frequency bands. However, these methods have their pitfalls. Thus, we have previously proposed a method to analyze temporal changes in the complexity of the frequency of signals originating from different network regions. The method is based on the correlation of time varying spectral entropies (SEs). SE assesses the regularity, or complexity, of a time series by quantifying the uniformity of the frequency spectrum distribution. It has been previously employed, e.g., in electroencephalogram analysis. Here, we revisit our correlated spectral entropy method (CorSE), providing evidence of its justification, usability, and benefits. Here, CorSE is assessed with simulations and in vitro microelectrode array (MEA) data. CorSE is first demonstrated with a specifically tailored toy simulation to illustrate how it can identify synchronized populations. To provide a form of validation, the method was tested with simulated data from integrate-and-fire model based computational neuronal networks. To demonstrate the analysis of real data, CorSE was applied on in vitro MEA data measured from rat cortical cell cultures, and the results were compared with three known event based synchronization measures. Finally, we show the usability by tracking the development of networks in dissociated mouse cortical cell cultures. The results show that temporal correlations in frequency spectrum distributions reflect the network relations of neuronal populations. In the simulated data, CorSE unraveled the
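
    The two core quantities, a time-varying spectral entropy per channel and the correlation of those entropy time series across channels, can be sketched briefly (Python, below). The toy signals, window settings, and Welch-based entropy estimate are illustrative assumptions, not the authors' MEA pipeline.

        # Hedged sketch: correlated spectral entropy (CorSE-style) between two toy channels.
        # Signals, window lengths, and the Welch PSD estimate are illustrative assumptions.
        import numpy as np
        from scipy.signal import welch

        def spectral_entropy(x, fs):
            """Normalized Shannon entropy of the power spectral density of x."""
            _, psd = welch(x, fs=fs, nperseg=min(256, len(x)))
            p = psd / psd.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p)) / np.log2(len(p))

        def windowed_se(x, fs, win, step):
            """Spectral entropy in sliding windows -> a time series of SE values."""
            return np.array([spectral_entropy(x[i:i + win], fs)
                             for i in range(0, len(x) - win + 1, step)])

        fs = 1000.0
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(1)
        drive = np.sin(2 * np.pi * 0.2 * t)          # shared slow change in complexity
        ch1 = (1 + drive) * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
        ch2 = (1 + drive) * np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)

        se1 = windowed_se(ch1, fs, win=1000, step=500)
        se2 = windowed_se(ch2, fs, win=1000, step=500)
        corse = np.corrcoef(se1, se2)[0, 1]          # correlation of the SE time series
        print(round(corse, 2))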

  19. Thermodynamics-based Metabolite Sensitivity Analysis in metabolic networks.

    PubMed

    Kiparissides, A; Hatzimanikatis, V

    2017-01-01

    The increasing availability of large metabolomics datasets enhances the need for computational methodologies that can organize the data in a way that can lead to the inference of meaningful relationships. Knowledge of the metabolic state of a cell and how it responds to various stimuli and extracellular conditions can offer significant insight in the regulatory functions and how to manipulate them. Constraint based methods, such as Flux Balance Analysis (FBA) and Thermodynamics-based flux analysis (TFA), are commonly used to estimate the flow of metabolites through genome-wide metabolic networks, making it possible to identify the ranges of flux values that are consistent with the studied physiological and thermodynamic conditions. However, unless key intracellular fluxes and metabolite concentrations are known, constraint-based models lead to underdetermined problem formulations. This lack of information propagates as uncertainty in the estimation of fluxes and basic reaction properties such as the determination of reaction directionalities. Therefore, knowledge of which metabolites, if measured, would contribute the most to reducing this uncertainty can significantly improve our ability to define the internal state of the cell. In the present work we combine constraint based modeling, Design of Experiments (DoE) and Global Sensitivity Analysis (GSA) into the Thermodynamics-based Metabolite Sensitivity Analysis (TMSA) method. TMSA ranks metabolites comprising a metabolic network based on their ability to constrain the gamut of possible solutions to a limited, thermodynamically consistent set of internal states. TMSA is modular and can be applied to a single reaction, a metabolic pathway or an entire metabolic network. This is, to our knowledge, the first attempt to use metabolic modeling in order to provide a significance ranking of metabolites to guide experimental measurements. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier

  20. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the

  1. Point-based and model-based geolocation analysis of airborne laser scanning data

    NASA Astrophysics Data System (ADS)

    Sefercik, Umut Gunes; Buyuksalih, Gurcan; Jacobsen, Karsten; Alkan, Mehmet

    2017-01-01

    Airborne laser scanning (ALS) is one of the most effective remote sensing technologies providing precise three-dimensional (3-D) dense point clouds. A large-size ALS digital surface model (DSM) covering the whole Istanbul province was analyzed by point-based and model-based comprehensive statistical approaches. Point-based analysis was performed using checkpoints on flat areas. Model-based approaches were implemented in two steps: strip-to-strip comparison of overlapping ALS DSMs individually in three subareas, and comparison of the merged ALS DSMs with terrestrial laser scanning (TLS) DSMs in four other subareas. In the model-based approach, the standard deviation of height and the normalized median absolute deviation were used as the accuracy indicators, combined with the dependency on terrain inclination. The results demonstrate that terrain roughness has a strong impact on the vertical accuracy of ALS DSMs. The relative horizontal shifts, determined and partially improved by merging the overlapping strips and by comparing the ALS and TLS data, were found not to be negligible. The analysis of the ALS DSM in relation to the TLS DSM allowed us to determine the characteristics of the DSM in detail.
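
    The two accuracy indicators named above can be sketched briefly (Python, below): the standard deviation of height differences and the normalized median absolute deviation (NMAD), the latter being robust to outliers. The array of height differences is simulated for illustration.

        # Hedged sketch: std. dev. versus NMAD on simulated DSM height differences.
        # The differences and outliers are illustrative assumptions.
        import numpy as np

        def nmad(dz):
            """Normalized median absolute deviation: 1.4826 * median(|dz - median(dz)|)."""
            return 1.4826 * np.median(np.abs(dz - np.median(dz)))

        rng = np.random.default_rng(2)
        dz = np.concatenate([rng.normal(0.0, 0.15, 1000),     # well-behaved terrain (m)
                             np.array([3.0, -2.5, 4.1])])     # a few gross outliers

        print("std :", round(dz.std(), 3))    # inflated by the outliers
        print("NMAD:", round(nmad(dz), 3))    # robust estimate, close to 0.15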

  2. Analysis and design of algorithm-based fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. Sukumaran

    1990-01-01

    An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems are formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach is developed for the analysis of large systems.
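
    The checksum idea that underlies ABFT can be sketched briefly (Python, below); this is a textbook illustration, not the matrix-based design and analysis model developed in the work. Row and column checksums appended to the operands of a matrix multiplication allow a single corrupted entry of the product to be detected and located.

        # Hedged sketch: checksum-augmented matrix multiplication (classic ABFT illustration).
        # Matrix sizes, values, and the injected fault are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(3)
        A = rng.integers(0, 10, size=(3, 3)).astype(float)
        B = rng.integers(0, 10, size=(3, 3)).astype(float)

        # Column-checksum row appended to A, row-checksum column appended to B.
        A_c = np.vstack([A, A.sum(axis=0)])
        B_r = np.hstack([B, B.sum(axis=1, keepdims=True)])
        C_full = A_c @ B_r                       # product carries both checksums

        # Inject a transient fault into one entry of the computed product.
        C_faulty = C_full.copy()
        C_faulty[1, 2] += 5.0

        # Consistency check: recompute checksums of the data part and compare.
        bad_col = np.flatnonzero(~np.isclose(C_faulty[:3, :3].sum(axis=0), C_faulty[3, :3]))
        bad_row = np.flatnonzero(~np.isclose(C_faulty[:3, :3].sum(axis=1), C_faulty[:3, 3]))
        print("fault located at row", bad_row, "column", bad_col)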

  3. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  4. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115

  5. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
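
    The natural visibility graph mapping used in both records above can be sketched briefly (Python, below). It uses the standard visibility criterion (an intermediate sample must lie strictly below the line joining the two endpoints); the series is a toy random walk, and the degree sequence is shown only as one of the network quantities such analyses examine.

        # Hedged sketch: natural visibility graph of a toy time series.
        # The series and the use of the degree sequence are illustrative assumptions.
        import numpy as np

        def visibility_graph(y):
            """Adjacency matrix of the natural visibility graph: nodes a and b are linked
            if every intermediate sample lies strictly below the line joining them."""
            n = len(y)
            adj = np.zeros((n, n), dtype=bool)
            for a in range(n):
                for b in range(a + 1, n):
                    visible = all(y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                                  for c in range(a + 1, b))
                    adj[a, b] = adj[b, a] = visible
            return adj

        rng = np.random.default_rng(4)
        series = rng.standard_normal(50).cumsum()     # toy "price-like" series
        degrees = visibility_graph(series).sum(axis=1)
        print(degrees[:10])                           # degree sequence of the first nodes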

  6. Soft-part anatomy of the Early Cambrian bivalved arthropods Kunyangella and Kunmingella: significance for the phylogenetic relationships of Bradoriida

    PubMed Central

    Hou, Xianguang; Williams, Mark; Siveter, David J.; Siveter, Derek J.; Aldridge, Richard J.; Sansom, Robert S.

    2010-01-01

    Bradoriids are small bivalved marine arthropods that are widespread in rocks of Cambrian to Early Ordovician age. They comprise seven families and about 70 genera based on shield (‘carapace’) morphology. New bradoriid specimens with preserved soft-part anatomy of Kunmingella douvillei (Kunmingellidae) are reported from the Early Cambrian Chengjiang Lagerstätte of China together with, for the first time to our knowledge, a second bradoriid species with preserved soft parts, Kunyangella cheni (Comptalutidae). Kunmingella douvillei has a 10-segmented limb-bearing body with uniramous ninth and tenth appendages and a series of homogeneous, apparently (proximal parts not preserved) unspecialized post-antennal biramous limbs with setose leaf-shaped exopods. Each endopod consists of five podomeres. A presumed penultimate instar of Ky. cheni preserves remnants of three head and two trunk appendages, and the adult is reconstructed as having four head appendages. This material allows testing of the affinity of the Bradoriida. Kunmingella is identified as a stem crustacean in character-based analyses, through both morphological comparisons and cladistic reconstructions. Global parsimony analysis recovers a monophyletic Bradoriida as the sister group to crown crustaceans. PMID:20181565

  7. [Big data analysis and evidence-based medicine: controversy or cooperation].

    PubMed

    Chen, Xinzu; Hu, Jiankun

    2016-01-01

    The development of evidence-based medicine marks an important milestone in the transition from empirical medicine to evidence-driven modern medicine. With the explosion of biomedical data, the rise of big data analysis can efficiently address exploratory questions and decision-making issues in biomedicine and healthcare. The current problem in China is that big data analysis is still not well conducted or applied to problems such as clinical decision-making and public health policy; the debate should not be about whether big data analysis can replace evidence-based medicine. Therefore, we should clearly understand that, for both evidence-based medicine and big data analysis, the most critical infrastructure is the substantial work of designing, constructing, and collecting original databases in China.

  8. Ancestor–descendant relationships in evolution: origin of the extant pygmy right whale, Caperea marginata

    PubMed Central

    Tsai, Cheng-Hsiu; Fordyce, R. Ewan

    2015-01-01

    Ancestor–descendant relationships (ADRs), involving descent with modification, are the fundamental concept in evolution, but are usually difficult to recognize. We examined the cladistic relationship between the only reported fossil pygmy right whale, †Miocaperea pulchra, and its sole living relative, the enigmatic pygmy right whale Caperea marginata, the latter represented by both adult and juvenile specimens. †Miocaperea is phylogenetically bracketed between juvenile and adult Caperea marginata in morphologically based analyses, thus suggesting a possible ADR—the first so far identified within baleen whales (Cetacea: Mysticeti). The †Miocaperea–Caperea lineage may show long-term morphological stasis and, in turn, punctuated equilibrium. PMID:25589485

  9. Analysis of space shuttle main engine data using Beacon-based exception analysis for multi-missions

    NASA Technical Reports Server (NTRS)

    Park, H.; Mackey, R.; James, M.; Zak, M.; Kynard, M.; Sebghati, J.; Greene, W.

    2002-01-01

    This paper describes analysis of the Space Shuttle Main Engine (SSME) sensor data using Beacon-based exception analysis for multimissions (BEAM), a new technology developed for sensor analysis and diagnostics in autonomous space systems by the Jet Propulsion Laboratory (JPL).

  10. Using the DOE Knowledge Base for Special Event Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified

  11. Robust Mediation Analysis Based on Median Regression

    PubMed Central

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
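
    The core estimation step can be sketched briefly (Python, below): the a and b paths of a single-mediator model are fit with median (quantile, τ = 0.5) regression via statsmodels instead of ordinary least squares. The simulated data, heavy-tailed errors, and the absence of a bootstrap interval for the indirect effect are simplifications, not the authors' full procedure.

        # Hedged sketch: mediation a and b paths estimated by median regression.
        # The simulated data and omission of bootstrap inference are illustrative assumptions.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 500
        x = rng.standard_normal(n)                         # predictor / treatment
        m = 0.5 * x + rng.standard_t(df=3, size=n)         # mediator with heavy-tailed errors
        y = 0.7 * m + 0.2 * x + rng.standard_t(df=3, size=n)

        # Path a: median regression of the mediator on x.
        a_fit = sm.QuantReg(m, sm.add_constant(x)).fit(q=0.5)
        # Path b (and direct effect): median regression of y on the mediator and x.
        b_fit = sm.QuantReg(y, sm.add_constant(np.column_stack([m, x]))).fit(q=0.5)

        a, b = a_fit.params[1], b_fit.params[1]
        print("indirect (mediated) effect a*b ≈", round(a * b, 3))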

  12. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA) were reviewed.

  13. Teaching the process of molecular phylogeny and systematics: a multi-part inquiry-based exercise.

    PubMed

    Lents, Nathan H; Cifuentes, Oscar E; Carpi, Anthony

    2010-01-01

    Three approaches to molecular phylogenetics are demonstrated to biology students as they explore molecular data from Homo sapiens and four related primates. By analyzing DNA sequences, protein sequences, and chromosomal maps, students are repeatedly challenged to develop hypotheses regarding the ancestry of the five species. Although these exercises were designed to supplement and enhance classroom instruction on phylogeny, cladistics, and systematics in the context of a postsecondary majors-level introductory biology course, the activities themselves require very little prior student exposure to these topics. Thus, they are well suited for students in a wide range of educational levels, including a biology class at the secondary level. In implementing this exercise, we have observed measurable gains, both in student comprehension of molecular phylogeny and in their acceptance of modern evolutionary theory. By engaging students in modern phylogenetic activities, these students better understood how biologists are currently using molecular data to develop a more complete picture of the shared ancestry of all living things.
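
    One of the three data types used in the exercise, DNA sequence comparison, can be sketched briefly (Python, below): pairwise Hamming distances between aligned sequences are clustered with UPGMA (average linkage) to produce a simple tree. The toy sequences are illustrative assumptions, not the primate data from the activity, and UPGMA is only one of several tree-building options.

        # Hedged sketch: distance-based tree from toy aligned sequences (UPGMA).
        # The sequences and the choice of UPGMA are illustrative assumptions.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, dendrogram

        seqs = {
            "human":   "ACGTACGTACGT",
            "chimp":   "ACGTACGAACGT",
            "gorilla": "ACGTTCGAACGT",
            "orang":   "ACGTTCGAACTT",
            "gibbon":  "TCGTTCGAACTA",
        }
        names = list(seqs)

        def hamming(a, b):
            """Proportion of differing sites between two equal-length sequences."""
            return sum(x != y for x, y in zip(a, b)) / len(a)

        # Condensed pairwise distance vector in the order scipy expects.
        dists = [hamming(seqs[names[i]], seqs[names[j]])
                 for i in range(len(names)) for j in range(i + 1, len(names))]

        tree = linkage(np.array(dists), method="average")             # UPGMA = average linkage
        print(dendrogram(tree, labels=names, no_plot=True)["ivl"])    # leaf order of the tree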

  14. Graph-based urban scene analysis using symbolic data

    NASA Astrophysics Data System (ADS)

    Moissinac, Henri; Maitre, Henri; Bloch, Isabelle

    1995-07-01

    A framework is presented for the interpretation of an urban landscape based on the analysis of aerial pictures. This method has been designed to use a priori knowledge provided by a geographic map in order to improve the image analysis stage. A coherent final interpretation of the studied area is proposed. It relies on a graph-based data structure to model the urban landscape, and on global uncertainty management to evaluate the final confidence we can have in the results presented. This structure and the uncertainty management reflect the hierarchy of the available data and the interpretation levels.

  15. ANALYSIS/PLOT: a graphics package for use with the SORT/ANALYSIS data bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sady, C.A.

    1983-08-01

    This report describes a graphics package that is used with the SORT/ANALYSIS data bases. The data listed by the SORT/ANALYSIS program can be presented in pie, bar, line, or Gantt chart form. Instructions for the use of the plotting program and descriptions of the subroutines are given in the report.

  16. Identification and human condition analysis based on the human voice analysis

    NASA Astrophysics Data System (ADS)

    Mieshkov, Oleksandr Yu.; Novikov, Oleksandr O.; Novikov, Vsevolod O.; Fainzilberg, Leonid S.; Kotyra, Andrzej; Smailova, Saule; Kozbekova, Ainur; Imanbek, Baglan

    2017-08-01

    The paper presents a two-stage biotechnical system for human condition analysis based on the analysis of the human voice signal. As a preliminary step, the voice signal is pre-processed and its time-domain characteristics are determined. In the first stage, the developed system identifies the person in the database on the basis of the extracted characteristics. In the second stage, a model of the human voice is built from the real voice signals after clustering the whole database.

  17. Success Probability Analysis for Shuttle Based Microgravity Experiments

    NASA Technical Reports Server (NTRS)

    Liou, Ying-Hsin Andrew

    1996-01-01

    Presented in this report are the results of data analysis of shuttle-based microgravity flight experiments. Potential factors were identified in the previous grant period, and in this period 26 factors were selected for data analysis. In this project, the degree of success was developed and used as the performance measure. 293 of the 391 experiments in the Lewis Research Center Microgravity Database were assigned degrees of success. Frequency analysis and analysis of variance were conducted to determine the significance of the factors that affect experiment success.
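
    As a hedged illustration of the second test mentioned, the snippet below runs a one-way analysis of variance on a simulated "degree of success" measure across two levels of a hypothetical factor; the factor and the numbers are invented, not taken from the Lewis Research Center database.

```python
# Sketch of a one-way ANOVA on a simulated degree-of-success measure
# for two levels of a hypothetical experiment factor.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(6)
success_level_1 = rng.normal(0.8, 0.1, 40)   # degree of success, factor level 1
success_level_2 = rng.normal(0.7, 0.1, 35)   # degree of success, factor level 2

f_stat, p_value = f_oneway(success_level_1, success_level_2)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```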

  18. [Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].

    PubMed

    Zhou, Jinzhi; Tang, Xiaofang

    2015-08-01

    In order to improve classification accuracy with a small amount of motor imagery training data for the development of brain-computer interface (BCI) systems, we proposed an analysis method that automatically selects the characteristic parameters based on correlation coefficient analysis. Using the five sample data sets of dataset IVa from the 2005 BCI Competition, we applied short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the raw electroencephalogram, then performed feature extraction based on common spatial patterns (CSP) and classification by linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy was higher with the correlation coefficient feature selection method than without it. Compared with a support vector machine (SVM) feature optimization algorithm, correlation coefficient analysis leads to better parameter selection and improves classification accuracy.
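
    A toy sketch of the selection idea (not the authors' exact pipeline): rank EEG channels by the absolute correlation between a simple STFT band-power feature and the class label, keep the best channels, and classify with LDA. The data, band, and channel counts below are simulated assumptions.

```python
# Toy sketch: correlation-coefficient channel selection followed by LDA,
# on simulated EEG trials (not the BCI Competition data).
import numpy as np
from scipy.signal import stft
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_trials, n_channels, n_samples, fs = 200, 16, 256, 128
y = rng.integers(0, 2, n_trials)
eeg = rng.normal(size=(n_trials, n_channels, n_samples))
# Inject a 10 Hz rhythm on four channels for class-1 trials only.
eeg[y == 1, :4] += np.sin(2 * np.pi * 10 * np.arange(n_samples) / fs)

# Band power per trial and channel from the STFT magnitude (8-13 Hz band).
f, _, Z = stft(eeg, fs=fs, nperseg=64)
band = (f >= 8) & (f <= 13)
power = (np.abs(Z[:, :, band, :]) ** 2).mean(axis=(2, 3))   # (trials, channels)

# Correlation-based channel selection, then LDA on the retained channels.
corr = np.array([abs(np.corrcoef(power[:, c], y)[0, 1]) for c in range(n_channels)])
keep = np.argsort(corr)[-6:]
lda = LinearDiscriminantAnalysis().fit(power[:100][:, keep], y[:100])
print("held-out accuracy:", round(lda.score(power[100:][:, keep], y[100:]), 2))
```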

  19. A Risk-Analysis Approach to Implementing Web-Based Assessment

    ERIC Educational Resources Information Center

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  20. Watershed-based Morphometric Analysis: A Review

    NASA Astrophysics Data System (ADS)

    Sukristiyanti, S.; Maria, R.; Lestiana, H.

    2018-02-01

    Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning, and morphometric analysis of a watershed is the best method to identify the relationships among the various aspects of an area. Although many technical papers have dealt with this area of study, there is no standard classification or agreed implication for each parameter, which makes the value of any individual morphometric parameter difficult to evaluate. This paper deals with the meaning of the values of the various morphometric parameters, with adequate contextual information. A critical review is presented of each classification, the range of values, and their implications. Besides classification and its impact, the authors are also concerned with the quality of the input data, both in data preparation and in the scale or level of detail of the mapping. We hope this review can give a comprehensive explanation to assist upcoming research dealing with morphometric analysis.
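
    As a small worked example of the kind of parameters the review surveys, the snippet below computes drainage density and a mean bifurcation ratio from hypothetical watershed measurements (all numbers invented).

```python
# Worked example of two common morphometric parameters from hypothetical data.
basin_area_km2 = 120.0
total_stream_length_km = 310.0
streams_per_order = {1: 42, 2: 11, 3: 3, 4: 1}        # Strahler stream orders

drainage_density = total_stream_length_km / basin_area_km2   # km per km^2

# Bifurcation ratio: streams of one order divided by streams of the next
# higher order, averaged over the available orders.
ratios = [streams_per_order[u] / streams_per_order[u + 1]
          for u in sorted(streams_per_order)[:-1]]

print(f"drainage density ≈ {drainage_density:.2f} km/km²")
print(f"mean bifurcation ratio ≈ {sum(ratios) / len(ratios):.2f}")
```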

  1. Heating Analysis in Constant-pressure Hydraulic System based on Energy Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Chao; Xu, Cong; Mao, Xuyao; Li, Bin; Hu, Junhua; Liu, Yiou

    2017-12-01

    Hydraulic systems are widely used in industrial applications, but heating has become an important problem restricting the promotion of hydraulic technology. High temperature seriously affects the operation of the hydraulic system and can even cause sticking and other serious failures. Based on an analysis of the heat damage in hydraulic systems, this paper gives the reasons for this problem, and an application shows that energy analysis can accurately locate the main causes of heating in the hydraulic system, which provides strong practical guidance.
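
    A back-of-the-envelope version of the energy argument: hydraulic power throttled away across a valve appears as heat, which then drives the oil temperature up. The pressure drop, flow rate, and oil properties below are illustrative assumptions, not values from the paper.

```python
# Back-of-the-envelope energy analysis: throttling loss becomes heat,
# Q_heat = delta_p * q. All numbers are illustrative.
delta_p = 10e6           # pressure drop across a relief valve (Pa)
q = 40 / 60000           # flow routed over the valve: 40 L/min in m^3/s
heat_w = delta_p * q     # heat generation rate (W)

oil_mass, c_oil = 60.0, 1800.0   # kg of oil in the tank, J/(kg*K)
print(f"heat input: {heat_w / 1000:.1f} kW")
print(f"adiabatic temperature rise: {heat_w / (oil_mass * c_oil) * 3600:.0f} K per hour")
```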

  2. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
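
    A minimal sketch of the optimization framing described above, reduced to a single decision variable (crest height) and an overtopping-only failure mode; the flood-level distribution, damage value, and cost rate are invented placeholders rather than the study's fragility curves or data.

```python
# Minimal risk-based levee sizing sketch: minimize annualized construction
# cost plus expected annual overtopping damage. All numbers are illustrative.
from scipy.optimize import minimize_scalar
from scipy.stats import gumbel_r

flood = gumbel_r(loc=3.0, scale=0.8)        # annual maximum water level (m)
damage_if_overtopped = 5e6                  # expected damage given failure ($)
cost_per_metre = 2e5                        # annualized construction cost ($/m)

def expected_annual_cost(height):
    p_overtop = flood.sf(height)            # P(annual max level exceeds crest)
    return cost_per_metre * height + p_overtop * damage_if_overtopped

res = minimize_scalar(expected_annual_cost, bounds=(2.0, 8.0), method="bounded")
print(f"optimal crest height ≈ {res.x:.2f} m")
```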

  3. The Oldest Jurassic Dinosaur: A Basal Neotheropod from the Hettangian of Great Britain.

    PubMed

    Martill, David M; Vidovic, Steven U; Howells, Cindy; Nudds, John R

    2016-01-01

    Approximately 40% of a skeleton including cranial and postcranial remains representing a new genus and species of basal neotheropod dinosaur is described. It was collected from fallen blocks from a sea cliff that exposes Late Triassic and Early Jurassic marine and quasi marine strata on the south Wales coast near the city of Cardiff. Matrix comparisons indicate that the specimen is from the lithological Jurassic part of the sequence, below the first occurrence of the index ammonite Psiloceras planorbis and above the last occurrence of the Rhaetian conodont Chirodella verecunda. Associated fauna of echinoderms and bivalves indicate that the specimen had drifted out to sea, presumably from the nearby Welsh Massif and associated islands (St David's Archipelago). Its occurrence close to the base of the Blue Lias Formation (Lower Jurassic, Hettangian) makes it the oldest known Jurassic dinosaur and it represents the first dinosaur skeleton from the Jurassic of Wales. A cladistic analysis indicates basal neotheropodan affinities, but the specimen retains plesiomorphic characters which it shares with Tawa and Daemonosaurus.

  4. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Traditional Mold Analysis Compared to a DNA-based Method of Mold Analysis with Applications in Asthmatics' Homes

    EPA Science Inventory

    Traditional environmental mold analysis is based-on microscopic observations and counting of mold structures collected from the air on a sticky surface or culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...

  6. GaitaBase: Web-based repository system for gait analysis.

    PubMed

    Tirosh, Oren; Baker, Richard; McGinley, Jenny

    2010-02-01

    The need to share gait analysis data to improve clinical decision support has been recognised since the early 1990s. GaitaBase has been established to provide a web-accessible repository system of gait analysis data to improve the sharing of data across local and international clinical and research community. It is used by several clinical and research groups across the world providing cross-group access permissions to retrieve and analyse the data. The system is useful for bench-marking and quality assurance, clinical consultation, and collaborative research. It has the capacity to increase the population sample size and improve the quality of 'normative' gait data. In addition the accumulated stored data may facilitate clinicians in comparing their own gait data with others, and give a valuable insight into how effective specific interventions have been for others. 2009 Elsevier Ltd. All rights reserved.

  7. Impact of model-based risk analysis for liver surgery planning.

    PubMed

    Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K

    2014-05-01

    A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To prove whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins’ width when the risk analysis was available. In addition, time to complete the planning task and confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.

  8. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm, which is also described in this paper. PMID:19796403

  9. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    NASA Astrophysics Data System (ADS)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  10. A new metaphor for projection-based visual analysis and data exploration

    NASA Astrophysics Data System (ADS)

    Schreck, Tobias; Panse, Christian

    2007-01-01

    In many important application domains such as Business and Finance, Process Monitoring, and Security, huge and quickly increasing volumes of complex data are collected. Strong efforts are underway developing automatic and interactive analysis tools for mining useful information from these data repositories. Many data analysis algorithms require an appropriate definition of similarity (or distance) between data instances to allow meaningful clustering, classification, and retrieval, among other analysis tasks. Projection-based data visualization is highly interesting (a) for visual discrimination analysis of a data set within a given similarity definition, and (b) for comparative analysis of similarity characteristics of a given data set represented by different similarity definitions. We introduce an intuitive and effective novel approach for projection-based similarity visualization for interactive discrimination analysis, data exploration, and visual evaluation of metric space effectiveness. The approach is based on the convex hull metaphor for visually aggregating sets of points in projected space, and it can be used with a variety of different projection techniques. The effectiveness of the approach is demonstrated by application on two well-known data sets. Statistical evidence supporting the validity of the hull metaphor is presented. We advocate the hull-based approach over the standard symbol-based approach to projection visualization, as it allows a more effective perception of similarity relationships and class distribution characteristics.
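
    A hedged sketch of the hull metaphor: project a labeled data set to two dimensions (PCA is used here as a stand-in for whichever projection technique is chosen) and draw each class as a filled convex hull rather than as individual point symbols. Assumes scikit-learn, SciPy, and matplotlib are available.

```python
# Sketch of hull-based projection visualization: one filled convex hull per
# class in a 2-D PCA projection of the Iris data.
import matplotlib.pyplot as plt
from scipy.spatial import ConvexHull
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)
P = PCA(n_components=2).fit_transform(X)

for label in set(y):
    pts = P[y == label]
    hull = ConvexHull(pts)
    poly = pts[hull.vertices]                     # hull vertices in order
    plt.fill(poly[:, 0], poly[:, 1], alpha=0.3, label=f"class {label}")
plt.legend()
plt.show()
```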

  11. Model-based gene set analysis for Bioconductor.

    PubMed

    Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien

    2011-07-01

    Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA), which substantially reduces the number of redundant categories returned by the gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic license 2.0. peter.robinson@charite.de; julien.gagneur@embl.de.

  12. Setting Standards for Medically-Based Running Analysis

    PubMed Central

    Vincent, Heather K.; Herman, Daniel C.; Lear-Barnes, Leslie; Barnes, Robert; Chen, Cong; Greenberg, Scott; Vincent, Kevin R.

    2015-01-01

    Setting standards for medically based running analyses is necessary to ensure that runners receive a high-quality service from practitioners. Medical and training history, physical and functional tests, and motion analysis of running at self-selected and faster speeds are key features of a comprehensive analysis. Self-reported history and movement symmetry are critical factors that require follow-up therapy or long-term management. Pain or injury is typically the result of a functional deficit above or below the site along the kinematic chain. PMID:25014394

  13. Feasibility and demonstration of a cloud-based RIID analysis system

    NASA Astrophysics Data System (ADS)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.

  14. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  15. CASAS: Cancer Survival Analysis Suite, a web based application.

    PubMed

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis:  quantile, landmark and competing risks, in addition to standard survival analysis.  The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots.  Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.
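
    CASAS itself is an R/Shiny application; as a hedged, language-neutral illustration of the standard analysis it wraps, the sketch below fits Kaplan-Meier curves for two simulated groups and reports a log-rank p-value using the Python lifelines package (an assumption of this example, not a CASAS dependency).

```python
# Illustration only: Kaplan-Meier curves and a log-rank test on simulated
# survival data, using the lifelines package.
import numpy as np
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
t_a, t_b = rng.exponential(12, 100), rng.exponential(8, 100)   # survival times
e_a, e_b = rng.random(100) < 0.8, rng.random(100) < 0.8        # event indicators

kmf = KaplanMeierFitter()
ax = kmf.fit(t_a, e_a, label="group A").plot_survival_function()
kmf.fit(t_b, e_b, label="group B").plot_survival_function(ax=ax)

print("log-rank p =", logrank_test(t_a, t_b, e_a, e_b).p_value)
plt.show()
```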

  16. A robust approach for ECG-based analysis of cardiopulmonary coupling.

    PubMed

    Zheng, Jiewen; Wang, Weidong; Zhang, Zhengbo; Wu, Dalei; Wu, Hao; Peng, Chung-Kang

    2016-07-01

    Deriving a respiratory signal from a surface electrocardiogram (ECG) measurement has the advantage of simultaneously monitoring cardiac and respiratory activities. ECG-based cardiopulmonary coupling (CPC) analysis, estimated from heart period variability and ECG-derived respiration (EDR), shows promising applications in the medical field. The aim of this paper is to provide a quantitative analysis of ECG-based CPC and to further improve its performance. Two conventional strategies were tested to obtain the EDR signal: R-S wave amplitude and the area of the QRS complex. An adaptive filter was utilized to extract the common component of the inter-beat interval (RRI) series and the EDR, generating enhanced versions of the EDR signal. CPC is assessed by probing the nonlinear phase interactions between the RRI series and the respiratory signal. Respiratory oscillations present in both the RRI series and the respiratory signals were extracted by ensemble empirical mode decomposition for coupling analysis via a phase synchronization index. The results demonstrated that CPC estimated from conventional EDR series exhibits constant and proportional biases, while that estimated from enhanced EDR series is more reliable. Adaptive filtering can significantly improve the accuracy of ECG-based CPC estimation and achieve robust CPC analysis. The improved ECG-based CPC estimation may provide additional prognostic information for both sleep medicine and autonomic function analysis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
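
    The coupling measure at the end of that pipeline can be sketched compactly: extract the instantaneous phases of a respiratory oscillation and an RRI oscillation with the Hilbert transform and compute a phase synchronization index. The simulated sine-plus-noise signals below stand in for the EEMD-extracted oscillations.

```python
# Sketch of a 1:1 phase synchronization index between two oscillations,
# using the Hilbert transform. Signals are simulated surrogates.
import numpy as np
from scipy.signal import hilbert

fs = 4.0                                   # 4 Hz resampled series
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)        # ~15 breaths/min respiratory surrogate
rri_osc = np.sin(2 * np.pi * 0.25 * t + 0.4) + 0.3 * np.random.randn(t.size)

phase_resp = np.angle(hilbert(resp))
phase_rri = np.angle(hilbert(rri_osc))
dphi = phase_rri - phase_resp

# Index near 1 means strong phase locking, near 0 means no coupling.
psi = abs(np.mean(np.exp(1j * dphi)))
print("phase synchronization index:", round(psi, 3))
```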

  17. Tertiary structure-based analysis of microRNA–target interactions

    PubMed Central

    Gan, Hin Hark; Gunsalus, Kristin C.

    2013-01-01

    Current computational analysis of microRNA interactions is based largely on primary and secondary structure analysis. Computationally efficient tertiary structure-based methods are needed to enable more realistic modeling of the molecular interactions underlying miRNA-mediated translational repression. We incorporate algorithms for predicting duplex RNA structures, ionic strength effects, duplex entropy and free energy, and docking of duplex–Argonaute protein complexes into a pipeline to model and predict miRNA–target duplex binding energies. To ensure modeling accuracy and computational efficiency, we use an all-atom description of RNA and a continuum description of ionic interactions using the Poisson–Boltzmann equation. Our method predicts the conformations of two constructs of Caenorhabditis elegans let-7 miRNA–target duplexes to an accuracy of ∼3.8 Å root mean square distance of their NMR structures. We also show that the computed duplex formation enthalpies, entropies, and free energies for eight miRNA–target duplexes agree with titration calorimetry data. Analysis of duplex–Argonaute docking shows that structural distortions arising from single-base-pair mismatches in the seed region influence the activity of the complex by destabilizing both duplex hybridization and its association with Argonaute. Collectively, these results demonstrate that tertiary structure-based modeling of miRNA interactions can reveal structural mechanisms not accessible with current secondary structure-based methods. PMID:23417009

  18. Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.; Mavris, Dimitri N.

    2006-01-01

    An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
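
    The statistical side of the approach (response surface fitting plus Monte Carlo propagation) can be sketched as follows; the quadratic "noise metric" function is a made-up stand-in for the physics-based prediction code, and the two design variables are hypothetical.

```python
# Sketch: fit a quadratic response surface equation (RSE) to a design of
# experiments, then propagate input uncertainty through it with Monte Carlo.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(3)

def noise_metric(x):
    # Placeholder for the expensive physics-based analysis.
    return 100 + 5 * x[:, 0] - 3 * x[:, 1] + 2 * x[:, 0] * x[:, 1]

X_doe = rng.uniform(-1, 1, size=(50, 2))   # design of experiments, 2 design variables
rse = make_pipeline(PolynomialFeatures(2), LinearRegression()).fit(X_doe, noise_metric(X_doe))

# Monte Carlo: sample design-variable uncertainty and evaluate the cheap RSE
# instead of rerunning the physics-based code.
X_mc = rng.normal(0.0, 0.2, size=(10_000, 2))
metric = rse.predict(X_mc)
print("mean ≈", round(metric.mean(), 2), " 95th percentile ≈", round(np.percentile(metric, 95), 2))
```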

  19. iSeq: Web-Based RNA-seq Data Analysis and Visualization.

    PubMed

    Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng

    2018-01-01

    Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amount of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server, for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn .

  20. Analysis of Aerospike Plume Induced Base-Heating Environment

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    1998-01-01

    Computational analysis is conducted to study the effect of an aerospike engine plume on the X-33 base-heating environment during ascent flight. To properly account for the effects of the forebody and aftbody flowfields, such as shocks, and to allow for potential plume-induced flow separation, the thermo-flowfield of trajectory points is computed. The computational methodology is based on a three-dimensional finite-difference, viscous flow, chemically reacting, pressure-based computational fluid dynamics formulation, and a three-dimensional, finite-volume, spectral-line-based weighted-sum-of-gray-gases radiation absorption model computational heat transfer formulation. The predicted convective and radiative base-heat fluxes are presented.

  1. Automated image-based phenotypic analysis in zebrafish embryos

    PubMed Central

    Vogt, Andreas; Cholewinski, Andrzej; Shen, Xiaoqiang; Nelson, Scott; Lazo, John S.; Tsang, Michael; Hukriede, Neil A.

    2009-01-01

    Presently, the zebrafish is the only vertebrate model compatible with contemporary paradigms of drug discovery. Zebrafish embryos are amenable to the automation necessary for high-throughput chemical screens, and optical transparency makes them potentially suited for image-based screening. However, the lack of tools for automated analysis of complex images presents an obstacle to utilizing the zebrafish as a high-throughput screening model. We have developed an automated system for imaging and analyzing zebrafish embryos in multi-well plates regardless of embryo orientation and without user intervention. Images of fluorescent embryos were acquired on a high-content reader and analyzed using an artificial intelligence-based image analysis method termed Cognition Network Technology (CNT). CNT reliably detected transgenic fluorescent embryos (Tg(fli1:EGFP)y1) arrayed in 96-well plates and quantified intersegmental blood vessel development in embryos treated with small molecule inhibitors of angiogenesis. The results demonstrate that it is feasible to adapt image-based high-content screening methodology to measure complex whole organism phenotypes. PMID:19235725

  2. CASAS: Cancer Survival Analysis Suite, a web based application

    PubMed Central

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis:  quantile, landmark and competing risks, in addition to standard survival analysis.  The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots.  Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946

  3. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  4. Integrated Formulation of Beacon-Based Exception Analysis for Multimissions

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail

    2003-01-01

    Further work on beacon-based exception analysis for multimissions (BEAM), a method of real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and suitability for application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM expands upon previous advanced techniques for analysis of signal data, utilizing mathematical modeling of the system physics and expert-system reasoning.

  5. Arterial stiffness estimation based photoplethysmographic pulse wave analysis

    NASA Astrophysics Data System (ADS)

    Huotari, Matti; Maatta, Kari; Kostamovaara, Juha

    2010-11-01

    Arterial stiffness is one of the indices of vascular health and is assessed by pulse wave analysis. In our case, the pulse waveform is decomposed in order to estimate and determine arterial elasticity: the waveform is first measured optically with a photoplethysmograph and then decomposed into four lognormal pulse waveforms, for which a very good fit can be found between the original and the summed decomposed pulse wave. Several studies have demonstrated that these kinds of measures predict cardiovascular events, while dynamic factors such as arterial stiffness depend on fixed structural features of the vascular wall. Arterial stiffness is estimated based on pulse wave decomposition analysis in the radial and tibial arteries. Elucidation of the precise relationship between endothelial function and vascular stiffness still awaits further study.
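
    The decomposition step can be illustrated with a least-squares fit of lognormal components to a pulse waveform; the paper uses four components, but the hedged sketch below fits two to a synthetic pulse to keep the example short.

```python
# Sketch: fit a sum of lognormal components to a (synthetic) pulse waveform
# with SciPy's curve_fit; the real analysis uses four components on PPG data.
import numpy as np
from scipy.optimize import curve_fit

def lognormal_wave(t, a, mu, sigma):
    out = np.zeros_like(t)
    pos = t > 0
    out[pos] = a / (t[pos] * sigma) * np.exp(-(np.log(t[pos]) - mu) ** 2 / (2 * sigma ** 2))
    return out

def two_lognormals(t, a1, m1, s1, a2, m2, s2):
    return lognormal_wave(t, a1, m1, s1) + lognormal_wave(t, a2, m2, s2)

t = np.linspace(0.01, 1.0, 200)
pulse = two_lognormals(t, 1.0, -1.2, 0.4, 0.5, -0.5, 0.3) + 0.01 * np.random.randn(t.size)

p0 = [1.0, -1.0, 0.5, 0.5, -0.6, 0.4]                 # rough initial guesses
params, _ = curve_fit(two_lognormals, t, pulse, p0=p0)
print("fitted (amplitude, mu, sigma) pairs:", np.round(params, 2))
```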

  6. Ancestor-descendant relationships in evolution: origin of the extant pygmy right whale, Caperea marginata.

    PubMed

    Tsai, Cheng-Hsiu; Fordyce, R Ewan

    2015-01-01

    Ancestor-descendant relationships (ADRs), involving descent with modification, are the fundamental concept in evolution, but are usually difficult to recognize. We examined the cladistic relationship between the only reported fossil pygmy right whale, †Miocaperea pulchra, and its sole living relative, the enigmatic pygmy right whale Caperea marginata, the latter represented by both adult and juvenile specimens. †Miocaperea is phylogenetically bracketed between juvenile and adult Caperea marginata in morphologically based analyses, thus suggesting a possible ADR-the first so far identified within baleen whales (Cetacea: Mysticeti). The †Miocaperea-Caperea lineage may show long-term morphological stasis and, in turn, punctuated equilibrium. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  7. Centre of pressure patterns in the golf swing: individual-based analysis.

    PubMed

    Ball, Kevin; Best, Russell

    2012-06-01

    Weight transfer has been identified as important in group-based analyses. The aim of this study was to extend this work by examining the importance of weight transfer in the golf swing on an individual basis. Five professional and amateur golfers performed 50 swings with the driver, hitting a ball into a net. The golfer's centre of pressure position and velocity, parallel with the line of shot, were measured by two force plates at eight swing events that were identified from high-speed video. The relationships between these parameters and club head velocity at ball contact were examined using regression statistics. The results did support the use of group-based analysis, with all golfers returning significant relationships. However, results were also individual-specific, with golfers returning different combinations of significant factors. Furthermore, factors not identified in group-based analysis were significant on an individual basis. The most consistent relationship was a larger weight transfer range associated with a larger club head velocity (p < 0.05). All golfers also returned at least one significant relationship with rate of weight transfer at swing events (p < 0.01). Individual-based analysis should form part of performance-based biomechanical analysis of sporting skills.

  8. Cluster-based exposure variation analysis

    PubMed Central

    2013-01-01

    Background Static posture, repetitive movements and lack of physical variation are known risk factors for work-related musculoskeletal disorders, and thus need to be properly assessed in occupational studies. The aims of this study were (i) to investigate the effectiveness of a conventional exposure variation analysis (EVA) in discriminating exposure time lines and (ii) to compare it with a new cluster-based method for analysis of exposure variation. Methods For this purpose, we simulated a repeated cyclic exposure varying within each cycle between “low” and “high” exposure levels in a “near” or “far” range, and with “low” or “high” velocities (exposure change rates). The duration of each cycle was also manipulated by selecting a “small” or “large” standard deviation of the cycle time. These parameters reflected three dimensions of exposure variation, i.e. range, frequency and temporal similarity. Each simulation trace included two realizations of 100 concatenated cycles with either low (ρ = 0.1), medium (ρ = 0.5) or high (ρ = 0.9) correlation between the realizations. These traces were analyzed by conventional EVA, and a novel cluster-based EVA (C-EVA). Principal component analysis (PCA) was applied on the marginal distributions of 1) the EVA of each of the realizations (univariate approach), 2) a combination of the EVA of both realizations (multivariate approach) and 3) C-EVA. The least number of principal components describing more than 90% of variability in each case was selected and the projection of marginal distributions along the selected principal component was calculated. A linear classifier was then applied to these projections to discriminate between the simulated exposure patterns, and the accuracy of classified realizations was determined. Results C-EVA classified exposures more correctly than univariate and multivariate EVA approaches; classification accuracy was 49%, 47% and 52% for EVA (univariate

  9. Atolchelys lepida, a new side-necked turtle from the Early Cretaceous of Brazil and the age of crown Pleurodira

    PubMed Central

    Romano, Pedro S. R.; Gallo, Valéria; Ramos, Renato R. C.; Antonioli, Luzia

    2014-01-01

    We report a new pleurodiran turtle from the Barremian Morro do Chaves Formation, Sergipe-Alagoas Basin, Brazil. We tested the phylogenetic position of Atolchelys lepida gen. et sp. nov. by including it in a comprehensive cladistic analysis of pleurodires. The new species is a basal member of Bothremydidae and simultaneously the oldest unambiguous crown Pleurodira. The biogeographic and chronostratigraphic significance of the finding has implications for the calibration of molecular clocks studies by pushing back the minimum age of crown Pleurodira by more than 12 Ma (ca 125 Ma). The reanalysis of Pelomedusoides relationships provides evidence that the early evolution and relationships among the main lineages of side-necked turtles can be explained, at least partially, by a sequence of vicariance events. PMID:25079494

  10. Heading in the right direction: thermodynamics-based network analysis and pathway engineering.

    PubMed

    Ataman, Meric; Hatzimanikatis, Vassily

    2015-12-01

    Thermodynamics-based network analysis through the introduction of thermodynamic constraints in metabolic models allows a deeper analysis of metabolism and guides pathway engineering. The number and the areas of applications of thermodynamics-based network analysis methods have been increasing in the last ten years. We review recent applications of these methods, identify the areas to which such analysis can contribute significantly, and outline the needs for future developments. We find that organisms with multiple compartments and extremophiles present challenges for modeling and thermodynamics-based flux analysis. The evolution of current and new methods must also address the issues of the multiple alternatives in flux directionalities and the uncertainties and partial information from analytical methods. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Phylogeny of Valerianaceae based on matK and ITS markers, with reference to matK individual polymorphism

    PubMed Central

    HIDALGO, ORIANE; GARNATJE, TERESA; SUSANNA, ALFONSO; MATHEZ, JOËL

    2004-01-01

    • Background and Aims The monophyly of Valerianaceae and the precise delimitation of the family are not totally resolved. Our knowledge on the phylogeny of the group is only partial: on a morphological basis, some contradicting taxonomic proposals have been published, which demonstrates the difficulties in establishing a natural classification of the family and especially in proposing a relevant treatment of the large genus Valeriana. The aims of this study are to contribute to the phylogeny and generic delineation of the Valerianaceae on the basis of molecular data. • Methods A cladistic analysis of the sequences of one plastid (matK) and one nuclear (ITS) molecular marker was carried out, both individually and in combination. • Key Results The results of the analyses of both regions confirm that the family is monophyletic, with the exclusion of Triplostegia. The tribe Patrinieae is monophyletic, and the tribe Valerianeae is also a natural group. Two of the subtribes of Valerianeae, Fediinae and Centranthinae, are also monophyletic, with the exclusion of the genus Plectritis from Fediinae. The subtribe Valerianinae, on the other hand, is paraphyletic. • Conclusions Our results confirm, for the first time on a molecular basis, the suggested paraphyly of Valeriana in its present circumscription, with profound nomenclatural and taxonomic implications. The correlation between molecular phylogeny and biogeography is close. In the course of the plastid DNA sequencing, a polymorphism concerning the matK gene was found, a fact that should be carefully evaluated in phylogenetic analyses. PMID:14988097

  12. Content Analysis of a Computer-Based Faculty Activity Repository

    ERIC Educational Resources Information Center

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  13. [Model-based biofuels system analysis: a review].

    PubMed

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  14. Advancing School-Based Interventions through Economic Analysis

    ERIC Educational Resources Information Center

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  15. Computer-Based Interaction Analysis with DEGREE Revisited

    ERIC Educational Resources Information Center

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  16. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  17. Analysis of Load Stress for Asphalt Pavement of Lean Concrete Base

    NASA Astrophysics Data System (ADS)

    Lijun, Suo; Xinwu, Wang

    Research has shown that whether early distress appears in asphalt pavement depends largely on the working performance of the base. In the field of asphalt pavement it is widely accepted that a lean concrete base, compared with the general semi-rigid base, has better working performance, such as high strength and good resistance to erosion. The problem of early distress in asphalt pavement caused by heavy traffic loading can be settled effectively when lean concrete is used in the pavement. Traffic loading is an important parameter in new pavement design analysis; however, few studies have done extensive and intensive research on the load stress of asphalt pavement with a lean concrete base, so it is necessary to study the load stress of this type of pavement. In this paper, a three-dimensional finite element model of the asphalt pavement is first created in order to carry out a mechanical analysis of the pavement. Two main objectives are then investigated: the load stress in the lean concrete base, and the load stress in the asphalt surface. The results show that the load stress of the lean concrete base decreases, decreases, and increases with increases in the base thickness, the surface thickness, and the ratio of base modulus to foundation modulus, respectively. As far as the asphalt surface is concerned, the maximum load-induced shearing stress appears in the asphalt surface located at the transverse contraction joints of the lean concrete base. This maximum shearing stress decreases, decreases, decreases, and increases with increases in the surface modulus, the surface thickness, the base thickness, and the ratio of base modulus to foundation modulus, respectively.

  18. Data Base Reexamination as Part of IDS Secondary Analysis.

    ERIC Educational Resources Information Center

    Curry, Blair H.; And Others

    Data reexamination is a critical component for any study. The complexity of the study, the time available for data base development and analysis, and the relationship of the study to educational policy-making can all increase the criticality of such reexamination. Analysis of the error levels in the National Institute of Education's Instructional…

  19. Laser-Based Lighting: Experimental Analysis and Perspectives

    PubMed Central

    Yushchenko, Maksym; Buffolo, Matteo; Meneghini, Matteo; Zanoni, Enrico

    2017-01-01

    This paper presents an extensive analysis of the operating principles, theoretical background, advantages and limitations of laser-based lighting systems. In the first part of the paper we discuss the main advantages and issues of laser-based lighting, and present a comparison with conventional LED-lighting technology. In the second part of the paper, we present original experimental data on the stability and reliability of phosphor layers for laser lighting, based on high light-intensity and high-temperature degradation tests. In the third part of the paper (for the first time) we present a detailed comparison between three different solutions for laser lighting, based on (i) transmissive phosphor layers; (ii) a reflective/angled phosphor layer; and (iii) a parabolic reflector, by discussing the advantages and drawbacks of each approach. The results presented within this paper can be used as a guideline for the development of advanced lighting systems based on laser diodes. PMID:29019958

  20. Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun; Niu, Hongli

    2015-10-01

    Based on the epidemic dynamical system, we construct a new agent-based financial time series model. In order to check and verify its rationality, we compare the statistical properties of the time series model with those of the real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine the multi-parameter analysis with the tail distribution analysis, the modified rescaled range analysis, and the multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
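
    One of the diagnostics named above, rescaled range analysis, reduces to a few lines; the sketch below estimates a Hurst exponent from the slope of log(R/S) against log(window size) on an i.i.d. noise series (so the estimate should come out near 0.5). The modified R/S statistic and MF-DFA used in the paper add corrections on top of this basic form.

```python
# Basic rescaled range (R/S) estimate of the Hurst exponent of a return series.
import numpy as np

def hurst_rs(returns, window_sizes=(16, 32, 64, 128, 256)):
    rs = []
    for w in window_sizes:
        chunks = returns[: len(returns) // w * w].reshape(-1, w)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)    # range of cumulative deviations
        s = chunks.std(axis=1)                   # standard deviation per window
        rs.append(np.mean(r / s))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

returns = np.random.default_rng(4).normal(size=4096)    # i.i.d. noise -> H near 0.5
print("estimated Hurst exponent:", round(hurst_rs(returns), 2))
```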

  1. Breath analysis based on micropreconcentrator for early cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Seok

    2018-02-01

    We are developing micropreconcentrators based on micro/nanotechnology to detect trace levels of volatile organic compound (VOC) gases contained in human and canine exhaled breath. The possibility of using exhaled VOC gases as biomarkers for various cancer diagnoses has been previously discussed. For early cancer diagnosis, detection of trace levels of VOC gas is indispensable, and using micropreconcentrators based on MEMS technology or nanotechnology is very promising for VOC gas detection. A micropreconcentrator-based breath analysis technique also has advantages from the viewpoints of cost performance and availability for the diagnosis of various cancers. In this paper, we introduce the design, fabrication, and evaluation results of our MEMS- and nanotechnology-based micropreconcentrators. In the MEMS-based device, we propose a flower-leaf-type Si microstructure, and its shape and configuration are optimized quantitatively by finite element method simulation. The nanotechnology-based micropreconcentrator consists of carbon nanotube (CNT) structures. As a result, we achieve ppb-level VOC gas detection with our micropreconcentrators and a conventional gas chromatography system that by itself can detect VOC on the order of ppm in gas samples. In the performance evaluation, we also confirm that the CNT-based micropreconcentrator shows a 115 times better concentration ratio than the Si-based micropreconcentrator. Moreover, we discuss a commercialization idea for new cancer diagnosis using breath analysis. Future work and preliminary clinical testing in dogs are also discussed.

  2. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  3. Least-dependent-component analysis based on mutual information

    NASA Astrophysics Data System (ADS)

    Stögbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-12-01

    We propose to use precise estimators of mutual information (MI) to find the least dependent components in a linearly mixed signal. On the one hand, this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand, it has the advantage, compared to other implementations of “independent” component analysis (ICA), some of which are based on crude approximations for MI, that the numerical values of the MI can be used for (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output by comparing the pairwise MIs with those of remixed components; and (iii) clustering the output according to the residual interdependencies. For the MI estimator, we use a recently proposed k-nearest-neighbor-based algorithm. For time sequences, we combine this with delay embedding, in order to take into account nontrivial time correlations. After several tests with artificial data, we apply the resulting MILCA (mutual-information-based least dependent component analysis) algorithm to a real-world dataset, the ECG of a pregnant woman.
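
    Point (i), checking residual dependencies between output components, can be sketched with off-the-shelf tools: run a standard ICA and then estimate pairwise MI between the outputs with a k-nearest-neighbor estimator. scikit-learn's mutual_info_regression is used here as a stand-in for the MILCA estimator, and the two simulated sources are arbitrary.

```python
# Sketch: pairwise MI between ICA output components, estimated with a
# k-nearest-neighbor MI estimator, as a check on residual dependencies.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(5)
s = np.c_[np.sign(np.sin(2 * np.pi * 3 * np.linspace(0, 1, 2000))),  # square-wave source
          rng.laplace(size=2000)]                                     # noise source
x = s @ np.array([[1.0, 0.6], [0.4, 1.0]])                            # linear mixing

y = FastICA(n_components=2, random_state=0).fit_transform(x)          # unmixed outputs

mi = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        if i != j:
            mi[i, j] = mutual_info_regression(y[:, [i]], y[:, j], n_neighbors=3)[0]
print("pairwise MI between output components (nats):\n", mi.round(3))
```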

  4. [Concept analysis "Competency-based education"].

    PubMed

    Loosli, Clarence

    2016-03-01

    Competency-based education (CBE) stands out at the global level as the best educational practice. Indeed, CBE is supposed to improve the quality of care provided by newly graduated nurses. Yet there is a dearth of knowledge in the nursing literature regarding the definition of the CBE concept, and CBE is implemented differently in each institution, even within the same discipline in a single country. What does CBE in nursing education consist of? The aim of this work is to clarify the meaning of the CBE concept through a literature review in order to propose a definition. Wilson's concept analysis method framed our literature review, which drew on two databases: CINAHL and ERIC. Following Wilson's eleven analysis techniques, we identified CBE as a multidimensional concept clustering three dimensions: learning, teaching, and assessment. Nurse educators are accountable for providing society with high-performing newly graduated professionals, and schools should strive for visibility and transparency in the means they use to accomplish their educational activities. This first attempt to understand the CBE concept opens a debate concerning its further development and clarification, and this first description of the concept is a step toward its identification and assessment.

  5. Matching biomedical ontologies based on formal concept analysis.

    PubMed

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas the ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign
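
    The first (token-based) formal context listed above can be illustrated with a toy example: objects are class labels from the two ontologies, attributes are the lexical tokens they contain, and cross-ontology classes sharing the same token set become anchor candidates. The labels below are invented, and the snippet is only a sketch of the idea, not the FCA-Map implementation.

    ```python
    # Token-based formal context: objects are class labels, attributes are
    # lexical tokens; equal token sets across ontologies suggest anchors.
    import re

    onto_a = {"A:0001": "heart valve", "A:0002": "cardiac muscle tissue"}
    onto_b = {"B:10": "valve of heart", "B:20": "muscle tissue of heart"}

    def tokens(label, stop=frozenset({"of", "the"})):
        return frozenset(t for t in re.findall(r"[a-z]+", label.lower()) if t not in stop)

    context = {cls: tokens(lbl) for cls, lbl in {**onto_a, **onto_b}.items()}

    anchors = [(a, b) for a, ta in context.items() if a.startswith("A:")
                      for b, tb in context.items() if b.startswith("B:") and ta == tb]
    print(anchors)   # [('A:0001', 'B:10')]
    ```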

  6. Web-Based Virtual Laboratory for Food Analysis Course

    NASA Astrophysics Data System (ADS)

    Handayani, M. N.; Khoerunnisa, I.; Sugiarti, Y.

    2018-02-01

    Implementation of the food analysis course in the Agro-industrial Technology Education study program faces several problems. These include laboratory space and equipment that are not commensurate with the number of students, as well as a lack of interactive learning tools. On the other hand, students' information technology literacy is quite high and the internet is easily accessible on campus. This presents both a challenge and an opportunity for developing learning media that can help optimize learning in the laboratory. This study aims to develop a web-based virtual laboratory as an alternative learning medium for the food analysis course. The research follows the R & D (research and development) approach of the Borg & Gall model. Expert assessment of the developed web-based virtual laboratory, covering software engineering, visual communication, material relevance, usefulness, and language, showed that it is feasible as a learning medium. The results of the small-scale and wide-scale tests show that students strongly agree with the development of the web-based virtual laboratory, and their response to it was positive. Students' suggestions provide further opportunities for improving the web-based virtual laboratory and should be considered in further research.

  7. Nondeducibility-Based Analysis of Cyber-Physical Systems

    NASA Astrophysics Data System (ADS)

    Gamage, Thoshitha; McMillin, Bruce

    Controlling information flow in a cyber-physical system (CPS) is challenging because cyber domain decisions and actions manifest themselves as visible changes in the physical domain. This paper presents a nondeducibility-based observability analysis for CPSs. In many CPSs, the capacity of a low-level (LL) observer to deduce high-level (HL) actions ranges from limited to none. However, a collaborative set of observers strategically located in a network may be able to deduce all the HL actions. This paper models a distributed power electronics control device network using a simple DC circuit in order to understand the effect of multiple observers in a CPS. The analysis reveals that the number of observers required to deduce all the HL actions in a system increases linearly with the number of configurable units. A simple definition of nondeducibility based on the uniqueness of low-level projections is also presented. This definition is used to show that a system with two security domain levels could be considered “nondeducibility secure” if no unique LL projections exist.
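
    The uniqueness-based definition mentioned above can be checked mechanically: if any observable low-level projection is produced by exactly one high-level action, that action is deducible. The mapping below is a toy, not the paper's circuit model.

    ```python
    # Nondeducibility via uniqueness of low-level projections: a projection
    # seen by the LL observer that maps back to a single HL action breaks
    # "nondeducibility security".  Action and projection names are invented.
    from collections import defaultdict

    ll_projection = {                            # HL action -> observable LL reading
        "hl_load_shift_A": "bus_voltage_drop",
        "hl_load_shift_B": "bus_voltage_drop",   # shared projection: indistinguishable
        "hl_islanding":    "breaker_open",       # unique projection: deducible
    }

    by_projection = defaultdict(set)
    for action, projection in ll_projection.items():
        by_projection[projection].add(action)

    deducible = {p: a for p, a in by_projection.items() if len(a) == 1}
    print("nondeducibility secure:", not deducible)
    print("deducible HL actions:", deducible)
    ```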

  8. Hybrid diagnostic system: beacon-based exception analysis for multimissions - Livingstone integration

    NASA Technical Reports Server (NTRS)

    Park, Han G.; Cannon, Howard; Bajwa, Anupa; Mackey, Ryan; James, Mark; Maul, William

    2004-01-01

    This paper describes the initial integration of a hybrid reasoning system utilizing a continuous domain feature-based detector, Beacon-based Exceptions Analysis for Multimissions (BEAM), and a discrete domain model-based reasoner, Livingstone.

  9. a Buffer Analysis Based on Co-Location Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Huang, S.; Wang, H.; Zhang, R.; Wang, Q.; Sha, H.; Liu, X.; Pan, Q.

    2018-05-01

    Buffer analysis is a common spatial-analysis tool that deals with proximity problems in GIS; it examines the relationship between a central object and the other objects within a certain distance of it. Buffer analysis can present complicated problems more scientifically and visually and provide valuable information for users. Over the past decades, a great deal of research has been devoted to buffer analysis. As the accuracy required of spatial analysis increases, users expect its results to express the actual situation more exactly. Owing to various factors, the extent to which a geographic element influences or contacts the surrounding objects is uncertain. Each object in nature has its own characteristics and rules of change; objects are both independent of and related to one another. However, almost all existing buffer-generation algorithms are based on a fixed buffer distance and do not consider the co-location relationships among instances. Consequently, resources are wasted retrieving useless information while useful information is ignored.
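
    For reference, the conventional fixed-distance buffer that the paper argues against can be generated in a few lines; the coordinates and the 200-unit radius below are purely illustrative.

    ```python
    # Conventional fixed-distance buffers: every feature receives the same
    # radius, with no regard to co-location relationships among instances.
    from shapely.geometry import Point

    facilities = [Point(0, 0), Point(350, 120), Point(90, 400)]
    buffers = [p.buffer(200.0) for p in facilities]      # same distance for all

    target = Point(150, 80)
    print([i for i, b in enumerate(buffers) if b.contains(target)])
    ```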

  10. Community-based care for the specialized management of heart failure: an evidence-based analysis.

    PubMed

    2009-01-01

    In August 2008, the Medical Advisory Secretariat (MAS) presented a vignette to the Ontario Health Technology Advisory Committee (OHTAC) on a proposed targeted health care delivery model for chronic care. The proposed model was defined as multidisciplinary, ambulatory, community-based care that bridged the gap between primary and tertiary care, and was intended for individuals with a chronic disease who were at risk of a hospital admission or emergency department visit. The goals of this care model were thought to include: the prevention of emergency department visits, a reduction in hospital admissions and re-admissions, facilitation of earlier hospital discharge, a reduction or delay in long-term care admissions, and an improvement in mortality and other disease-specific patient outcomes. OHTAC approved the development of an evidence-based assessment to determine the effectiveness of specialized community-based care for the management of heart failure, Type 2 diabetes and chronic wounds. PLEASE VISIT THE MEDICAL ADVISORY SECRETARIAT WEB SITE AT: www.health.gov.on.ca/ohtas to review the following reports associated with the Specialized Multidisciplinary Community-Based Care series: "Specialized multidisciplinary community-based care series: a summary of evidence-based analyses"; "Community-based care for the specialized management of heart failure: an evidence-based analysis"; "Community-based care for chronic wound management: an evidence-based analysis". Please note that the evidence-based analysis of specialized community-based care for the management of diabetes titled "Community-based care for the management of type 2 diabetes: an evidence-based analysis" has been published as part of the Diabetes Strategy Evidence Platform at this URL: http://www.health.gov.on.ca/english/providers/program/mas/tech/ohtas/tech_diabetes_20091020.html PLEASE VISIT THE TORONTO HEALTH ECONOMICS AND TECHNOLOGY ASSESSMENT COLLABORATIVE WEB SITE AT: http

  11. Principle-based analysis of the concept of telecare.

    PubMed

    Solli, Hilde; Bjørk, Ida Torunn; Hvalvik, Sigrun; Hellesø, Ragnhild

    2012-12-01

    To report a concept analysis of telecare. Lately telecare has become a worldwide, modern way of giving care over distance by means of technology. Other concepts, like telemedicine, e-health, and telehealth, focus on the same topic, though the boundaries between them seem to be blurred. Sources comprise 44 English-language research articles retrieved from the Medline and Cinahl databases (1995-October 2011). The design was a literature review. A principle-based analysis was undertaken through content analysis of the definitions, attributes, preconditions, and outcomes of the concept. The attributes are well described according to the use of technology, caring activity, persons involved, and accessibility. Preconditions and outcomes are well described concerning individual and health-political needs and benefits. The concept did not hold its boundaries through theoretical integration with the concepts of telemedicine and telehealth. The definition of telecare competes with concepts like home-based e-health, telehomecare, telephonecare, telephone-based psychosocial services, telehealth, and telemedicine. Assessment of the definitions resulted in a suggestion of a new definition: Telecare is the use of information, communication, and monitoring technologies which allow healthcare providers to remotely evaluate health status, give educational intervention, or deliver health and social care to patients in their homes. The logical principle was assessed to be partly immature, whereas the pragmatic and linguistic principles were found to be mature. A new definition is suggested and this has moved the epistemological principle forward to maturity. © 2012 Blackwell Publishing Ltd.

  12. Spreadsheet-based engine data analysis tool - user's guide.

    DOT National Transportation Integrated Search

    2016-07-01

    This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...

  13. Performance analysis of mini-propellers based on FlightGear

    NASA Astrophysics Data System (ADS)

    Vogeltanz, Tomáš

    2016-06-01

    This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis, it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality in respect of aircraft requirements. In the first section, the software used for the analysis is illustrated. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.

  14. New Oligocene primate from Saudi Arabia and the divergence of apes and Old World monkeys.

    PubMed

    Zalmout, Iyad S; Sanders, William J; Maclatchy, Laura M; Gunnell, Gregg F; Al-Mufarreh, Yahya A; Ali, Mohammad A; Nasser, Abdul-Azziz H; Al-Masari, Abdu M; Al-Sobhi, Salih A; Nadhra, Ayman O; Matari, Adel H; Wilson, Jeffrey A; Gingerich, Philip D

    2010-07-15

    It is widely understood that Hominoidea (apes and humans) and Cercopithecoidea (Old World monkeys) have a common ancestry as Catarrhini deeply rooted in Afro-Arabia. The oldest stem Catarrhini in the fossil record are Propliopithecoidea, known from the late Eocene to early Oligocene epochs (roughly 35-30 Myr ago) of Egypt, Oman and possibly Angola. Genome-based estimates for divergence of hominoids and cercopithecoids range into the early Oligocene; however, the mid-to-late Oligocene interval from 30 to 23 Myr ago has yielded little fossil evidence documenting the morphology of the last common ancestor of hominoids and cercopithecoids, the timing of their divergence, or the relationship of early stem and crown catarrhines. Here we describe the partial cranium of a new medium-sized (about 15-20 kg) fossil catarrhine, Saadanius hijazensis, dated to 29-28 Myr ago. Comparative anatomy and cladistic analysis shows that Saadanius is an advanced stem catarrhine close to the base of the hominoid-cercopithecoid clade. Saadanius is important for assessing competing hypotheses about the ancestral morphotype for crown catarrhines, early catarrhine phylogeny and the age of hominoid-cercopithecoid divergence. Saadanius has a tubular ectotympanic but lacks synapomorphies of either group of crown Catarrhini, and we infer that the hominoid-cercopithecoid split happened later, between 29-28 and 24 Myr ago.

  15. Identification and phylogeny of Arabian snakes: Comparison of venom chromatographic profiles versus 16S rRNA gene sequences.

    PubMed

    Al Asmari, Abdulrahman; Manthiri, Rajamohammed Abbas; Khan, Haseeb Ahmad

    2014-11-01

    Identification of snake species is important for various reasons including the emergency treatment of snake bite victims. We present a simple method for identification of six snake species using the gel filtration chromatographic profiles of their venoms. The venoms of Echis coloratus, Echis pyramidum, Cerastes gasperettii, Bitis arietans, Naja arabica, and Walterinnesia aegyptia were milked, lyophilized, diluted and centrifuged to separate the mucus from the venom. The clear supernatants were filtered and chromatographed on fast protein liquid chromatography (FPLC). We obtained the 16S rRNA gene sequences of the above species and performed phylogenetic analysis using the neighbor-joining method. The chromatograms of venoms from different snake species showed peculiar patterns based on the number and location of peaks. The dendrograms generated from similarity matrix based on the presence/absence of particular chromatographic peaks clearly differentiated Elapids from Viperids. Molecular cladistics using 16S rRNA gene sequences resulted in jumping clades while separating the members of these two families. These findings suggest that chromatographic profiles of snake venoms may provide a simple and reproducible chemical fingerprinting method for quick identification of snake species. However, the validation of this methodology requires further studies on large number of specimens from within and across species.
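
    The dendrogram step described above (similarity from the presence or absence of chromatographic peaks) reduces to a few lines of standard clustering; the 0/1 peak table below is invented, and scipy's average-linkage clustering merely stands in for whatever procedure the authors used.

    ```python
    # Binary peak table (species x retention-time bins) -> Jaccard distances ->
    # hierarchical clustering.  The peak data are illustrative only.
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, dendrogram

    species = ["E. coloratus", "E. pyramidum", "C. gasperettii",
               "B. arietans", "N. arabica", "W. aegyptia"]
    peaks = np.array([[1, 1, 0, 1, 0, 0],
                      [1, 1, 0, 1, 0, 1],
                      [1, 0, 1, 1, 0, 0],
                      [0, 1, 1, 1, 1, 0],
                      [0, 0, 1, 0, 1, 1],
                      [0, 0, 1, 0, 1, 1]], dtype=bool)

    distances = pdist(peaks, metric="jaccard")            # condensed distance matrix
    tree = linkage(distances, method="average")           # UPGMA-style clustering
    dendrogram(tree, labels=species, no_plot=True)        # set no_plot=False to draw
    print(tree)
    ```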

  16. Phylogenetic analyses of Andromedeae (Ericaceae subfam. Vaccinioideae).

    PubMed

    Kron, K A; Judd, W S; Crayn, D M

    1999-09-01

    Phylogenetic relationships within the Andromedeae and closely related taxa were investigated by means of cladistic analyses based on phenotypic (morphology, anatomy, chromosome number, and secondary chemistry) and molecular (rbcL and matK nucleotide sequences) characters. An analysis based on combined molecular and phenotypic characters indicates that the tribe is composed of two major clades-the Gaultheria group (incl. Andromeda, Chamaedaphne, Diplycosia, Gaultheria, Leucothoë, Pernettya, Tepuia, and Zenobia) and the Lyonia group (incl. Agarista, Craibiodendron, Lyonia, and Pieris). Andromedeae are shown to be paraphyletic in all analyses because the Vaccinieae link with some or all of the genera of the Gaultheria group. Oxydendrum is sister to the clade containing the Vaccinieae, Gaultheria group, and Lyonia group. The monophyly of Agarista, Lyonia, Pieris, and Gaultheria (incl. Pernettya) is supported, while that of Leucothoë is problematic. The close relationship of Andromeda and Zenobia is novel and was strongly supported in the molecular (but not morphological) analyses. Diplycosia, Tepuia, Gaultheria, and Pernettya form a well-supported clade, which can be diagnosed by the presence of fleshy calyx lobes and methyl salicylate. Recognition of Andromedeae is not reflective of our understanding of geneological relationships and should be abandoned; the Lyonia group is formally recognized at the tribal level.

  17. Exploratory Analysis of Supply Chains in the Defense Industrial Base

    DTIC Science & Technology

    2012-04-01

    The available record text for this report consists of cover and front-matter fragments rather than an abstract: it identifies the study "Exploratory Analysis of Supply Chains in the Defense Industrial Base," prepared by the Institute for Defense Analyses (author listed: James R. Dominy) under contract DASW01-04-C-0003, task AH-7-3315, for the Director, Industrial Policy, and lists industry classifications such as Industry Group 382 (Laboratory Apparatus and Analytical, Optical, Measuring, and Controlling Instruments) and 3821 (Laboratory Apparatus and Furniture).

  18. Knowledge-based low-level image analysis for computer vision systems

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.

    1988-01-01

    Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.

  19. Connotations of pixel-based scale effect in remote sensing and the modified fractal-based analysis method

    NASA Astrophysics Data System (ADS)

    Feng, Guixiang; Ming, Dongping; Wang, Min; Yang, Jianyu

    2017-06-01

    Scale problems are a major source of concern in the field of remote sensing. Since remote sensing is a complex technological system, the connotations of scale and scale effect in remote sensing are not yet sufficiently understood. Thus, this paper first introduces the connotations of pixel-based scale and summarizes the general understanding of the pixel-based scale effect. Pixel-based scale effect analysis is essential for choosing appropriate remote sensing data and proper processing parameters. Fractal dimension is a useful measure for analyzing pixel-based scale. However, traditional fractal dimension calculations do not consider the impact of spatial resolution, so the change of scale effect with spatial resolution cannot be clearly reflected. Therefore, this paper proposes to use spatial resolution as the modified scale parameter of two fractal methods to further analyze the pixel-based scale effect. To verify the two modified methods, MFBM (Modified Windowed Fractal Brownian Motion Based on the Surface Area) and MDBM (Modified Windowed Double Blanket Method), the existing information entropy method of scale effect analysis is used for evaluation, and six sub-regions of building areas and farmland areas cut out from QuickBird images are used as the experimental data. The results of the experiment show that both the fractal dimension and the information entropy present the same trend with decreasing spatial resolution, and some inflection points appear at the same feature scales. Further analysis shows that these feature scales (corresponding to the inflection points) are related to the actual sizes of the geo-objects, which result in fewer mixed pixels in the image, and that the inflection points are significantly indicative of the observed features. Therefore, the experimental results indicate that the modified fractal methods are effective in reflecting the pixel-based scale effect present in remote sensing images.
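
    The evaluation idea used above (information entropy tracked across spatial resolutions) can be sketched in a few lines; the synthetic image below stands in for the QuickBird sub-regions and the aggregation is a simple block mean, so the numbers are only illustrative.

    ```python
    # Shannon entropy of an image recomputed at progressively coarser
    # resolutions; inflection points in such curves mark feature scales.
    import numpy as np

    def shannon_entropy(img, bins=64):
        hist, _ = np.histogram(img, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def downsample(img, factor):
        h = (img.shape[0] // factor) * factor
        w = (img.shape[1] // factor) * factor
        blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
        return blocks.mean(axis=(1, 3))                   # block-mean aggregation

    rng = np.random.default_rng(1)
    image = rng.normal(size=(512, 512)).cumsum(axis=0).cumsum(axis=1)  # correlated field

    for factor in (1, 2, 4, 8, 16, 32):
        print(factor, round(shannon_entropy(downsample(image, factor)), 3))
    ```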

  20. A knowledge base for Vitis vinifera functional analysis.

    PubMed

    Pulvirenti, Alfredo; Giugno, Rosalba; Distefano, Rosario; Pigola, Giuseppe; Mongiovi, Misael; Giudice, Girolamo; Vendramin, Vera; Lombardo, Alessandro; Cattonaro, Federica; Ferro, Alfredo

    2015-01-01

    Vitis vinifera (Grapevine) is the most important fruit species in the modern world. Wine and table grapes sales contribute significantly to the economy of major wine producing countries. The most relevant goals in wine production concern quality and safety. In order to significantly improve the achievement of these objectives and to gain biological knowledge about cultivars, a genomic approach is the most reliable strategy. The recent grapevine genome sequencing offers the opportunity to study the potential roles of genes and microRNAs in fruit maturation and other physiological and pathological processes. Although several systems allowing the analysis of plant genomes have been reported, none of them has been designed specifically for the functional analysis of grapevine genomes of cultivars under environmental stress in connection with microRNA data. Here we introduce a novel knowledge base, called BIOWINE, designed for the functional analysis of Vitis vinifera genomes of cultivars present in Sicily. The system allows the analysis of RNA-seq experiments of two different cultivars, namely Nero d'Avola and Nerello Mascalese. Samples were taken under different climatic conditions of phenological phases, diseases, and geographic locations. The BIOWINE web interface is equipped with data analysis modules for grapevine genomes. In particular users may analyze the current genome assembly together with the RNA-seq data through a customized version of GBrowse. The web interface allows users to perform gene set enrichment by exploiting third-party databases. BIOWINE is a knowledge base implementing a set of bioinformatics tools for the analysis of grapevine genomes. The system aims to increase our understanding of the grapevine varieties and species of Sicilian products focusing on adaptability to different climatic conditions, phenological phases, diseases, and geographic locations.

  1. Terminal Restriction Fragment Length Polymorphism Analysis Program, a Web-Based Research Tool for Microbial Community Analysis

    PubMed Central

    Marsh, Terence L.; Saxman, Paul; Cole, James; Tiedje, James

    2000-01-01

    Rapid analysis of microbial communities has proven to be a difficult task. This is due, in part, to both the tremendous diversity of the microbial world and the high complexity of many microbial communities. Several techniques for community analysis have emerged over the past decade, and most take advantage of the molecular phylogeny derived from 16S rRNA comparative sequence analysis. We describe a web-based research tool located at the Ribosomal Database Project web site (http://www.cme.msu.edu/RDP/html/analyses.html) that facilitates microbial community analysis using terminal restriction fragment length polymorphism of 16S ribosomal DNA. The analysis function (designated TAP T-RFLP) permits the user to perform in silico restriction digestions of the entire 16S sequence database and derive terminal restriction fragment sizes, measured in base pairs, from the 5′ terminus of the user-specified primer to the 3′ terminus of the restriction endonuclease target site. The output can be sorted and viewed either phylogenetically or by size. It is anticipated that the site will guide experimental design as well as provide insight into interpreting results of community analysis with terminal restriction fragment length polymorphisms. PMID:10919828
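
    The core computation behind such an in silico digest, measuring from the 5′ end of the forward primer to the 3′ end of the first restriction target site, is essentially a string search. The sequence below is fabricated (a standard 16S forward primer and the BamHI site are used purely as examples), so this is a sketch of the calculation rather than the TAP T-RFLP code.

    ```python
    # In silico terminal restriction fragment (T-RF) size: distance from the
    # 5' end of the primer to the 3' end of the first enzyme recognition site.
    def terminal_fragment_length(seq, primer, site):
        seq, primer, site = seq.upper(), primer.upper(), site.upper()
        start = seq.find(primer)
        if start == -1:
            return None                        # primer does not anneal in silico
        cut = seq.find(site, start)
        if cut == -1:
            return None                        # endonuclease never cuts
        return cut + len(site) - start         # fragment size in base pairs

    rrna = "AGAGTTTGATCCTGGCTCAG" + "ACGT" * 60 + "GGATCC" + "ACGT" * 40
    print(terminal_fragment_length(rrna,
                                   primer="AGAGTTTGATCCTGGCTCAG",   # 27F-like primer
                                   site="GGATCC"))                  # BamHI site
    ```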

  2. Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arpaia, P., E-mail: pasquale.arpaia@unina.it; Technology Department, European Organization for Nuclear Research; Girone, M., E-mail: mario.girone@cern.ch

    2015-12-15

    The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this end, an uncertainty model of the transducer is presented, comprising a valve model that exploits a finite-element approach and a virtual flowmeter model based on the Sereg-Schlumberger method. The models are validated experimentally on a case study for helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer metrological performance is assessed by a sensitivity analysis, based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the input range as a whole, at varying operating and setting conditions. This analysis turns out to be important for CERN cryogenics operation because the metrological design of the transducer is validated, and its components and working conditions with critical specifications for future improvements are identified.

  3. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.

  4. Research of second harmonic generation images based on texture analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan

    2014-09-01

    Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, as a potential noninvasive tool for imaging biological tissues, has been widely used in medicine, with reduced phototoxicity and photobleaching. In this paper, we clarify the principles of texture analysis, including statistical, transform, structural and model-based methods, and give examples of its applications, reviewing studies of the technique. Moreover, we apply texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary patterns (LBP) and the wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified into normal or abnormal ones. Compared with other texture analysis methods with respect to receiver operating characteristic analysis, LBP combined with the wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for the clinical diagnosis of scar types. Finally, future developments of texture analysis of SHG images are discussed.
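
    The LBP half of the feature extraction described above can be sketched with scikit-image: each image is summarized by a normalized histogram of uniform local binary patterns, which can then feed any classifier. The wavelet features and the actual classifier are omitted here, and the textures below are synthetic stand-ins for SHG micrographs.

    ```python
    # Uniform LBP histogram per image as a texture feature vector.
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(image, radius=1, n_points=8):
        lbp = local_binary_pattern(image, n_points, radius, method="uniform")
        n_bins = n_points + 2                     # uniform codes + one "non-uniform" bin
        hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
        return hist

    rng = np.random.default_rng(0)
    speckle = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)     # disordered texture
    fibres = ((np.sin(np.linspace(0, 20 * np.pi, 128)) * 0.5 + 0.5) * 255).astype(np.uint8)
    fibres = np.tile(fibres, (128, 1))                                  # oriented, fibre-like texture

    print(np.round(lbp_histogram(speckle), 3))
    print(np.round(lbp_histogram(fibres), 3))     # visibly different pattern distribution
    ```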

  5. The characters of Palaeozoic jawed vertebrates

    PubMed Central

    Brazeau, Martin D; Friedman, Matt

    2014-01-01

    Newly discovered fossils from the Silurian and Devonian periods are beginning to challenge embedded perceptions about the origin and early diversification of jawed vertebrates (gnathostomes). Nevertheless, an explicit cladistic framework for the relationships of these fossils relative to the principal crown lineages of the jawed vertebrates (osteichthyans: bony fishes and tetrapods; chondrichthyans: sharks, batoids, and chimaeras) remains elusive. We critically review the systematics and character distributions of early gnathostomes and provide a clearly stated hierarchy of synapomorphies covering the jaw-bearing stem gnathostomes and osteichthyan and chondrichthyan stem groups. We show that character lists, designed to support the monophyly of putative groups, tend to overstate their strength and lack cladistic corroboration. By contrast, synapomorphic hierarchies are more open to refutation and must explicitly confront conflicting evidence. Our proposed synapomorphy scheme is used to evaluate the status of the problematic fossil groups Acanthodii and Placodermi, and suggest profitable avenues for future research. We interpret placoderms as a paraphyletic array of stem-group gnathostomes, and suggest what we regard as two equally plausible placements of acanthodians: exclusively on the chondrichthyan stem, or distributed on both the chondrichthyan and osteichthyan stems. PMID:25750460

  6. Wheeze sound analysis using computer-based techniques: a systematic review.

    PubMed

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer, and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification on the degree of airway obstruction with normal breathing, and 3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.

  7. Control volume based hydrocephalus research; analysis of human data

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
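
    For readers unfamiliar with the framework, the integral mass and momentum conservation statements referred to above have the standard control-volume form below (a textbook form assumed here, without viscous surface terms; it is not reproduced from the paper).

    ```latex
    % Control volume V with bounding surface S and outward normal n;
    % rho = density, u = velocity, p = pressure, f = body force per unit volume.
    \frac{d}{dt}\int_{V}\rho\,dV + \oint_{S}\rho\,\mathbf{u}\cdot\mathbf{n}\,dS = 0
    \qquad
    \frac{d}{dt}\int_{V}\rho\,\mathbf{u}\,dV + \oint_{S}\rho\,\mathbf{u}\,(\mathbf{u}\cdot\mathbf{n})\,dS
      = -\oint_{S}p\,\mathbf{n}\,dS + \int_{V}\mathbf{f}\,dV
    ```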

  8. Language-Based Curriculum Analysis: A Collaborative Assessment and Intervention Process.

    ERIC Educational Resources Information Center

    Prelock, Patricia A.

    1997-01-01

    Presents a systematic process for completing a language-based curriculum analysis to address curriculum expectations that may challenge students with communication impairments. Analysis of vocabulary and the demands for comprehension, oral, and written expression within specific content areas provides a framework for collaboration between teachers…

  9. Ontology-based specification, identification and analysis of perioperative risks.

    PubMed

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system, which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and the identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  10. Geographic Object-Based Image Analysis - Towards a new paradigm.

    PubMed

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and of the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  11. Moon-Based INSAR Geolocation and Baseline Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Ding, Yixing; Ruan, Zhixing; Lv, Mingyang; Dou, Changyong; Chen, Zhaoning

    2016-07-01

    An Earth observation platform is a host whose characteristics to a large extent determine the capability for Earth observation. Currently most platforms under development are satellites; by contrast, carrying out systematic observations from a Moon-based Earth observation platform is still a new concept. The Moon is Earth's only natural satellite and the only one humans have reached, and observing the Earth with sensors from the Moon would give people a different perspective. Moon-based InSAR (SAR interferometry), an important Earth observation technology, has all-day, all-weather observation capability, but its unique characteristics still need to be analyzed. This article discusses key issues of geometric positioning and baseline parameters of Moon-based InSAR. Based on ephemeris data, the position, libration and attitude of the Earth and the Moon are obtained, and the position of the Moon-based SAR sensor can be derived by coordinate transformation from the fixed selenocentric coordinate system to the terrestrial coordinate system; together with the range-Doppler equation, the positioning model is then analyzed. After establishing the Moon-based InSAR baseline equation, the different baseline errors are analyzed and the influence of the Moon-based InSAR baseline on Earth observation applications is obtained.

  12. Cost Analysis of an Office-based Surgical Suite

    PubMed Central

    LaBove, Gabrielle

    2016-01-01

    Introduction: Operating costs are a significant part of delivering surgical care. Having a system to analyze these costs is imperative for decision making and efficiency. We present an analysis of surgical supply, labor and administrative costs, and remuneration of procedures as a means for a practice to analyze its cost effectiveness; this affects the quality of care based on the ability to provide services. The costs of surgical care cannot be estimated blindly, as reconstructive and cosmetic procedures carry different percentages of overhead. Methods: A detailed financial analysis of office-based surgical suite costs for surgical procedures was determined based on company contract prices and average use of supplies. The average time spent on scheduling, prepping, and doing the surgery was factored using employee rates. Results: The most expensive minor-procedure supplies are suture needles. The 4 most common procedures, from the most expensive to the least, are abdominoplasty, breast augmentation, facelift, and lipectomy. Conclusions: Reconstructive procedures require a greater portion of collections to cover costs. Without adjustment of both patient and insurance remuneration in the practice, the ability to provide quality care will become increasingly difficult. PMID:27536482

  13. An Analysis of Base Pressure at Supersonic Velocities and Comparison with Experiment

    NASA Technical Reports Server (NTRS)

    Chapman, Dean R

    1951-01-01

    In the first part of the investigation an analysis is made of base pressure in an inviscid fluid, both for two-dimensional and axially symmetric flow. It is shown that for two-dimensional flow, and also for the flow over a body of revolution with a cylindrical sting attached to the base, there are an infinite number of possible solutions satisfying all necessary boundary conditions at any given free-stream Mach number. For the particular case of a body having no sting attached only one solution is possible in an inviscid flow, but it corresponds to zero base drag. Accordingly, it is concluded that a strictly inviscid-fluid theory cannot be satisfactory for practical applications. An approximate semi-empirical analysis for base pressure in a viscous fluid is developed in a second part of the investigation. The semi-empirical analysis is based partly on inviscid-flow calculations.

  14. Particle Pollution Estimation Based on Image Analysis

    PubMed Central

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction. PMID:26828757

  15. Particle Pollution Estimation Based on Image Analysis.

    PubMed

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction.

  16. Sub-pattern based multi-manifold discriminant analysis for face recognition

    NASA Astrophysics Data System (ADS)

    Dai, Jiangyan; Guo, Changlu; Zhou, Wei; Shi, Yanjiao; Cong, Lin; Yi, Yugen

    2018-04-01

    In this paper, we present a Sub-pattern based Multi-manifold Discriminant Analysis (SpMMDA) algorithm for face recognition. Unlike the existing Multi-manifold Discriminant Analysis (MMDA) approach, which is based on holistic information of the face image for recognition, SpMMDA operates on sub-images partitioned from the original face image and then extracts the discriminative local features from the sub-images separately. Moreover, the structure information of different sub-images from the same face image is considered in the proposed method with the aim of further improving the recognition performance. Extensive experiments on three standard face databases (Extended YaleB, CMU PIE and AR) demonstrate that the proposed method is effective and outperforms some other sub-pattern based face recognition methods.

  17. Likelihood-Based Random-Effect Meta-Analysis of Binary Events.

    PubMed

    Amatya, Anup; Bhaumik, Dulal K; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D

    2015-01-01

    Meta-analysis has been used extensively for evaluation of efficacy and safety of medical interventions. Its advantages and utilities are well known. However, recent studies have raised questions about the accuracy of the commonly used moment-based meta-analytic methods in general and for rare binary outcomes in particular. The issue is further complicated for studies with heterogeneous effect sizes. Likelihood-based mixed-effects modeling provides an alternative to moment-based methods such as inverse-variance weighted fixed- and random-effects estimators. In this article, we compare and contrast different mixed-effect modeling strategies in the context of meta-analysis. Their performance in estimation and testing of overall effect and heterogeneity are evaluated when combining results from studies with a binary outcome. Models that allow heterogeneity in both baseline rate and treatment effect across studies have low type I and type II error rates, and their estimates are the least biased among the models considered.
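
    For contrast with the likelihood-based models discussed above, the moment-based comparator (inverse-variance weighting with a DerSimonian-Laird heterogeneity estimate) fits in a few lines. The 2x2 counts below are invented, and the 0.5 continuity correction for zero cells is exactly the kind of ad hoc fix that motivates the rare-event concerns described in the abstract.

    ```python
    # Moment-based random-effects meta-analysis (DerSimonian-Laird) on log odds
    # ratios, with a 0.5 continuity correction for zero cells.  Data invented.
    import numpy as np

    events_t = np.array([2, 0, 5, 1]); n_t = np.array([100, 120, 150, 80])
    events_c = np.array([6, 3, 9, 4]); n_c = np.array([100, 115, 148, 82])

    a, b = events_t + 0.5, n_t - events_t + 0.5
    c, d = events_c + 0.5, n_c - events_c + 0.5
    log_or = np.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d

    w = 1 / var                                             # fixed-effect weights
    q = np.sum(w * (log_or - np.sum(w * log_or) / w.sum()) ** 2)
    tau2 = max(0.0, (q - (len(w) - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))

    w_re = 1 / (var + tau2)                                 # random-effects weights
    pooled = np.sum(w_re * log_or) / w_re.sum()
    se = np.sqrt(1 / w_re.sum())
    print(f"pooled log OR = {pooled:.3f} +/- {1.96 * se:.3f}, tau^2 = {tau2:.3f}")
    ```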

  18. Evolutionary diversification of secondary mechanoreceptor cells in tunicata.

    PubMed

    Rigon, Francesca; Stach, Thomas; Caicci, Federico; Gasparini, Fabio; Burighel, Paolo; Manni, Lucia

    2013-06-04

    Hair cells are vertebrate secondary sensory cells located in the ear and in the lateral line organ. Until recently, these cells were considered to be mechanoreceptors exclusively found in vertebrates that evolved within this group. Evidence of secondary mechanoreceptors in some tunicates, the proposed sister group of vertebrates, has recently led to the hypothesis that vertebrate and tunicate secondary sensory cells share a common origin. Secondary sensory cells were described in detail in two tunicate groups, ascidians and thaliaceans, in which they constitute an oral sensory structure called the coronal organ. Among thaliaceans, the organ is absent in salps and it has been hypothesised that this condition is due to a different feeding system adopted by this group of animals. No information is available as to whether a comparable structure exists in the third group of tunicates, the appendicularians, although different sensory structures are known to be present in these animals. We studied the detailed morphology of appendicularian oral mechanoreceptors. Using light and electron microscopy we could demonstrate that the mechanosensory organ called the circumoral ring is composed of secondary sensory cells. We described the ultrastructure of the circumoral organ in two appendicularian species, Oikopleura dioica and Oikopleura albicans, and thus taxonomically completed the data collection of tunicate secondary sensory cells. To understand the evolution of secondary sensory cells in tunicates, we performed a cladistic analysis using morphological data. We constructed a matrix consisting of 19 characters derived from detailed ultrastructural studies in 16 tunicate species and used a cephalochordate and three vertebrate species as outgroups. Our study clearly shows that the circumoral ring is the appendicularian homologue of the coronal organ of other tunicate taxa. The cladistic analysis enabled us to reconstruct the features of the putative ancestral hair cell in

  19. Real-time image annotation by manifold-based biased Fisher discriminant analysis

    NASA Astrophysics Data System (ADS)

    Ji, Rongrong; Yao, Hongxun; Wang, Jicheng; Sun, Xiaoshuai; Liu, Xianming

    2008-01-01

    Automatic Linguistic Annotation is a promising solution to bridge the semantic gap in content-based image retrieval. However, two crucial issues are not well addressed in state-of-the-art annotation algorithms: 1. the Small Sample Size (3S) problem in keyword classifier/model learning; 2. most annotation algorithms cannot extend to real-time online usage due to their low computational efficiency. This paper presents a novel Manifold-based Biased Fisher Discriminant Analysis (MBFDA) algorithm to address these two issues by transductive semantic learning and keyword filtering. To address the 3S problem, Co-Training based manifold learning is adopted for keyword model construction. To achieve real-time annotation, a Biased Fisher Discriminant Analysis (BFDA) based semantic feature reduction algorithm is presented for keyword confidence discrimination and semantic feature reduction. Different from all existing annotation methods, MBFDA views image annotation from a novel Eigen semantic feature (which corresponds to keywords) selection aspect. As demonstrated in experiments, our manifold-based biased Fisher discriminant analysis annotation algorithm outperforms classical and state-of-the-art annotation methods (1. K-NN Expansion; 2. One-to-All SVM; 3. PWC-SVM) in both computational time and annotation accuracy by a large margin.

  20. Resting-State Seed-Based Analysis: An Alternative to Task-Based Language fMRI and Its Laterality Index.

    PubMed

    Smitha, K A; Arun, K M; Rajesh, P G; Thomas, B; Kesavadas, C

    2017-06-01

    Language is a cardinal function that makes humans unique. Preservation of language function poses a great challenge for surgeons during resection. The aim of the study was to assess the efficacy of resting-state fMRI in the lateralization of language function in healthy subjects, to permit its further testing in patients who are unable to perform task-based fMRI. Eighteen healthy right-handed volunteers were prospectively evaluated with resting-state fMRI and task-based fMRI to assess language networks. The laterality indices of the Broca and Wernicke areas were calculated by using task-based fMRI via a voxel-value approach. We adopted seed-based resting-state fMRI connectivity analysis together with parameters such as the amplitude of low-frequency fluctuation and the fractional amplitude of low-frequency fluctuation (fALFF). Resting-state fMRI connectivity maps for language networks were obtained from the Broca and Wernicke areas in both hemispheres. We performed correlation analysis between the laterality index and the z scores of functional connectivity, amplitude of low-frequency fluctuation, and fALFF. Pearson correlation analysis between the z score of fALFF and the laterality index yielded a correlation coefficient of 0.849 (P < .05). Regression analysis of the fALFF with the laterality index yielded an R^2 value of 0.721, indicating that 72.1% of the variance in the laterality index of task-based fMRI could be predicted from the fALFF of resting-state fMRI. The present study demonstrates that fALFF can be used as an alternative to task-based fMRI for assessing language laterality. There was a strong positive correlation between the fALFF of the Broca area on resting-state fMRI and the laterality index of task-based fMRI. Furthermore, we demonstrated the efficacy of fALFF for predicting the laterality of task-based fMRI. © 2017 by American Journal of Neuroradiology.
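
    The two quantities correlated in the study, a voxel-count laterality index from task fMRI and a resting-state fALFF measure, reduce to short formulas; the per-subject numbers below are invented purely to show the computation.

    ```python
    # Laterality index LI = (L - R) / (L + R) from left/right activation counts,
    # and its Pearson correlation / linear fit against a resting-state measure.
    import numpy as np

    left_voxels = np.array([420, 510, 380, 600, 450])    # task fMRI, left language area
    right_voxels = np.array([120, 90, 160, 80, 140])     # homologous right-hemisphere area
    li_task = (left_voxels - right_voxels) / (left_voxels + right_voxels)

    falff_z = np.array([1.8, 2.2, 1.4, 2.6, 1.7])        # resting-state fALFF z-scores

    r = np.corrcoef(falff_z, li_task)[0, 1]
    slope, intercept = np.polyfit(falff_z, li_task, 1)   # regression of LI on fALFF
    print(f"r = {r:.2f}, R^2 = {r * r:.2f}, LI = {slope:.2f} * fALFF + {intercept:.2f}")
    ```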

  1. Student Engagement: A Principle-Based Concept Analysis.

    PubMed

    Bernard, Jean S

    2015-08-04

    A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. It lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline.

  2. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline based on a set of sampled basis vectors obtained from PCA applied over a previously composed continuous-spectra learning matrix. The parametric method, however, uses an ANN to filter out the baseline. Previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out by using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratio (SBR), signal-to-noise ratio (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
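
    The non-parametric idea can be sketched as follows: learn a low-dimensional PCA basis from baseline-only training spectra, then estimate the baseline of a new spectrum from its projection onto that basis and subtract it. The synthetic quadratic baselines and the single Gaussian peak below are stand-ins for the benchmark spectra, not the paper's data or tuning.

    ```python
    # PCA-based baseline removal: fit PCA to baseline-only spectra, reconstruct
    # the baseline of a new spectrum from its projection, then subtract it.
    import numpy as np
    from sklearn.decomposition import PCA

    x = np.linspace(0, 1, 500)
    rng = np.random.default_rng(0)

    # Learning matrix: smooth continuous baselines with varying slope/curvature.
    baselines = np.array([a + b * x + c * x ** 2
                          for a, b, c in rng.uniform(0, 2, size=(100, 3))])
    pca = PCA(n_components=3).fit(baselines)

    peak = np.exp(-0.5 * ((x - 0.4) / 0.01) ** 2)          # narrow emission line
    spectrum = (0.5 + 1.2 * x + 0.8 * x ** 2) + peak       # signal + baseline

    baseline_hat = pca.inverse_transform(pca.transform(spectrum[None, :]))[0]
    corrected = spectrum - baseline_hat
    print(round(float(corrected[x < 0.3].std()), 4))       # residual away from the peak is small
    ```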

  3. Inquiry-Based Approach to a Carbohydrate Analysis Experiment

    NASA Astrophysics Data System (ADS)

    Senkbeil, Edward G.

    1999-01-01

    The analysis of an unknown carbohydrate in an inquiry-based learning format has proven to be a valuable and interesting undergraduate biochemistry laboratory experiment. Students are given a list of carbohydrates and a list of references for carbohydrate analysis. The references contain a variety of well-characterized wet chemistry and instrumental techniques for carbohydrate identification, but the students must develop an appropriate sequential protocol for unknown identification. The students are required to provide a list of chemicals and procedures and a flow chart for identification before the lab. During the 3-hour laboratory period, they utilize their accumulated information and knowledge to classify and identify their unknown. Advantages of the inquiry-based format are (i) students must be well prepared in advance to be successful in the laboratory, (ii) students feel a sense of accomplishment in both designing and carrying out a successful experiment, and (iii) the carbohydrate background information digested by the students significantly decreases the amount of lecture time required for this topic.

  4. Estimating Driving Performance Based on EEG Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Wu, Ruei-Cheng; Jung, Tzyy-Ping; Liang, Sheng-Fu; Huang, Teng-Yi

    2005-12-01

    The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by the driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
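
    The processing chain described above (log subband power spectrum, then PCA, then a linear regression against a driving-performance target) can be sketched end-to-end on simulated signals; the epoch model, band limits, and the lane-deviation relationship below are all invented for illustration.

    ```python
    # Log band power per EEG epoch -> PCA -> linear regression onto lane deviation.
    import numpy as np
    from scipy.signal import welch
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    fs, n_epochs = 250, 120
    rng = np.random.default_rng(0)
    t = np.arange(fs * 2) / fs                                   # 2-second epochs

    drowsiness = np.clip(np.cumsum(rng.normal(0, 0.1, n_epochs)), 0, None)
    lane_deviation = 0.2 + 0.5 * drowsiness + rng.normal(0, 0.05, n_epochs)

    def epoch(level):
        # alpha (10 Hz) and theta (6 Hz) power grow with drowsiness, plus noise
        return (level * np.sin(2 * np.pi * 10 * t) + 0.5 * level * np.sin(2 * np.pi * 6 * t)
                + rng.normal(0, 1, t.size))

    features = []
    for level in drowsiness:
        f, pxx = welch(epoch(level), fs=fs, nperseg=fs)
        features.append(np.log(pxx[(f >= 1) & (f <= 30)]))       # log subband power, 1-30 Hz
    X = PCA(n_components=5).fit_transform(np.array(features))

    model = LinearRegression().fit(X, lane_deviation)
    print("R^2 on training data:", round(model.score(X, lane_deviation), 2))
    ```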

  5. NASA Lunar Base Wireless System Propagation Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Upanavage, Matthew; Sham, Catherine C.

    2007-01-01

    There have been many radio wave propagation studies using both experimental and theoretical techniques in recent years. However, most of these studies have been in support of commercial cellular phone wireless applications, with signal frequencies mostly at the commercial cellular and Personal Communications Service (PCS) bands. The antenna configurations are mostly one on a high tower and one near the ground to simulate communications between a cellular base station and a mobile unit. There is great interest in wireless communication and sensor systems for NASA lunar missions because of the emerging importance of establishing permanent lunar human exploration bases. Because of the specific lunar terrain geometries and RF frequencies of interest to the NASA missions, much of the published literature for the commercial cellular and PCS bands of 900 and 1800 MHz may not be directly applicable to the lunar base wireless system and environment. Various communication and sensor configurations are required to support all elements of a lunar base, for example communications between astronauts, between astronauts and lunar vehicles, and between lunar vehicles and satellites in lunar orbit, as well as wireless sensor systems linking scientific and experimental sensors with data-collection ground stations. This presentation illustrates the propagation analysis of the lunar wireless communication and sensor systems, taking into account three-dimensional terrain multipath effects. It is observed that the propagation characteristics are significantly affected by the presence of the lunar terrain. The results indicate that the lunar surface material, terrain geometry, and antenna location are the important factors affecting the propagation characteristics of lunar wireless systems. The path loss can be much more severe than free-space propagation and is greatly affected by the antenna height, surface material, and operating frequency.

  6. CASKS (Computer Analysis of Storage Casks): A microcomputer based analysis system for storage cask review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.F.; Mok, G.C.; Carlson, R.W.

    1996-12-01

    CASKS is a microcomputer-based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules: the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background on the impact analysis module and the yielding surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage cask drop tests performed by SNL and by BNFL at Winfrith serves as the validation of the CASKS impact analysis module.

  7. Casks (computer analysis of storage casks): A microcomputer based analysis system for storage cask review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.F.; Mok, G.C.; Carlson, R.W.

    1995-08-01

    CASKS is a microcomputer-based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules: the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background on the impact analysis module and the yielding surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage cask drop tests performed by SNL and by BNFL at Winfrith serves as the validation of the CASKS impact analysis module.

  8. Perinatal Bereavement: A Principle-based Concept Analysis

    PubMed Central

    FENSTERMACHER, Kimberly; HUPCEY, Judith E.

    2013-01-01

    Aim: This paper is a report of an analysis of the concept of perinatal bereavement. Background: The concept of perinatal bereavement emerged in the scientific literature during the 1970s. Perinatal bereavement is a practice-based concept, although it is not well defined in the scientific literature and is often intermingled with the concepts of mourning and grief. Design: Concept analysis. Data Sources: Using the term ‘perinatal bereavement’ and limits of only English and human, PubMed and CINAHL were searched to yield 278 available references dating from 1974-2011. Articles specific to the experience of perinatal bereavement were reviewed; the final data set was 143 articles. Review Methods: The methods of principle-based concept analysis were used, revealing conceptual components (antecedents, attributes, and outcomes) which are delineated to create a theoretical definition of perinatal bereavement. Results: The concept is epistemologically immature, with few explicit definitions to describe the phenomenon. Inconsistency in conceptual meaning threatens the construct validity of measurement tools for perinatal bereavement and contributes to incongruent theoretical definitions. This has implications for both nursing science (how the concept is studied and theoretically integrated) and clinical practice (timing and delivery of support interventions). Conclusions: Perinatal bereavement is a multifaceted global phenomenon that follows perinatal loss. Lack of conceptual clarity and lack of a clearly articulated conceptual definition impede the synthesis and translation of research findings into practice. A theoretical definition of perinatal bereavement is offered as a platform for researchers to advance the concept through research and theory development. PMID:23458030

  9. Prison-Based Educational Programs: A Content Analysis of Government Documents

    ERIC Educational Resources Information Center

    Piotrowski, Chris; Lathrop, Peter J.

    2012-01-01

    The literature provides limited, constructive, consensus-based information to correctional officials and administrators on the efficacy of prison-based programs. This study reports an analysis of 8 government review documents, which surveyed the research literature from 1980 to 2008, on the topic of educational rehabilitation programs available to…

  10. Methodology for cost analysis of film-based and filmless portable chest systems

    NASA Astrophysics Data System (ADS)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.

  11. Molecular phylogeny of the hominoid primates as indicated by two-dimensional protein electrophoresis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldman, D.; Giri, P.R.; O'Brien, J.O.

    1987-05-01

    A molecular phylogeny for the hominoid primates was constructed by using genetic distances from a survey of 383 radiolabeled fibroblast polypeptides resolved by two-dimensional electrophoresis (2DE). An internally consistent matrix of Nei genetic distances was generated on the basis of variants in electrophoretic position. The derived phylogenetic tree indicated a branching sequence, from oldest to most recent, of cercopithecoids (Macaca fascicularis), gibbon-siamang, orangutan, gorilla, and human-chimpanzee. A cladistic analysis of 240 electrophoretic characters that varied between ape species produced an identical tree. Genetic distance measures obtained by 2DE are largely consistent with those generated by other molecular procedures. In addition, the 2DE data set appears to resolve the human-chimpanzee-gorilla trichotomy in favor of a more recent association of chimpanzees and humans.

  12. A new crustacean from the Herefordshire (Silurian) Lagerstätte, UK, and its significance in malacostracan evolution

    PubMed Central

    Briggs, Derek E. G.; Siveter, Derek J.; Sutton, Mark D.; Legg, David

    2017-01-01

    Cascolus ravitis gen. et sp. nov. is a three-dimensionally preserved fossil crustacean with soft parts from the Herefordshire (Silurian) Lagerstätte, UK. It is characterized by a head with a head shield and five limb pairs, and a thorax (pereon) with nine appendage-bearing segments followed by an apodous abdomen (pleon). All the appendages except the first are biramous and have a gnathobase. The post-mandibular appendages are similar one to another, and bear petal-shaped epipods that probably functioned as a part of the respiratory–circulatory system. Cladistic analysis resolves the new taxon as a stem-group leptostracan (Malacostraca). This well-preserved arthropod provides novel insights into the evolution of appendage morphology, tagmosis and the possible respiratory–circulatory physiology of a basal malacostracan. PMID:28330926

  13. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals.

    PubMed

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A

    2016-03-05

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision tree (DT), support vector machine (SVM), and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
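
    The abstract does not spell out how channels are mapped to quaternion components or which features are extracted. One plausible illustration, assuming four EEG channels are packed into the (w, x, y, z) components of a quaternion signal and summarized by simple rotation-style statistics, is sketched below; this is a hypothetical feature set, not the published QSA algorithm.

        import numpy as np

        def eeg_to_quaternions(window):
            """Pack a 4-channel EEG window of shape (4, n_samples) into
            quaternions q[t] = (w, x, y, z) = (ch0[t], ch1[t], ch2[t], ch3[t])."""
            return window.T.copy()          # (n_samples, 4)

        def qsa_like_features(window):
            """Per-window summaries of the quaternion signal: mean norm,
            mean rotation angle, and mean unit rotation axis (5 values)."""
            q = eeg_to_quaternions(window)
            norm = np.linalg.norm(q, axis=1)
            unit = q / norm[:, None]
            angle = 2.0 * np.arccos(np.clip(unit[:, 0], -1.0, 1.0))
            axis = unit[:, 1:] / np.maximum(np.sin(angle / 2.0)[:, None], 1e-12)
            return np.hstack([norm.mean(), angle.mean(), axis.mean(axis=0)])

        window = np.random.default_rng(1).standard_normal((4, 256))  # toy epoch
        features = qsa_like_features(window)   # would feed a DT/SVM/KNN classifier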

  14. Quaternion-Based Signal Analysis for Motor Imagery Classification from Electroencephalographic Signals

    PubMed Central

    Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.

    2016-01-01

    Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision tree (DT), support vector machine (SVM), and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029

  15. An Analysis of the Formal Features of "Reality-Based" Television Programs.

    ERIC Educational Resources Information Center

    Neapolitan, D. M.

    Reality-based television programs showcase actual footage or recreate actual events, and include programs such as "America's Most Wanted" and "Rescue 911." To identify the features that typify reality-based television programs, this study conducted an analysis of formal features used in reality-based programs. Formal features…

  16. Simulation-based training for nurses: Systematic review and meta-analysis.

    PubMed

    Hegland, Pål A; Aarlie, Hege; Strømme, Hilde; Jamtvedt, Gro

    2017-07-01

    Simulation-based training is a widespread strategy to improve health-care quality. However, its effect on registered nurses has previously not been established in systematic reviews. The aim of this systematic review is to evaluate effect of simulation-based training on nurses' skills and knowledge. We searched CDSR, DARE, HTA, CENTRAL, CINAHL, MEDLINE, Embase, ERIC, and SveMed+ for randomised controlled trials (RCT) evaluating effect of simulation-based training among nurses. Searches were completed in December 2016. Two reviewers independently screened abstracts and full-text, extracted data, and assessed risk of bias. We compared simulation-based training to other learning strategies, high-fidelity simulation to other simulation strategies, and different organisation of simulation training. Data were analysed through meta-analysis and narrative syntheses. GRADE was used to assess the quality of evidence. Fifteen RCTs met the inclusion criteria. For the comparison of simulation-based training to other learning strategies on nurses' skills, six studies in the meta-analysis showed a significant, but small effect in favour of simulation (SMD -1.09, CI -1.72 to -0.47). There was large heterogeneity (I² = 85%). For the other comparisons, there was large between-study variation in results. The quality of evidence for all comparisons was graded as low. The effect of simulation-based training varies substantially between studies. Our meta-analysis showed a significant effect of simulation training compared to other learning strategies, but the quality of evidence was low indicating uncertainty. Other comparisons showed inconsistency in results. Based on our findings simulation training appears to be an effective strategy to improve nurses' skills, but further good-quality RCTs with adequate sample sizes are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Adequation between environmental shifts and human speciations during the Plio-Pleistocene [Adéquation entre changements environnementaux et spéciations humaines au Plio-Pléistocène]

    NASA Astrophysics Data System (ADS)

    Zeitoun, Valery

    2000-01-01

    Several synthetic studies provide evidence of faunistic and floristic shifts as consequences of global climatic events around 2.5 and 1.8 Ma. Some authors are more confident in the role played by local tectonic events to explain such changes in terms of vicariance. It is possible that astronomic, tectonic, and climatic variations are linked, as the lithosphere acts as a catalyst for the biosphere, more or less strongly depending on the period. The purpose of this paper is to confront independent environmental and anatomical data to try to understand the evolutionary process in hominids, with particular emphasis on species of the genus Homo in East Africa during the Plio-Pleistocene period. The result of a cladistic analysis based on 35 OTUs and 468 features of the calvaria shows that at least four species are needed to describe the Homo habilis grade living in East Africa. This radiation is congruent with the contemporaneous environmental shifts around 2.5 Ma, as is also the case for bovids and cercopithecids. Some Homo erectus left Africa when another climatic pulse occurred around 1.8 Ma.

  18. A new species of Euprox (Cervidae, Artiodactyla) from the upper Miocene of the Linxia Basin, Gansu Province, China, with interpretation of its paleoenvironment.

    PubMed

    Hou, Sukuan

    2015-01-16

    The Linxia Basin, Gansu Province, China, is known for its abundant and well preserved fossils. Here a new species, Euprox grandis sp. nov., is established based on a skull and antlers collected from the upper Miocene Liushu Formation of the Linxia Basin. The new species is distinguishable from other Euprox species by its large body size, notably long pedicle, and weak burr. The main beam and the brow tine are slightly curved both medially and backwards, and the apex of the main beam curves slightly laterally. The upper cheek teeth are brachydont, with a clear central fold on the premolars and an internal postprotocrista and metaconule fold on M1-M2. The cingulum is almost absent, only occasionally weakly developed at the anterior and lingual surfaces of the teeth. Cladistic analysis was carried out using the TNT software, and two most parsimonious trees were retained. As the strict consensus tree shows, E. grandis appears to be an advanced muntiacine form, which may have a close relationship with the genus Muntiacus. The presence of E. grandis in the Linxia Basin adds new evidence to support a warm and humid environment during the late Miocene in the basin.

  19. Antimicrobial peptides from the skins of North American frogs.

    PubMed

    Conlon, J Michael; Kolodziejek, Jolanta; Nowotny, Norbert

    2009-08-01

    North America is home to anuran species belonging to the families Bufonidae, Eleutherodactylidae, Hylidae, Leiopelmatidae, Ranidae, and Scaphiopodidae but antimicrobial peptides have been identified only in skin secretions and/or skin extracts of frogs belonging to the Leiopelmatidae ("tailed frogs") and Ranidae ("true frogs"). Eight structurally-related cationic alpha-helical peptides with broad-spectrum antibacterial activity, termed ascaphins, have been isolated from specimens of Ascaphus truei (Leiopelmatidae) occupying a coastal range. Characterization of orthologous antimicrobial peptides from Ascaphus specimens occupying an inland range supports the proposal that this population should be regarded as a separate species A. montanus. Ascaphin-8 shows potential for development into a therapeutically valuable anti-infective agent. Peptides belonging to the brevinin-1, esculentin-1, esculentin-2, palustrin-1, palustrin-2, ranacyclin, ranatuerin-1, ranatuerin-2, and temporin families have been isolated from North American ranids. It is proposed that "ranalexins" represent brevinin-1 peptides that have undergone a four amino acid residue internal deletion. Current taxonomic recommendations divide North American frogs from the family Ranidae into two genera: Lithobates and Rana. Cladistic analysis based upon the amino acid sequences of the brevinin-1 peptides provides strong support for this assignment.

  20. Applications of pathology-assisted image analysis of immunohistochemistry-based biomarkers in oncology.

    PubMed

    Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D

    2014-01-01

    Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.

  1. Evaluating Web-Based Nursing Education's Effects: A Systematic Review and Meta-Analysis.

    PubMed

    Kang, Jiwon; Seomun, GyeongAe

    2017-09-01

    This systematic review and meta-analysis investigated whether using web-based nursing educational programs increases a participant's knowledge and clinical performance. We performed a meta-analysis of studies published between January 2000 and July 2016 and identified through RISS, CINAHL, ProQuest Central, Embase, the Cochrane Library, and PubMed. Eleven studies were eligible for inclusion in this analysis. The results of the meta-analysis demonstrated significant differences not only for the overall effect but also specifically for blended programs and short (2 weeks or 4 weeks) intervention periods. To present more evidence supporting the effectiveness of web-based nursing educational programs, further research is warranted.

  2. Sequence information gain based motif analysis.

    PubMed

    Maynou, Joan; Pairó, Erola; Marco, Santiago; Perera, Alexandre

    2015-11-09

    The detection of regulatory regions in candidate sequences is essential for the understanding of the regulation of a particular gene and the mechanisms involved. This paper proposes a novel methodology based on information-theoretic metrics for finding regulatory sequences in promoter regions. This methodology (SIGMA) has been tested on genomic sequence data for Homo sapiens and Mus musculus. SIGMA has been compared with different publicly available alternatives for motif detection, such as MEME/MAST, Biostrings (Bioconductor package), and MotifRegressor, and with previous work such as Q-residuals projections and information-theoretic detectors. Comparative results, in the form of receiver operating characteristic curves, show that in 70% of the studied transcription factor binding sites the SIGMA detector performs better and more robustly than the compared methods, while having a similar computational time. The performance of SIGMA can be explained by its parametric simplicity in modelling the non-linear co-variability in the binding motif positions. Sequence Information Gain based Motif Analysis is a generalisation of a non-linear model of cis-regulatory sequence detection based on information theory. This generalisation allows transcription factor binding sites to be detected with maximum performance regardless of the covariability observed in the positions of the training set of sequences. SIGMA is freely available to the public at http://b2slab.upc.edu.
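
    For readers unfamiliar with information-theoretic motif detection, the basic ingredients can be illustrated with a generic sketch: build a position probability matrix from known binding sites, measure its information content, and scan a candidate promoter with a log-likelihood-ratio score. This is a textbook-style illustration under simple assumptions (uniform background, toy sites), not SIGMA's algorithm.

        import numpy as np

        BASES = "ACGT"

        def pwm_from_sites(sites, pseudocount=0.5):
            """Position probability matrix from aligned binding-site sequences."""
            L = len(sites[0])
            counts = np.full((L, 4), pseudocount)
            for s in sites:
                for i, b in enumerate(s):
                    counts[i, BASES.index(b)] += 1
            return counts / counts.sum(axis=1, keepdims=True)

        def information_content(pwm, background=0.25):
            """Per-position Kullback-Leibler information (bits) of the motif."""
            return (pwm * np.log2(pwm / background)).sum(axis=1)

        def score_sequence(seq, pwm, background=0.25):
            """Sliding log2 likelihood-ratio score; the maximum marks the most
            motif-like window in the candidate sequence."""
            L = pwm.shape[0]
            return np.array([
                sum(np.log2(pwm[j, BASES.index(b)] / background)
                    for j, b in enumerate(seq[i:i + L]))
                for i in range(len(seq) - L + 1)
            ])

        sites = ["TATAAA", "TATAAT", "TATATA", "TACAAA"]    # toy training sites
        pwm = pwm_from_sites(sites)
        print(information_content(pwm).sum())               # total motif information
        print(score_sequence("GGGTATAATGCG", pwm).max())    # best hit in a candidate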

  3. Developing group investigation-based book on numerical analysis to increase critical thinking student’s ability

    NASA Astrophysics Data System (ADS)

    Maharani, S.; Suprapto, E.

    2018-03-01

    Critical thinking is very important in mathematics; it helps students gain a deeper understanding of mathematical concepts. Critical thinking is also needed in numerical analysis, yet existing numerical analysis textbooks do not incorporate it. This research aims to develop a group-investigation-based book on numerical analysis to increase students' critical thinking ability, and to determine whether the book is valid, practical, and effective. The research method is Research and Development (R&D), with 30 students of the Department of Mathematics Education at Universitas PGRI Madiun as subjects. The development model used is the 4-D model, modified to 3-D (up to the development stage). The data are descriptive and qualitative, collected with validation sheets, tests, and questionnaires. The development results indicate that the group-investigation-based book on numerical analysis is in the valid category, with a score of 84.25%. Student responses to the book were very positive, so the book is categorized as practical (86.00%). Use of the book met the classical learning completeness criterion at 84.32%. Based on these results, it is concluded that the group-investigation-based book on numerical analysis is feasible because it meets the criteria of being valid, practical, and effective, so the book can be used by mathematics academicians. Future research could develop group-investigation-based books for other subjects.

  4. A Rational Analysis of Rule-Based Concept Learning

    ERIC Educational Resources Information Center

    Goodman, Noah D.; Tenenbaum, Joshua B.; Feldman, Jacob; Griffiths, Thomas L.

    2008-01-01

    This article proposes a new model of human concept learning that provides a rational analysis of learning feature-based concepts. This model is built upon Bayesian inference for a grammatically structured hypothesis space--a concept language of logical rules. This article compares the model predictions to human generalization judgments in several…

  5. An interactive web-based application for Comprehensive Analysis of RNAi-screen Data.

    PubMed

    Dutta, Bhaskar; Azhir, Alaleh; Merino, Louis-Henri; Guo, Yongjian; Revanur, Swetha; Madhamshettiwar, Piyush B; Germain, Ronald N; Smith, Jennifer A; Simpson, Kaylene J; Martin, Scott E; Buehler, Eugen; Fraser, Iain D C

    2016-02-23

    RNAi screens are widely used in functional genomics. Although the screen data can be susceptible to a number of experimental biases, many of these can be corrected by computational analysis. For this purpose, here we have developed a web-based platform for integrated analysis and visualization of RNAi screen data named CARD (for Comprehensive Analysis of RNAi Data; available at https://card.niaid.nih.gov). CARD allows the user to seamlessly carry out sequential steps in a rigorous data analysis workflow, including normalization, off-target analysis, integration of gene expression data, optimal thresholds for hit selection and network/pathway analysis. To evaluate the utility of CARD, we describe analysis of three genome-scale siRNA screens and demonstrate: (i) a significant increase both in selection of subsequently validated hits and in rejection of false positives, (ii) an increased overlap of hits from independent screens of the same biology and (iii) insight to microRNA (miRNA) activity based on siRNA seed enrichment.
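
    The abstract lists the workflow steps (normalization, off-target analysis, hit selection) without detail. A common first step for RNAi screen readouts, shown here as a generic sketch rather than CARD's implementation, is a robust plate-wise z-score followed by a simple threshold for hit selection; the data and cutoff below are hypothetical.

        import numpy as np

        def robust_zscores(plate_values):
            """Robust per-plate z-scores (median/MAD), a common normalization
            step for RNAi screen readouts before hit selection."""
            v = np.asarray(plate_values, float)
            med = np.median(v)
            mad = np.median(np.abs(v - med)) * 1.4826   # consistency factor
            return (v - med) / mad

        rng = np.random.default_rng(6)
        readout = rng.normal(100, 10, size=384)          # one hypothetical 384-well plate
        readout[[10, 42, 99]] = [40, 35, 180]            # a few strong perturbations
        z = robust_zscores(readout)
        hits = np.flatnonzero(np.abs(z) >= 3)            # simple |z| >= 3 hit threshold
        print(hits)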

  6. An interactive web-based application for Comprehensive Analysis of RNAi-screen Data

    PubMed Central

    Dutta, Bhaskar; Azhir, Alaleh; Merino, Louis-Henri; Guo, Yongjian; Revanur, Swetha; Madhamshettiwar, Piyush B.; Germain, Ronald N.; Smith, Jennifer A.; Simpson, Kaylene J.; Martin, Scott E.; Beuhler, Eugen; Fraser, Iain D. C.

    2016-01-01

    RNAi screens are widely used in functional genomics. Although the screen data can be susceptible to a number of experimental biases, many of these can be corrected by computational analysis. For this purpose, here we have developed a web-based platform for integrated analysis and visualization of RNAi screen data named CARD (for Comprehensive Analysis of RNAi Data; available at https://card.niaid.nih.gov). CARD allows the user to seamlessly carry out sequential steps in a rigorous data analysis workflow, including normalization, off-target analysis, integration of gene expression data, optimal thresholds for hit selection and network/pathway analysis. To evaluate the utility of CARD, we describe analysis of three genome-scale siRNA screens and demonstrate: (i) a significant increase both in selection of subsequently validated hits and in rejection of false positives, (ii) an increased overlap of hits from independent screens of the same biology and (iii) insight to microRNA (miRNA) activity based on siRNA seed enrichment. PMID:26902267

  7. Reduction method with system analysis for multiobjective optimization-based design

    NASA Technical Reports Server (NTRS)

    Azarm, S.; Sobieszczanski-Sobieski, J.

    1993-01-01

    An approach for reducing the number of variables and constraints, which is combined with System Analysis Equations (SAE), for multiobjective optimization-based design is presented. In order to develop a simplified analysis model, the SAE is computed outside an optimization loop and then approximated for use by an operator. Two examples are presented to demonstrate the approach.

  8. Recurrence quantity analysis based on matrix eigenvalues

    NASA Astrophysics Data System (ADS)

    Yang, Pengbo; Shang, Pengjian

    2018-06-01

    Recurrence plots are a powerful tool for the visualization and analysis of dynamical systems. Recurrence quantification analysis (RQA), based on point density and the diagonal and vertical line structures in the recurrence plot, provides alternative measures to quantify the complexity of dynamical systems. In this paper, we present a new measure based on the recurrence matrix to quantify the dynamical properties of a given system. Matrix eigenvalues can reflect the basic characteristics of complex systems, so we characterize the system by exploring the eigenvalues of the recurrence matrix. Considering that Shannon entropy has been defined as a complexity measure, we propose the entropy of matrix eigenvalues (EOME) as a new RQA measure. We confirm that EOME can be used as a metric to quantify changes in the behavior of a system. As a given dynamical system changes from a non-chaotic to a chaotic regime, the EOME increases as well; larger EOME values imply higher complexity and lower predictability. We also study the effect of several factors on EOME, including data length, recurrence threshold, embedding dimension, and additive noise. Finally, we demonstrate an application in physiology. The advantage of this measure lies in its high sensitivity and simple computation.
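
    Following the description above, the measure can be sketched in a few lines: build a binary recurrence matrix from an embedded time series, take its eigenvalues, and compute a Shannon entropy over them. The embedding parameters, threshold choice, and eigenvalue normalization below are assumptions for illustration; the paper's exact definitions may differ.

        import numpy as np

        def recurrence_matrix(x, m=3, tau=1, eps=None):
            """Binary recurrence matrix of a series embedded in dimension m with
            delay tau; eps defaults to 10% of the embedded-signal std."""
            n = len(x) - (m - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
            dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            if eps is None:
                eps = 0.1 * emb.std()
            return (dists <= eps).astype(float)

        def eome(x, **kwargs):
            """Entropy of normalized recurrence-matrix eigenvalues (a sketch of
            the EOME idea described in the abstract)."""
            R = recurrence_matrix(x, **kwargs)
            eigvals = np.abs(np.linalg.eigvalsh(R))
            p = eigvals / eigvals.sum()
            p = p[p > 0]
            return -(p * np.log(p)).sum()

        t = np.linspace(0, 20 * np.pi, 600)
        print(eome(np.sin(t)))                                      # regular signal
        print(eome(np.random.default_rng(2).standard_normal(600)))  # noisy signal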

  9. [Development of laboratory sequence analysis software based on WWW and UNIX].

    PubMed

    Huang, Y; Gu, J R

    2001-01-01

    Sequence analysis tools based on WWW and UNIX were developed in our laboratory to meet the needs of molecular genetics research in our laboratory. General principles of computer analysis of DNA and protein sequences were also briefly discussed in this paper.

  10. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  11. Automatic Online Lecture Highlighting Based on Multimedia Analysis

    ERIC Educational Resources Information Center

    Che, Xiaoyin; Yang, Haojin; Meinel, Christoph

    2018-01-01

    Textbook highlighting is widely considered to be beneficial for students. In this paper, we propose a comprehensive solution to highlight the online lecture videos in both sentence- and segment-level, just as is done with paper books. The solution is based on automatic analysis of multimedia lecture materials, such as speeches, transcripts, and…

  12. Geopolitical E-Analysis Based on E-Learning Content

    ERIC Educational Resources Information Center

    Dinicu, Anca; Oancea, Romana

    2017-01-01

    In a world of great complexity, understanding the manner states act and react becomes more and more an intriguing quest due to the multiple relations of dependence and interdependence that characterize "the global puzzle". Within this context, an analysis based on a geopolitical approach becomes a very useful means used to determine not…

  13. Providing web-based tools for time series access and analysis

    NASA Astrophysics Data System (ADS)

    Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane

    2014-05-01

    Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed, and converted into the correct data format before time series analysis tools can be executed, and the data must be prepared for use in different existing software packages. Several packages, like TIMESAT (Jönnson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations, are provided as open-source software and can be executed from the command line, which is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data are then processed and provided as a time-series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) are visualized in the web portal and can be downloaded for further usage. As a first use case, we built a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible so that users can focus on the interpretation of the results. References: Jönnson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30

  14. Growth habit of the late Paleozoic rhizomorphic tree-lycopsid family Diaphorodendraceae: phylogenetic, evolutionary, and paleoecological significance.

    PubMed

    Dimichele, William A; Elrick, Scott D; Bateman, Richard M

    2013-08-01

    Rhizomorphic lycopsids evolved the tree habit independently of all other land plants. Newly discovered specimens allow radical revision of our understanding of the growth architectures of the extinct Paleozoic sister-genera Synchysidendron and Diaphorodendron. Detailed descriptions of six remarkable adpression specimens from the Pennsylvanian of the USA and three casts from the late Mississippian of Scotland are used to revise and reanalyze a previously published morphological cladistic matrix and to reinterpret their remarkable growth forms. Contrary to previous assertions, Synchysidendron resembled Diaphorodendron in having a distinct and relatively complex growth habit that emphasized serially homologous, closely spaced, deciduous lateral branches at the expense of reduced monocarpic crown branches. Lateral branches originated through several strongly anisotomous dichotomies before producing during extended periods large numbers of Achlamydocarpon strobili. The comparatively large diameter of abscission scars remaining on the main trunk and the emergence of branches above the horizontal plane suggest that the lateral branch systems were robust. Lateral branches were borne in two opposite rows on the main trunk and continued upward into an isotomously branched, determinate crown; their striking distichous arrangement caused preferred orientation of fallen trunks on bedding planes. This discovery identifies the plagiotropic growth habit, dominated by serial lateral branches, as ubiquitous in the Diaphorodendraceae and also as unequivocally primitive within Isoetales s.l., a conclusion supported by both the revised morphological cladistic analysis and relative first appearances of taxa in the fossil record. Previously assumed complete homology between crown branching in Lepidodendraceae and that of all earlier-divergent genera requires reassessment. Saltational phenotypic transitions via modification of key developmental switches remains the most credible

  15. A graph-based network-vulnerability analysis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-05-03

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs, or costs representing level of effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
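
    The path-ranking idea in the abstract can be made concrete: turn each arc's success probability p into an additive cost -log(p), so that the shortest path in cost is the attack path with the highest overall probability of success. The attack stages, graph structure, and probabilities below are invented for illustration only.

        import heapq, math

        # Hypothetical superset attack graph: node -> [(next_node, success_probability)]
        attack_graph = {
            "outside":           [("user_on_web_host", 0.6), ("user_on_mail_host", 0.3)],
            "user_on_web_host":  [("root_on_web_host", 0.4)],
            "user_on_mail_host": [("root_on_web_host", 0.2)],
            "root_on_web_host":  [("root_on_db_server", 0.5)],
            "root_on_db_server": [],
        }

        def most_probable_attack_path(graph, source, target):
            """Dijkstra on -log(probability) edge weights: the shortest path is
            the attack path with the highest overall probability of success."""
            pq = [(0.0, source, [source])]
            best = {}
            while pq:
                cost, node, path = heapq.heappop(pq)
                if node == target:
                    return math.exp(-cost), path   # overall probability, node sequence
                if best.get(node, float("inf")) <= cost:
                    continue
                best[node] = cost
                for nxt, p in graph[node]:
                    heapq.heappush(pq, (cost - math.log(p), nxt, path + [nxt]))
            return 0.0, []

        prob, path = most_probable_attack_path(attack_graph, "outside", "root_on_db_server")
        print(f"p = {prob:.3f} via {' -> '.join(path)}")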

  16. A graph-based network-vulnerability analysis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-01-01

    This report presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs, or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  17. A graph-based system for network-vulnerability analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, L.P.; Phillips, C.

    1998-06-01

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs, or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  18. Case Problems for Problem-Based Pedagogical Approaches: A Comparative Analysis

    ERIC Educational Resources Information Center

    Dabbagh, Nada; Dass, Susan

    2013-01-01

    A comparative analysis of 51 case problems used in five problem-based pedagogical models was conducted to examine whether there are differences in their characteristics and the implications of such differences on the selection and generation of ill-structured case problems. The five pedagogical models were: situated learning, goal-based scenario,…

  19. Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.; Vicek, Brian L.

    2007-01-01

    Probabilistic failure analysis is essential when analysis of stress-life (S-N) curves is inconclusive in determining the relative ranking of two or more materials. In 1964, L. Johnson published a methodology for establishing the confidence that two populations of data are different. Simplified algebraic equations for confidence numbers were derived based on the original work of L. Johnson. Using the ratios of mean life, the resultant values of confidence numbers deviated less than one percent from those of Johnson. It is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers. These equations were applied to rotating-beam fatigue tests that were conducted on three aluminum alloys, AL 2024, AL 6061, and AL 7075, at three stress levels each. The results were analyzed and compared using ASTM Standard E739-91 and the Johnson-Weibull analysis. The ASTM method did not statistically distinguish between AL 6061 and AL 7075. Based on Johnson-Weibull analysis confidence numbers greater than 99 percent, AL 2024 was found to have the longest fatigue life, followed by AL 7075 and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median or L(sub 50) lives.

  20. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    The GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  1. Temperature Based Stress Analysis of Notched Members

    DTIC Science & Technology

    1979-03-01

    …measuring temperatures in deforming metals based on the use of thermistor flakes. The system was used to show that more heating occurs near stress… Thermocouples were welded to the specimen surface. This particular attachment method is quite suitable for stress analysis for the following reasons…

  2. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

    Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, Synthetic Minority Over-sampling technique was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors in affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
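
    The abstract names the three ingredients found to work best: PCA for feature redundancy, SMOTE for class imbalance, and a Random Forest for prediction. A minimal sketch of how these pieces fit together is shown below with entirely hypothetical data and hyperparameters; it is not the authors' pipeline, and in practice SMOTE should be applied inside each cross-validation fold rather than beforehand to avoid leakage.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from imblearn.over_sampling import SMOTE        # imbalanced-learn package

        rng = np.random.default_rng(3)
        X = rng.standard_normal((112, 400))             # hypothetical radiomic features
        y = (rng.random(112) < 0.2).astype(int)         # imbalanced endpoint (e.g. recurrence)

        # 1. Feature-redundancy reduction with PCA.
        X_pc = PCA(n_components=10).fit_transform(X)

        # 2. Rebalance classes by SMOTE oversampling of the minority class
        #    (done globally here only to keep the sketch short).
        X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_pc, y)

        # 3. Random Forest prognosis model, evaluated by cross-validation.
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print(cross_val_score(clf, X_bal, y_bal, cv=5, scoring="roc_auc").mean())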

  3. iTemplate: A template-based eye movement data analysis approach.

    PubMed

    Xiao, Naiqi G; Lee, Kang

    2018-02-08

    Current eye movement data analysis methods rely on defining areas of interest (AOIs). Due to the fact that AOIs are created and modified manually, variances in their size, shape, and location are unavoidable. These variances affect not only the consistency of the AOI definitions, but also the validity of the eye movement analyses based on the AOIs. To reduce the variances in AOI creation and modification and achieve a procedure to process eye movement data with high precision and efficiency, we propose a template-based eye movement data analysis method. Using a linear transformation algorithm, this method registers the eye movement data from each individual stimulus to a template. Thus, users only need to create one set of AOIs for the template in order to analyze eye movement data, rather than creating a unique set of AOIs for all individual stimuli. This change greatly reduces the error caused by the variance from manually created AOIs and boosts the efficiency of the data analysis. Furthermore, this method can help researchers prepare eye movement data for some advanced analysis approaches, such as iMap. We have developed software (iTemplate) with a graphic user interface to make this analysis method available to researchers.
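
    The registration step described above (a linear transformation mapping each stimulus into template space so that a single set of AOIs can be reused) can be sketched as a least-squares affine fit between corresponding landmark points. The landmarks, fixations, and helper names below are hypothetical, not iTemplate's API.

        import numpy as np

        def fit_affine(src_pts, dst_pts):
            """Least-squares 2-D affine transform mapping landmarks on an
            individual stimulus (src) to the matching template landmarks (dst)."""
            src = np.asarray(src_pts, float)
            dst = np.asarray(dst_pts, float)
            A = np.hstack([src, np.ones((len(src), 1))])     # rows [x, y, 1]
            coef, *_ = np.linalg.lstsq(A, dst, rcond=None)   # (3, 2) parameters
            return coef

        def apply_affine(coef, points):
            pts = np.asarray(points, float)
            return np.hstack([pts, np.ones((len(pts), 1))]) @ coef

        # Hypothetical landmarks (e.g. eye corners, nose tip, mouth centre) and fixations.
        stimulus_landmarks = [(120, 80), (200, 82), (160, 140), (160, 190)]
        template_landmarks = [(100, 100), (180, 100), (140, 160), (140, 210)]
        fixations = [(130, 95), (170, 150)]

        coef = fit_affine(stimulus_landmarks, template_landmarks)
        print(apply_affine(coef, fixations))   # fixations mapped into template space

    Once fixations are in template coordinates, AOIs defined once on the template apply to every stimulus, which is the source of the consistency gain described in the abstract.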

  4. Analysis And Augmentation Of Timing Advance Based Geolocation In Lte Cellular Networks

    DTIC Science & Technology

    2016-12-01

    Naval Postgraduate School, Monterey, California. Dissertation: Analysis and Augmentation of Timing Advance-Based Geolocation in LTE Cellular Networks, by John D. Roth.

  5. Volumetric quantification of bone-implant contact using micro-computed tomography analysis based on region-based segmentation.

    PubMed

    Kang, Sung-Won; Lee, Woo-Jin; Choi, Soon-Chul; Lee, Sam-Sun; Heo, Min-Suk; Huh, Kyung-Hoe; Kim, Tae-Il; Yi, Won-Jin

    2015-03-01

    We have developed a new method of segmenting the areas of absorbable implants and bone using region-based segmentation of micro-computed tomography (micro-CT) images, which allowed us to quantify volumetric bone-implant contact (VBIC) and volumetric absorption (VA). The simple threshold technique generally used in micro-CT analysis cannot be used to segment the areas of absorbable implants and bone. Instead, a region-based segmentation method, a region-labeling method, and subsequent morphological operations were successively applied to the micro-CT images. The three-dimensional VBIC and VA of the absorbable implant were then calculated over the entire volume of the implant. Two-dimensional (2D) bone-implant contact (BIC) and bone area (BA) were also measured based on the conventional histomorphometric method. VA and VBIC increased significantly as the healing period increased (p<0.05). VBIC values were significantly correlated with VA values (p<0.05) and with 2D BIC values (p<0.05). It is possible to quantify VBIC and VA for absorbable implants using micro-CT analysis with a region-based segmentation method.
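
    Assuming the segmentation has already produced binary implant and bone masks, the two volumetric quantities can be illustrated as follows: VBIC as the fraction of implant-surface voxels adjacent to bone, and VA as the fraction of the original implant volume that has been resorbed. The neighbourhood definition, formulas, and toy masks are assumptions for illustration, not the published definitions.

        import numpy as np
        from scipy import ndimage

        def vbic_and_va(implant_mask, bone_mask, original_implant_volume_mm3, voxel_mm3):
            """Volumetric bone-implant contact (VBIC) and volumetric absorption (VA)
            from already-segmented binary micro-CT masks."""
            # Surface voxels: implant voxels with at least one non-implant neighbour.
            surface = implant_mask & ~ndimage.binary_erosion(implant_mask)
            # Contact: surface voxels whose 1-voxel neighbourhood touches bone.
            contact = surface & ndimage.binary_dilation(bone_mask)
            vbic = contact.sum() / surface.sum()
            remaining_volume = implant_mask.sum() * voxel_mm3
            va = 1.0 - remaining_volume / original_implant_volume_mm3
            return vbic, va

        # Toy 3-D masks (illustration only).
        implant = np.zeros((20, 20, 20), bool); implant[5:15, 5:15, 5:15] = True
        bone = np.zeros_like(implant); bone[:, :, 15:] = True
        print(vbic_and_va(implant, bone, original_implant_volume_mm3=1200.0, voxel_mm3=1.0))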

  6. AR(p)-based detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Alvarez-Ramirez, J.; Rodriguez, E.

    2018-07-01

    Autoregressive models are commonly used for modeling time series from nature, economics, and finance. This work explored simple autoregressive AR(p) models for removing long-term trends in detrended fluctuation analysis (DFA). Crude oil prices and the bitcoin exchange rate were considered, with the former corresponding to a mature market and the latter to an emergent market. Results showed that AR(p)-based DFA performs similarly to traditional DFA. However, the former provides information on the stability of long-term trends, which is valuable for understanding and quantifying the dynamics of complex time series from financial systems.
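
    One plausible reading of the method, sketched below, is a standard DFA in which each window of the integrated profile is detrended by a least-squares AR(p) fit rather than a polynomial. The order, window scales, and AR fitting details are assumptions for illustration, not necessarily the authors' formulation.

        import numpy as np

        def ar_residual(segment, p=2):
            """Residual of a least-squares AR(p) fit to one profile segment."""
            y = np.asarray(segment, float)
            X = np.column_stack([np.ones(len(y) - p)] +
                                [y[p - k - 1:len(y) - k - 1] for k in range(p)])
            target = y[p:]
            coef, *_ = np.linalg.lstsq(X, target, rcond=None)
            return target - X @ coef

        def ar_dfa(x, scales, p=2):
            """DFA in which windows of the integrated profile are detrended by an
            AR(p) fit instead of the usual polynomial."""
            profile = np.cumsum(x - np.mean(x))
            F = []
            for s in scales:
                n_win = len(profile) // s
                res = [ar_residual(profile[i * s:(i + 1) * s], p) for i in range(n_win)]
                F.append(np.sqrt(np.mean(np.concatenate(res) ** 2)))
            # Scaling exponent alpha from the log-log slope of F(s) versus s.
            return np.polyfit(np.log(scales), np.log(F), 1)[0]

        rng = np.random.default_rng(4)
        print(ar_dfa(rng.standard_normal(4096), scales=[16, 32, 64, 128, 256]))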

  7. Uniformity testing: assessment of a centralized web-based uniformity analysis system.

    PubMed

    Klempa, Meaghan C

    2011-06-01

    Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
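
    For context, NEMA integral uniformity is defined as (max − min) / (max + min) × 100 over the smoothed counts in the chosen field of view (the central FOV being a smaller centred region than the useful FOV). The sketch below shows that core calculation with the standard 9-point smoothing kernel; the FOV fractions and flood image are hypothetical, and the full NEMA procedure (pixel rebinning, masking of low-count edge pixels) is omitted.

        import numpy as np
        from scipy.ndimage import convolve

        NEMA_KERNEL = np.array([[1, 2, 1],
                                [2, 4, 2],
                                [1, 2, 1]], float) / 16.0   # 9-point smoothing filter

        def integral_uniformity(flood, fov_fraction=0.95):
            """Integral uniformity (%) over a centred square region of a flood
            image, after 9-point smoothing (simplified NEMA-style calculation)."""
            smoothed = convolve(flood.astype(float), NEMA_KERNEL, mode="nearest")
            n = smoothed.shape[0]
            margin = int(round(n * (1 - fov_fraction) / 2))
            roi = smoothed[margin:n - margin, margin:n - margin]
            return 100.0 * (roi.max() - roi.min()) / (roi.max() + roi.min())

        flood = np.random.default_rng(5).poisson(1000, size=(64, 64))  # hypothetical flood
        print("UFOV IU: %.2f%%" % integral_uniformity(flood, fov_fraction=0.95))
        print("CFOV IU: %.2f%%" % integral_uniformity(flood, fov_fraction=0.75))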

  8. Cloud-based Jupyter Notebooks for Water Data Analysis

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative

  9. Paper-Plastic Hybrid Microfluidic Device for Smartphone-Based Colorimetric Analysis of Urine.

    PubMed

    Jalal, Uddin M; Jin, Gyeong Jun; Shim, Joon S

    2017-12-19

    In this work, a disposable paper-plastic hybrid microfluidic lab-on-a-chip (LOC) was developed and successfully applied to the colorimetric measurement of urine on a smartphone-based optical platform using a "UrineAnalysis" Android app. The device was implemented cost-effectively as a stand-alone hybrid LOC by incorporating a conventional paper-based reagent test strip inside the plastic LOC microchannel. Integrated with a finger-actuated micropump, the LOC device quantitatively analyzed a small volume (40 μL) of urine for the colorimetric reactions of glucose, protein, pH, and red blood cells (RBC). In our experiments, the conventional urine strip showed large deviations as the reaction time increased, because dipping the strip sensor into a bottle of urine does not control the reaction volume. By integrating the strip sensor into the LOC device, our approach significantly reduces the time-dependent variability of the conventional dipstick-based urine strip, and the smartphone app used for image analysis improves on visual assessment of the test strip, a major user concern for colorimetric analysis in point-of-care (POC) applications. As a result, the user-friendly LOC, implemented in a disposable format together with the smartphone-based optical platform, may be applicable as an effective tool for rapid and qualitative POC urinalysis.

  10. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.
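
    Although the presentation does not prescribe specific error metrics, two commonly used ways of comparing software predictions with measured data are the normalized mean bias error and the coefficient of variation of the RMSE. The sketch below computes both for hypothetical monthly energy data; it is only an illustration of the error/accuracy concepts, not NREL's implementation.

      import numpy as np

      def nmbe(measured, predicted):
          """Normalized mean bias error (%) -- signed over/under-prediction."""
          m, p = np.asarray(measured, float), np.asarray(predicted, float)
          return 100.0 * (p - m).sum() / m.sum()

      def cv_rmse(measured, predicted):
          """Coefficient of variation of the RMSE (%) -- scatter of the errors."""
          m, p = np.asarray(measured, float), np.asarray(predicted, float)
          rmse = np.sqrt(np.mean((p - m) ** 2))
          return 100.0 * rmse / m.mean()

      # Hypothetical monthly energy use (kWh): measured vs. software prediction
      measured  = [950, 870, 780, 620, 510, 480, 530, 540, 600, 700, 820, 930]
      predicted = [900, 850, 800, 640, 530, 470, 510, 560, 590, 720, 800, 910]
      print("NMBE = %.1f%%, CV(RMSE) = %.1f%%" % (nmbe(measured, predicted),
                                                  cv_rmse(measured, predicted)))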

  11. Principle-based concept analysis: Caring in nursing education

    PubMed Central

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Moonaghi, Hossein Karimi; Mazloom, Seyed Reza

    2016-01-01

    Introduction The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. Methods A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. Results The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as “caring pedagogy,” “value-based education,” and “teaching excellence,” caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Conclusion Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development

  12. Principle-based concept analysis: Caring in nursing education.

    PubMed

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Karimi Moonaghi, Hossein; Mazloom, Seyed Reza

    2016-03-01

    The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as "caring pedagogy," "value-based education," and "teaching excellence," caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development.

  13. Identifying novel glioma associated pathways based on systems biology level meta-analysis.

    PubMed

    Hu, Yangfan; Li, Jinquan; Yan, Wenying; Chen, Jiajia; Li, Yin; Hu, Guang; Shen, Bairong

    2013-01-01

    Recent advances in microarray technologies, including genomics, proteomics, and metabolomics, pose a great challenge for integrating these "-omics" data to analyze complex diseases. Glioma is an extremely aggressive and lethal form of brain tumor, so studying the molecular mechanisms underlying glioma remains very important. To date, most studies focus on detecting differentially expressed genes in glioma, and meta-analysis for pathway analysis based on multiple microarray datasets has not been systematically pursued. In this study, we therefore developed a systems biology based approach that integrates three types of omics data to identify common pathways in glioma. First, a meta-analysis was performed to study the overlap of signatures at different levels based on microarray gene expression data of glioma. Among these gene expression datasets, 12 pathways in the GeneGO database were found to be shared by four stages. Then, microRNA expression profiles and ChIP-seq data were integrated for further pathway enrichment analysis. As a result, we suggest that 5 of these pathways could serve as putative pathways in glioma; among them, the pathway of TGF-beta-dependent induction of EMT via SMAD is of particular importance. Our results demonstrate that meta-analysis at the systems biology level provides a useful approach to study the molecular mechanisms of complex disease. The integration of different types of omics data, including gene expression microarrays, microRNA, and ChIP-seq data, suggests common pathways correlated with glioma. These findings offer useful candidates for targeted therapeutic intervention in glioma.

  14. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noonan, Nicholas James

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  15. A brain-region-based meta-analysis method utilizing the Apriori algorithm.

    PubMed

    Niu, Zhendong; Nie, Yaoxin; Zhou, Qian; Zhu, Linlin; Wei, Jieyao

    2016-05-18

    Brain network connectivity modeling is a crucial method for studying the brain's cognitive functions. Meta-analyses can unearth reliable results from individual studies. Meta-analytic connectivity modeling is a connectivity analysis method based on regions of interest (ROIs) which showed that meta-analyses could be used to discover brain network connectivity. In this paper, we propose a new meta-analysis method that can be used to find network connectivity models based on the Apriori algorithm, which has the potential to derive brain network connectivity models from activation information in the literature, without requiring ROIs. This method first extracts activation information from experimental studies that use cognitive tasks of the same category, and then maps the activation information to corresponding brain areas by using the automated anatomical labeling (AAL) atlas, after which the activation rate of these brain areas is calculated. Finally, using these brain areas, a potential brain network connectivity model is calculated based on the Apriori algorithm. The present study used this method to conduct a mining analysis on the citations in a language review article by Price (Neuroimage 62(2):816-847, 2012). The results showed that the obtained network connectivity model was consistent with that reported by Price. The proposed method is helpful to find brain network connectivity by mining the co-activation relationships among brain regions. Furthermore, results of the co-activation relationship analysis can be used as a priori knowledge for the corresponding dynamic causal modeling analysis, possibly achieving a significant dimension-reducing effect, thus increasing the efficiency of the dynamic causal modeling analysis.
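
    The core of the approach is mining frequently co-activated brain regions. The sketch below illustrates the support computation on hypothetical per-study activation sets, using brute-force enumeration rather than the full Apriori candidate pruning; the region names and the support threshold are made up for the example.

      from itertools import combinations

      # Hypothetical "activation tables": each study lists the regions it reports
      studies = [
          {"IFG_L", "STG_L", "MTG_L"},
          {"IFG_L", "STG_L", "SMA"},
          {"IFG_L", "MTG_L", "STG_L"},
          {"STG_L", "MTG_L"},
      ]

      def frequent_itemsets(transactions, min_support, max_size=2):
          """Minimal Apriori-style search for co-activated region sets."""
          n = len(transactions)
          items = sorted(set().union(*transactions))
          frequent = {}
          for size in range(1, max_size + 1):
              for combo in combinations(items, size):
                  support = sum(set(combo) <= t for t in transactions) / n
                  if support >= min_support:
                      frequent[combo] = support
          return frequent

      # Region pairs with support >= 0.5 suggest candidate connectivity edges
      for regions, support in frequent_itemsets(studies, 0.5).items():
          print(regions, round(support, 2))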

  16. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
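
    A minimal flux balance analysis session with COBRApy might look like the sketch below, assuming a current COBRApy release and an SBML model file named e_coli_core.xml on disk; the gene identifier used for the knockout is illustrative.

      import cobra

      # Load a genome-scale model from SBML (file name is illustrative)
      model = cobra.io.read_sbml_model("e_coli_core.xml")

      # Flux balance analysis: maximize the model's default biomass objective
      solution = model.optimize()
      print("Growth rate:", solution.objective_value)

      # Inspect a few fluxes and try a simple gene knockout
      print(solution.fluxes.head())
      with model:                      # changes are reverted on exiting the block
          model.genes.get_by_id("b0008").knock_out()   # hypothetical gene ID
          print("Knockout growth:", model.optimize().objective_value)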

  17. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    PubMed Central

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna; Fedele, Andrea; Scipioni, Antonio

    2013-01-01

    Biodiesel, a promising alternative energy resource, has become a focus of attention in chemical engineering, but there is ongoing debate about its sustainability. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, several crop-based biodiesel options, including soybean-, rapeseed-, sunflower-, jatropha-, and palm-based biodiesel production, are studied by emergy analysis; the soybean-based scenario is recognized as the most sustainable and should be chosen for further study in China. The DEA method is used to evaluate the sustainability efficiencies of these options: the biodiesel production systems based on soybean, sunflower, and palm are found to be DEA efficient, whereas the rapeseed-based and jatropha-based scenarios need to be improved, and the corresponding improvement measures are specified. PMID:23766723

  18. Atlas-Based Ventricular Shape Analysis for Understanding Congenital Heart Disease.

    PubMed

    Farrar, Genevieve; Suinesiaputra, Avan; Gilbert, Kathleen; Perry, James C; Hegde, Sanjeet; Marsden, Alison; Young, Alistair A; Omens, Jeffrey H; McCulloch, Andrew D

    2016-12-01

    Congenital heart disease is associated with abnormal ventricular shape that can affect wall mechanics and may be predictive of long-term adverse outcomes. Atlas-based parametric shape analysis was used to analyze ventricular geometries of eight adolescent or adult single-ventricle CHD patients with tricuspid atresia and Fontans. These patients were compared with an "atlas" of non-congenital asymptomatic volunteers, resulting in a set of z-scores which quantify deviations from the control population distribution on a patient-by-patient basis. We examined the potential of these scores to: (1) quantify abnormalities of ventricular geometry in single ventricle physiologies relative to the normal population; (2) comprehensively quantify wall motion in CHD patients; and (3) identify possible relationships between ventricular shape and wall motion that may reflect underlying functional defects or remodeling in CHD patients. CHD ventricular geometries at end-diastole and end-systole were individually compared with statistical shape properties of an asymptomatic population from the Cardiac Atlas Project. Shape analysis-derived model properties, and myocardial wall motions between end-diastole and end-systole, were compared with physician observations of clinical functional parameters. Relationships between altered shape and altered function were evaluated via correlations between atlas-based shape and wall motion scores. Atlas-based shape analysis identified a diverse set of specific quantifiable abnormalities in ventricular geometry or myocardial wall motion in all subjects. Moreover, this initial cohort displayed significant relationships between specific shape abnormalities such as increased ventricular sphericity and functional defects in myocardial deformation, such as decreased long-axis wall motion. These findings suggest that atlas-based ventricular shape analysis may be a useful new tool in the management of patients with CHD who are at risk of impaired ventricular
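
    The z-score idea described here can be sketched as follows: project a patient's ventricular shape onto the atlas shape modes and standardize each mode score against the control distribution. The control scores, patient scores, and 2-SD cut-off below are hypothetical and only illustrate the computation, not the Cardiac Atlas Project pipeline.

      import numpy as np

      def shape_z_scores(atlas_scores, patient_scores):
          """Z-scores of a patient's shape-mode scores against an atlas population.

          atlas_scores   : (n_subjects, n_modes) shape-mode scores of controls
          patient_scores : (n_modes,) scores of one patient in the same basis
          """
          mu = atlas_scores.mean(axis=0)
          sd = atlas_scores.std(axis=0, ddof=1)
          return (patient_scores - mu) / sd

      # Hypothetical example: 100 controls, 5 shape modes (size, sphericity, ...)
      rng = np.random.default_rng(0)
      controls = rng.normal(size=(100, 5))
      patient = np.array([0.2, 2.8, -0.4, 1.1, -2.5])   # illustrative mode scores

      z = shape_z_scores(controls, patient)
      print("Modes deviating by more than 2 SD:", np.where(np.abs(z) > 2)[0])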

  19. Analysis of biofluids in aqueous environment based on mid-infrared spectroscopy.

    PubMed

    Fabian, Heinz; Lasch, Peter; Naumann, Dieter

    2005-01-01

    In this study we describe a semiautomatic Fourier transform infrared spectroscopic methodology for the analysis of liquid serum samples, which combines simple sample introduction with high sample throughput. The applicability of this new infrared technology to the analysis of liquid serum samples from a cohort of cattle naturally infected with bovine spongiform encephalopathy and from controls was explored in comparison to the conventional approach based on transmission infrared spectroscopy of dried serum films. Artificial neural network analysis of the infrared data was performed to differentiate between bovine spongiform encephalopathy-negative controls and animals in the late stage of the disease. After training of artificial neural network classifiers, infrared spectra of sera from an independent external validation data set were analyzed. In this way, sensitivities between 90 and 96% and specificities between 84 and 92% were achieved, respectively, depending upon the strategy of data collection and data analysis. Based on these results, the advantages and limitations of the liquid sample technique and the dried film approach for routine analysis of biofluids are discussed. 2005 Society of Photo-Optical Instrumentation Engineers.

  20. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    PubMed

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
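
    The program itself is Excel-based; as a language-neutral illustration of the two fundamental steps it performs, the Python sketch below converts a background-corrected fluorescence trace into an estimated [Ca2+] using the standard single-wavelength pseudo-ratio formula and measures a rate of change by linear regression over a chosen window. The indicator constants, sampling rate, and window indices are hypothetical, and this is not the author's implementation.

      import numpy as np

      def f_to_ca(f, f_min, f_max, kd=400.0):
          """Single-wavelength indicator conversion to [Ca2+] in nM."""
          return kd * (f - f_min) / (f_max - f)

      def rate_of_change(t, ca, i0, i1):
          """Slope (nM/s) of [Ca2+] over the window [i0, i1) via linear regression."""
          slope, _ = np.polyfit(t[i0:i1], ca[i0:i1], 1)
          return slope

      # Hypothetical recorded trace: baseline, fast rise at 0.5 s, exponential decay
      t = np.arange(0, 2, 0.002)                      # 500 Hz sampling, seconds
      f = 1.0 + 0.8 * (t > 0.5) * np.exp(-(t - 0.5) / 0.3)
      ca = f_to_ca(f, f_min=0.9, f_max=3.0, kd=400.0)

      print("Peak [Ca2+] ~ %.0f nM" % ca.max())
      print("Decay rate  ~ %.0f nM/s" % rate_of_change(t, ca, 260, 400))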

  1. Visual saliency detection based on in-depth analysis of sparse representation

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Shen, Siqiu; Ning, Chen

    2018-03-01

    Visual saliency detection has been receiving great attention in recent years since it can facilitate a wide range of applications in computer vision. A variety of saliency models have been proposed based on different assumptions, within which saliency detection via sparse representation is one of the newly arisen approaches. However, most existing sparse representation-based saliency detection methods utilize only partial characteristics of sparse representation and lack in-depth analysis, so they may have limited detection performance. Motivated by this, this paper proposes an algorithm for detecting visual saliency based on in-depth analysis of sparse representation. A number of discriminative dictionaries are first learned from randomly sampled image patches by means of inner product-based dictionary atom classification. Then, the input image is partitioned into many image patches, and these patches are classified into salient and nonsalient ones based on in-depth analysis of their sparse coding coefficients. Afterward, sparse reconstruction errors are calculated for the salient and nonsalient patch sets. By investigating the sparse reconstruction errors, the most salient atoms, which tend to come from the most salient region, are screened out and removed from the discriminative dictionaries. Finally, an effective method is exploited for saliency map generation with the reduced dictionaries. Comprehensive evaluations on publicly available datasets and comparisons with some state-of-the-art approaches demonstrate the effectiveness of the proposed algorithm.
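
    A typical building block of such methods is the sparse reconstruction error of an image patch under a dictionary. The sketch below uses a small orthogonal-matching-pursuit routine and a random unit-norm dictionary purely to illustrate how that error could serve as a saliency score; it is not the authors' algorithm, and the patch and dictionary sizes are arbitrary.

      import numpy as np

      def omp(D, x, k):
          """Orthogonal matching pursuit: k-sparse code of x over dictionary D."""
          residual, idx, coef = x.copy(), [], np.zeros(0)
          for _ in range(k):
              idx.append(int(np.argmax(np.abs(D.T @ residual))))
              coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
              residual = x - D[:, idx] @ coef
          return idx, coef, residual

      # Hypothetical setting: 8x8 patches (64-dim), a random unit-norm dictionary
      rng = np.random.default_rng(0)
      D = rng.normal(size=(64, 128))
      D /= np.linalg.norm(D, axis=0)

      patch = rng.normal(size=64)                    # stand-in for an image patch
      _, _, residual = omp(D, patch, k=5)

      # A large reconstruction error under a "background" dictionary marks the
      # patch as a candidate salient region in this kind of approach.
      saliency_score = np.linalg.norm(residual) / np.linalg.norm(patch)
      print("Relative reconstruction error:", round(saliency_score, 3))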

  2. Complexity analysis based on generalized deviation for financial markets

    NASA Astrophysics Data System (ADS)

    Li, Chao; Shang, Pengjian

    2018-03-01

    In this paper, a new modified method, complexity analysis based on generalized deviation, is proposed as a measure for investigating the correlation between past price and future volatility in financial time series. In comparison with the earlier retarded volatility model, the new approach is both simple and computationally efficient. The method, based on the generalized deviation function, offers a comprehensive way of quantifying the rules of the financial market. The robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile periods and calm periods. After analyzing the data from the experimental model, we find that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.

  3. Content-Based Analysis of Bumper Stickers in Jordan

    ERIC Educational Resources Information Center

    Jaradat, Abdullah A.

    2016-01-01

    This study has set out to investigate bumper stickers in Jordan focusing mainly on the themes of the stickers. The study hypothesized that bumper stickers in Jordan reflect a wide range of topics including social, economic, and political. Due to being the first study of this phenomenon, the study has adopted content-based analysis to determine the…

  4. Validation of a new method for finding the rotational axes of the knee using both marker-based roentgen stereophotogrammetric analysis and 3D video-based motion analysis for kinematic measurements.

    PubMed

    Roland, Michelle; Hull, M L; Howell, S M

    2011-05-01

    In a previous paper, we reported the virtual axis finder, which is a new method for finding the rotational axes of the knee. The virtual axis finder was validated through simulations that were subject to limitations. Hence, the objective of the present study was to perform a mechanical validation with two measurement modalities: 3D video-based motion analysis and marker-based roentgen stereophotogrammetric analysis (RSA). A two-rotational-axis mechanism was developed, which simulated internal-external (or longitudinal) and flexion-extension (FE) rotations. The actual axes of rotation were known with respect to motion analysis and RSA markers within ± 0.0006 deg and ± 0.036 mm and ± 0.0001 deg and ± 0.016 mm, respectively. The orientation and position root mean squared errors for identifying the longitudinal rotation (LR) and FE axes with video-based motion analysis (0.26 deg, 0.28 mm, 0.36 deg, and 0.25 mm, respectively) were smaller than with RSA (1.04 deg, 0.84 mm, 0.82 deg, and 0.32 mm, respectively). The random error or precision in the orientation and position was significantly better (p=0.01 and p=0.02, respectively) in identifying the LR axis with video-based motion analysis (0.23 deg and 0.24 mm) than with RSA (0.95 deg and 0.76 mm). There was no significant difference in the bias errors between measurement modalities. In comparing the mechanical validations to virtual validations, the virtual validations produced errors comparable to those of the mechanical validation. The only significant difference between the errors of the mechanical and virtual validations was the precision in the position of the LR axis while simulating video-based motion analysis (0.24 mm and 0.78 mm, p=0.019). These results indicate that video-based motion analysis with the equipment used in this study is the superior measurement modality for use with the virtual axis finder, but both measurement modalities produce satisfactory results. The lack of significant differences between

  5. Seahawk: moving beyond HTML in Web-based bioinformatics analysis.

    PubMed

    Gordon, Paul M K; Sensen, Christoph W

    2007-06-18

    Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer.

  6. Seahawk: moving beyond HTML in Web-based bioinformatics analysis

    PubMed Central

    Gordon, Paul MK; Sensen, Christoph W

    2007-01-01

    Background Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. Results We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. Conclusion As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer. PMID:17577405

  7. [Procedural analysis of acid-base balance disorder: case serials in 4 patents].

    PubMed

    Ma, Chunyuan; Wang, Guijie

    2017-05-01

    The aim was to establish a standardized procedure for acid-base balance analysis and to analyze cases of acid-base disorders with the aid of an acid-base balance coordinate graph. Acid-base balance theory was reviewed systematically in light of recent research progress, and the important concepts, definitions, formulas, parameters, regularities, and inferences in acid-base analysis were studied. The processes and steps for analyzing acid-base disorders were laid out, and the application of the acid-base coordinate graph to the cases was introduced. A "four parameters-four steps" method was put forward to analyze acid-base disorders completely. The four parameters are pH, arterial partial pressure of carbon dioxide (PaCO2), HCO3-, and the anion gap (AG). The four steps are as follows: (1) from the pH, PaCO2, and HCO3-, determine the primary or main type of acid-base disorder; (2) use the primary or main type to choose the appropriate compensation formula and determine whether a double (mixed) acid-base disorder is present; (3) for a primary respiratory acidosis or respiratory alkalosis, calculate the potential HCO3- and substitute it for the measured HCO3- to determine whether a triple mixed acid-base disorder is present; (4) for data judged in the preceding steps to represent simple increased-AG metabolic acidosis, calculate the ratio ΔAG↑/ΔHCO3-↓ to determine whether a normal-AG metabolic acidosis or a metabolic alkalosis coexists. In clinical practice, PaCO2 (as the abscissa) and HCO3- (as the ordinate) are used to establish a rectangular coordinate system; a straight line can be drawn through the origin (0, 0) and the point (40, 24), and the pH is equal at all points on that line
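
    A generic illustration of the "four parameters-four steps" idea (not the authors' exact formulas or the coordinate-graph method) is sketched below using the anion gap, Winter's formula for expected respiratory compensation, and the delta ratio; the laboratory values are hypothetical.

      def anion_gap(na, cl, hco3):
          """AG = Na+ - (Cl- + HCO3-), in mmol/L."""
          return na - (cl + hco3)

      def expected_paco2_metabolic_acidosis(hco3):
          """Winter's formula: expected PaCO2 (mmHg) = 1.5*HCO3 + 8 (+/- 2)."""
          return 1.5 * hco3 + 8

      def delta_ratio(ag, hco3, ag_normal=12.0, hco3_normal=24.0):
          """Delta-AG / delta-HCO3-, used to unmask a second metabolic disorder."""
          return (ag - ag_normal) / (hco3_normal - hco3)

      # Hypothetical arterial sample: pH 7.25, PaCO2 26 mmHg, HCO3 11, Na 140, Cl 105
      na, cl, hco3, paco2 = 140.0, 105.0, 11.0, 26.0
      ag = anion_gap(na, cl, hco3)
      print("AG =", ag)                                       # 24 -> increased-AG acidosis
      print("Expected PaCO2 =", expected_paco2_metabolic_acidosis(hco3), "+/- 2")
      print("Delta ratio =", round(delta_ratio(ag, hco3), 2)) # ~1 -> pure AG acidosis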

  8. Droplet-based microfluidic analysis and screening of single plant cells.

    PubMed

    Yu, Ziyi; Boehm, Christian R; Hibberd, Julian M; Abell, Chris; Haseloff, Jim; Burgess, Steven J; Reyna-Llorens, Ivan

    2018-01-01

    Droplet-based microfluidics has been used to facilitate high-throughput analysis of individual prokaryote and mammalian cells. However, there is a scarcity of similar workflows applicable to rapid phenotyping of plant systems where phenotyping analyses typically are time-consuming and low-throughput. We report on-chip encapsulation and analysis of protoplasts isolated from the emergent plant model Marchantia polymorpha at processing rates of >100,000 cells per hour. We use our microfluidic system to quantify the stochastic properties of a heat-inducible promoter across a population of transgenic protoplasts to demonstrate its potential for assessing gene expression activity in response to environmental conditions. We further demonstrate on-chip sorting of droplets containing YFP-expressing protoplasts from wild type cells using dielectrophoresis force. This work opens the door to droplet-based microfluidic analysis of plant cells for applications ranging from high-throughput characterisation of DNA parts to single-cell genomics to selection of rare plant phenotypes.

  9. Conjoint analysis: using a market-based research model for healthcare decision making.

    PubMed

    Mele, Nancy L

    2008-01-01

    Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. To describe the conjoint analysis methodology and explore value-added applications in nursing research. Conjoint analysis methodology is described, using examples from the healthcare and business literature, and personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
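
    In a ratings-based conjoint design, part-worth utilities are typically estimated by regressing the ratings on dummy-coded attribute levels. The sketch below shows that computation on a made-up two-attribute example; the attributes, levels, and ratings are hypothetical.

      import numpy as np

      # Hypothetical ratings-based conjoint task: patients rate treatment profiles
      # described by two attributes -- visit frequency and out-of-pocket cost.
      # Dummy coding: [weekly_visit, cost_low, cost_medium] (baselines: monthly, high)
      profiles = np.array([
          [1, 1, 0],
          [1, 0, 1],
          [1, 0, 0],
          [0, 1, 0],
          [0, 0, 1],
          [0, 0, 0],
      ], dtype=float)
      ratings = np.array([8.0, 6.5, 4.0, 9.0, 7.0, 5.0])   # illustrative preferences

      # Ordinary least squares gives the part-worth utility of each attribute level
      X = np.column_stack([np.ones(len(profiles)), profiles])
      coefs, *_ = np.linalg.lstsq(X, ratings, rcond=None)
      intercept, weekly, cost_low, cost_medium = coefs
      print("Part-worths: weekly visit %.2f, low cost %.2f, medium cost %.2f"
            % (weekly, cost_low, cost_medium))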

  10. Dynamic Network-Based Epistasis Analysis: Boolean Examples

    PubMed Central

    Azpeitia, Eugenio; Benítez, Mariana; Padilla-Longoria, Pablo; Espinosa-Soto, Carlos; Alvarez-Buylla, Elena R.

    2011-01-01

    In this article we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inference of gene regulatory networks. Here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions, however, as originally proposed by Bateson, epistasis is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets are becoming available, the analysis of classical epistasis is being complemented with computer science tools and system biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, the access to relevant information and the correct inference of gene interaction topologies is hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, relies on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumption from the classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our article complements previous accounts, not only by focusing on the implications of the hierarchical and
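
    A minimal sketch of the dynamic view advocated here: define synchronous Boolean update rules for a toy three-gene circuit, iterate to a steady state, and compare the "phenotype" of wild type and knockout mutants, which is the information classical epistasis analysis tries to recover. The network wiring below is invented for illustration.

      from itertools import product

      # Hypothetical 3-gene Boolean network: A activates B, B represses C, A activates C
      def step(state, knockouts=frozenset()):
          a, b, c = state
          nxt = {
              "A": a,                      # A is an input, holds its value
              "B": a,                      # B copies A
              "C": a and not b,            # C needs A but is blocked by B
          }
          return tuple(0 if g in knockouts else int(v)
                       for g, v in zip("ABC", (nxt["A"], nxt["B"], nxt["C"])))

      def attractor(state, knockouts=frozenset(), n_steps=20):
          """Iterate the synchronous update until the state stops changing."""
          for _ in range(n_steps):
              nxt = step(state, knockouts)
              if nxt == state:
                  return state
              state = nxt
          return state

      # Compare the "phenotype" (steady state of C) of wild type and mutants
      for ko in [frozenset(), {"B"}, {"C"}, {"B", "C"}]:
          final = attractor((1, 1, 1), frozenset(ko))
          print(sorted(ko) or "wild type", "-> C =", final[2])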

  11. Reassessment of the hairy long-nosed armadillo "Dasypus" pilosus (Xenarthra, Dasypodidae) and revalidation of the genus Cryptophractus Fitzinger, 1856.

    PubMed

    Castro, Mariela C; Ciancio, Martín R; Pacheco, Víctor; Salas-Gismondi, Rodolfo M; Bostelmann, J Enrique; Carlini, Alfredo A

    2015-04-14

    The hairy long-nosed armadillo, currently referred to as Dasypus (Cryptophractus) pilosus, is an enigmatic species endemic to the montane cloud forests and subparamo of the Peruvian Andes. Its strikingly different external features, which include a carapace concealed by abundant hair, a larger number of movable bands, and a slender skull, have raised questions regarding its taxonomic status as a subgenus or a genus. This paper assesses this issue based on a cladistic study and provides a detailed comparative description of the species, including the first account of the distinctive ornamentation of its osteoderms. Based on several unique characters of the carapace, skull, mandible, and teeth, as well as on its phylogenetic position external to the other Dasypus species, we favor the assignment of the hairy long-nosed armadillo to a separate genus. As a result, we revalidate the original generic epithet, so that the valid name of the species is Cryptophractus pilosus Fitzinger, 1856.

  12. Web-Based Analysis and Publication of Flow Cytometry Experiments

    PubMed Central

    Kotecha, Nikesh; Krutzik, Peter O.; Irish, Jonathan M.

    2014-01-01

    Cytobank is a web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permissions from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at www.cytobank.org PMID:20578106

  13. Web-based analysis and publication of flow cytometry experiments.

    PubMed

    Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M

    2010-07-01

    Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.

  14. Geographic Object-Based Image Analysis – Towards a new paradigm

    PubMed Central

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not they constitute a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958
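
    As a toy illustration of the object-based workflow (segment first, then attach statistics and a class label to each object rather than to each pixel), the sketch below uses a simple threshold plus connected-component labelling in place of the multi-scale segmentation used in GEOBIA software; the image, thresholds, and classification rule are hypothetical.

      import numpy as np
      from scipy import ndimage

      # Hypothetical single-band image: bright "field" objects on a darker background
      rng = np.random.default_rng(0)
      image = rng.normal(0.2, 0.05, size=(100, 100))
      image[10:40, 15:55] += 0.6          # object 1
      image[60:90, 50:85] += 0.4          # object 2

      # Step 1 (segmentation): threshold + connected components stands in for
      # the multi-scale segmentation of real GEOBIA workflows
      mask = image > 0.45
      objects, n_objects = ndimage.label(mask)

      # Step 2 (object features) and step 3 (object-level classification)
      for obj_id in range(1, n_objects + 1):
          pixels = image[objects == obj_id]
          area, mean = pixels.size, pixels.mean()
          label = "crop" if mean > 0.7 else "grassland"   # toy rule on object features
          print(f"object {obj_id}: area={area}, mean={mean:.2f}, class={label}")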

  15. Content-based TV sports video retrieval using multimodal analysis

    NASA Astrophysics Data System (ADS)

    Yu, Yiqing; Liu, Huayong; Wang, Hongbin; Zhou, Dongru

    2003-09-01

    In this paper, we propose content-based video retrieval, which is retrieval based on semantic content. Because video data are composed of multimodal information streams, such as visual, auditory, and textual streams, we describe a strategy that uses multimodal analysis for automatically parsing sports video. The paper first defines the basic structure of a sports video database system and then introduces a new approach that integrates visual stream analysis, speech recognition, speech signal processing, and text extraction to realize video retrieval. The experimental results for TV sports video of football games indicate that multimodal analysis is effective for video retrieval by quickly browsing tree-like video clips or inputting keywords within a predefined domain.

  16. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...

  17. A value-based medicine cost-utility analysis of idiopathic epiretinal membrane surgery.

    PubMed

    Gupta, Omesh P; Brown, Gary C; Brown, Melissa M

    2008-05-01

    To perform a reference case, cost-utility analysis of epiretinal membrane (ERM) surgery using current literature on outcomes and complications. Computer-based, value-based medicine analysis. Decision analyses were performed under two scenarios: ERM surgery in better-seeing eye and ERM surgery in worse-seeing eye. The models applied long-term published data primarily from the Blue Mountains Eye Study and the Beaver Dam Eye Study. Visual acuity and major complications were derived from 25-gauge pars plana vitrectomy studies. Patient-based, time trade-off utility values, Markov modeling, sensitivity analysis, and net present value adjustments were used in the design and calculation of results. Main outcome measures included the number of discounted quality-adjusted-life-years (QALYs) gained and dollars spent per QALY gained. ERM surgery in the better-seeing eye compared with observation resulted in a mean gain of 0.755 discounted QALYs (3% annual rate) per patient treated. This model resulted in $4,680 per QALY for this procedure. When sensitivity analysis was performed, utility values varied from $6,245 to $3,746/QALY gained, medical costs varied from $3,510 to $5,850/QALY gained, and ERM recurrence rate increased to $5,524/QALY. ERM surgery in the worse-seeing eye compared with observation resulted in a mean gain of 0.27 discounted QALYs per patient treated. The $/QALY was $16,146 with a range of $20,183 to $12,110 based on sensitivity analyses. Utility values ranged from $21,520 to $12,916/QALY and ERM recurrence rate increased to $16,846/QALY based on sensitivity analysis. ERM surgery is a very cost-effective procedure when compared with other interventions across medical subspecialties.
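
    The arithmetic behind a $/QALY figure can be sketched as follows: discount the yearly utility gain at the stated 3% rate, sum over the model horizon, and divide the treatment cost by the result. The utility gain, horizon, and cost below are hypothetical, not the paper's model inputs.

      def discounted_qalys(annual_utility_gain, years, rate=0.03):
          """Sum of yearly utility gains discounted at `rate` (net present value)."""
          return sum(annual_utility_gain / (1 + rate) ** t for t in range(1, years + 1))

      def cost_per_qaly(total_cost, qalys_gained):
          return total_cost / qalys_gained

      # Hypothetical numbers: a 0.06 utility gain maintained for 15 years,
      # discounted at the 3% annual rate used in the study
      qalys = discounted_qalys(0.06, years=15)
      print("Discounted QALYs gained: %.3f" % qalys)
      print("Cost-utility: $%.0f per QALY" % cost_per_qaly(3500.0, qalys))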

  18. Analysis of swimming performance: perceptions and practices of US-based swimming coaches.

    PubMed

    Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid

    2016-01-01

    In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.

  19. Game-based digital interventions for depression therapy: a systematic review and meta-analysis.

    PubMed

    Li, Jinhui; Theng, Yin-Leng; Foo, Schubert

    2014-08-01

    The aim of this study was to review the existing literature on game-based digital interventions for depression systematically and examine their effectiveness through a meta-analysis of randomized controlled trials (RCTs). Database searching was conducted using specific search terms and inclusion criteria. A standard meta-analysis was also conducted of available RCT studies with a random effects model. The standard mean difference (Cohen's d) was used to calculate the effect size of each study. Nineteen studies were included in the review, and 10 RCTs (eight studies) were included in the meta-analysis. Four types of game interventions-psycho-education and training, virtual reality exposure therapy, exercising, and entertainment-were identified, with various types of support delivered and populations targeted. The meta-analysis revealed a moderate effect size of the game interventions for depression therapy at posttreatment (d=-0.47 [95% CI -0.69 to -0.24]). A subgroup analysis showed that interventions based on psycho-education and training had a smaller effect than those based on the other forms, and that self-help interventions yielded better outcomes than supported interventions. A higher effect was achieved when a waiting list was used as the control. The review and meta-analysis support the effectiveness of game-based digital interventions for depression. More large-scale, high-quality RCT studies with sufficient long-term data for treatment evaluation are needed.
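
    The pooled effect in such a meta-analysis is commonly obtained with a DerSimonian-Laird random-effects model, as assumed in the sketch below; the per-trial standardized mean differences and variances are invented for illustration.

      import numpy as np

      def dersimonian_laird(effects, variances):
          """Random-effects pooled effect (DerSimonian-Laird) with a 95% CI."""
          y, v = np.asarray(effects, float), np.asarray(variances, float)
          w = 1.0 / v                                   # fixed-effect weights
          y_fixed = np.sum(w * y) / np.sum(w)
          q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
          df = len(y) - 1
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - df) / c)                 # between-study variance
          w_star = 1.0 / (v + tau2)                     # random-effects weights
          pooled = np.sum(w_star * y) / np.sum(w_star)
          se = np.sqrt(1.0 / np.sum(w_star))
          return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

      # Hypothetical per-trial standardized mean differences and their variances
      d   = [-0.60, -0.35, -0.55, -0.20, -0.45]
      var = [0.04, 0.06, 0.05, 0.08, 0.03]
      pooled, ci = dersimonian_laird(d, var)
      print("Pooled d = %.2f [95%% CI %.2f to %.2f]" % (pooled, *ci))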

  20. BASE Flexible Array Preliminary Lithospheric Structure Analysis

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Sheehan, A. F.; Anderson, M. L.; Siddoway, C. S.; Erslev, E.; Harder, S. H.; Miller, K. C.

    2009-12-01

    The Bighorns Arch Seismic Experiment (BASE) is a Flexible Array experiment integrated with EarthScope. The goal of BASE is to develop a better understanding of how basement-involved foreland arches form and what their link is to plate tectonic processes. To achieve this goal, the crustal structure under the Bighorn Mountain range, Bighorn Basin, and Powder River Basin of northern Wyoming and southern Montana are investigated through the deployment of 35 broadband seismometers, 200 short period seismometers, 1600 “Texan” instruments using active sources and 800 “Texan” instruments monitoring passive sources, together with field structural analysis of brittle structures. The novel combination of these approaches and anticipated simultaneous data inversion will give a detailed structural crustal image of the Bighorn region at all levels of the crust. Four models have been proposed for the formation of the Bighorn foreland arch: subhorizontal detachment within the crust, lithospheric buckling, pure shear lithospheric thickening, and fault blocks defined by lithosphere-penetrating thrust faults. During the summer of 2009, we deployed 35 broadband instruments, which have already recorded several magnitude 7+ teleseismic events. Through P wave receiver function analysis of these 35 stations folded in with many EarthScope Transportable Array stations in the region, we present a preliminary map of the Mohorovicic discontinuity. This crustal map is our first test of how the unique Moho geometries predicted by the four hypothesized models of basement involved arches fit seismic observations for the Bighorn Mountains. In addition, shear-wave splitting analysis for our first few recorded teleseisms helps us determine if strong lithospheric deformation is preserved under the range. These analyses help lead us to our final goal, a complete 4D (3D spatial plus temporal) lithospheric-scale model of arch formation which will advance our understanding of the mechanisms

  1. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...

  2. Simulation-based bronchoscopy training: systematic review and meta-analysis.

    PubMed

    Kennedy, Cassie C; Maldonado, Fabien; Cook, David A

    2013-07-01

    Simulation-based bronchoscopy training is increasingly used, but effectiveness remains uncertain. We sought to perform a comprehensive synthesis of published work on simulation-based bronchoscopy training. We searched MEDLINE, EMBASE, CINAHL, PsycINFO, ERIC, Web of Science, and Scopus for eligible articles through May 11, 2011. We included all original studies involving health professionals that evaluated, in comparison with no intervention or an alternative instructional approach, simulation-based training for flexible or rigid bronchoscopy. Study selection and data abstraction were performed independently and in duplicate. We pooled results using random effects meta-analysis. From an initial pool of 10,903 articles, we identified 17 studies evaluating simulation-based bronchoscopy training. In comparison with no intervention, simulation training was associated with large benefits on skills and behaviors (pooled effect size, 1.21 [95% CI, 0.82-1.60]; n=8 studies) and moderate benefits on time (0.62 [95% CI, 0.12-1.13]; n=7). In comparison with clinical instruction, behaviors with real patients showed nonsignificant effects favoring simulation for time (0.61 [95% CI, -1.47 to 2.69]) and process (0.33 [95% CI, -1.46 to 2.11]) outcomes (n=2 studies each), although variation in training time might account for these differences. Four studies compared alternate simulation-based training approaches. Inductive analysis to inform instructional design suggested that longer or more structured training is more effective, authentic clinical context adds value, and animal models and plastic part-task models may be superior to more costly virtual-reality simulators. Simulation-based bronchoscopy training is effective in comparison with no intervention. Comparative effectiveness studies are few.

  3. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  4. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    PubMed

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress-damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and nanoparticles that are loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters.

  5. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    PubMed

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  6. An object-based approach to weather analysis and its applications

    NASA Astrophysics Data System (ADS)

    Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew

    2013-04-01

    The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying and following the development of convective events over the course of their lifetime. Prerequisites of the object-based analysis are a high-resolution observational database and a tracking algorithm. A near real-time radar and satellite remote sensing-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking identifies convective rain events in the dual composite and monitors their development over the course of their lifetime. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach for the validation of meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors which characterize the temporal development of the precipitation systems that constitute the objects. So-called proxies of the precipitation process are, e.g., the temporal change of the bright band, vertically extensive columns of enhanced differential reflectivity ZDR, or the cloud-top temperature and heights identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro-)physical differences among rain events and relate to the precipitation yield. Analyses of the informative content of ZDR columns as a precursor for storm evolution, for example, will be presented to demonstrate
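
    The object-identification step that underlies such tracking schemes can be illustrated with a simple threshold-and-label pass over a gridded reflectivity field. The field, the 35 dBZ threshold and the two Gaussian "cells" below are invented for illustration; they are not the OASE composite or algorithm.

      # Illustrative object identification: threshold a reflectivity field and
      # label connected rain cells with scipy.ndimage.
      import numpy as np
      from scipy import ndimage

      x, y = np.meshgrid(np.arange(200), np.arange(200))
      reflectivity = (15.0
                      + 30.0 * np.exp(-((x - 60) ** 2 + (y - 70) ** 2) / 400.0)
                      + 28.0 * np.exp(-((x - 150) ** 2 + (y - 40) ** 2) / 250.0))

      mask = reflectivity > 35.0                        # candidate convective cells
      labels, n_cells = ndimage.label(mask)             # connected-component objects
      sizes = ndimage.sum(mask, labels, index=range(1, n_cells + 1))
      centroids = ndimage.center_of_mass(mask, labels, range(1, n_cells + 1))
      print(n_cells, "objects; sizes (grid cells):", sizes,
            "centroids:", [tuple(np.round(c, 1)) for c in centroids])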

  7. Skeletal maturity determination from hand radiograph by model-based analysis

    NASA Astrophysics Data System (ADS)

    Vogelsang, Frank; Kohnen, Michael; Schneider, Hansgerd; Weiler, Frank; Kilbinger, Markus W.; Wein, Berthold B.; Guenther, Rolf W.

    2000-06-01

    Derived from the model-based segmentation algorithm for hand radiographs proposed in our previous work, we now present a method to determine skeletal maturity by an automated analysis of regions of interest (ROIs). These ROIs, which include the epiphyseal and carpal bones most important for skeletal maturity determination, can be extracted from the radiograph by knowledge-based algorithms.

  8. Multi-membership gene regulation in pathway based microarray analysis

    PubMed Central

    2011-01-01

    Background Gene expression analysis has been intensively researched for more than a decade. Recently, there has been elevated interest in the integration of microarray data analysis with other types of biological knowledge in a holistic analytical approach. We propose a methodology for pathway-based microarray data analysis, based on the observation that a substantial proportion of genes present in biochemical pathway databases are members of a number of distinct pathways. Our methodology aims at establishing the state of individual pathways, by identifying those truly affected by the experimental conditions based on the behaviour of such genes. For that purpose it considers all the pathways in which a gene participates and the general census of gene expression per pathway. Results We utilise hill climbing, simulated annealing and a genetic algorithm to analyse the consistency of the produced results, through the application of fuzzy adjusted Rand indices and Hamming distance. All algorithms produce highly consistent gene-to-pathway allocations, revealing the contribution of genes to pathway functionality, in agreement with current pathway state visualisation techniques, with the simulated annealing search proving slightly superior in terms of efficiency. Conclusions We show that the expression values of genes, which are members of a number of biochemical pathways or modules, are the net effect of the contribution of each gene to these biochemical processes. We show that by manipulating the pathway and module contribution of such genes to follow underlying trends we can interpret microarray results centred on the behaviour of these genes. PMID:21939531

  9. Multi-membership gene regulation in pathway based microarray analysis.

    PubMed

    Pavlidis, Stelios P; Payne, Annette M; Swift, Stephen M

    2011-09-22

    Gene expression analysis has been intensively researched for more than a decade. Recently, there has been elevated interest in the integration of microarray data analysis with other types of biological knowledge in a holistic analytical approach. We propose a methodology for pathway-based microarray data analysis, based on the observation that a substantial proportion of genes present in biochemical pathway databases are members of a number of distinct pathways. Our methodology aims at establishing the state of individual pathways, by identifying those truly affected by the experimental conditions based on the behaviour of such genes. For that purpose it considers all the pathways in which a gene participates and the general census of gene expression per pathway. We utilise hill climbing, simulated annealing and a genetic algorithm to analyse the consistency of the produced results, through the application of fuzzy adjusted Rand indices and Hamming distance. All algorithms produce highly consistent gene-to-pathway allocations, revealing the contribution of genes to pathway functionality, in agreement with current pathway state visualisation techniques, with the simulated annealing search proving slightly superior in terms of efficiency. We show that the expression values of genes, which are members of a number of biochemical pathways or modules, are the net effect of the contribution of each gene to these biochemical processes. We show that by manipulating the pathway and module contribution of such genes to follow underlying trends we can interpret microarray results centred on the behaviour of these genes.
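
    A toy sketch of the simulated-annealing search strategy used above follows: each multi-pathway gene is assigned to one of its candidate pathways so that genes within a pathway agree in expression sign. The gene data, membership lists and scoring function are invented for illustration; the paper's actual objective differs.

      # Toy simulated annealing for gene-to-pathway allocation (illustrative only).
      import math, random

      random.seed(0)
      genes = {f"g{i}": random.choice([-1.0, 1.0]) * random.random() for i in range(30)}
      pathways = ["P1", "P2", "P3", "P4"]
      membership = {g: random.sample(pathways, k=2) for g in genes}   # 2 candidate pathways each

      def score(assign):
          # Reward pathways whose assigned genes have a consistent expression sign.
          total = 0.0
          for p in pathways:
              vals = [genes[g] for g, pw in assign.items() if pw == p]
              total += abs(sum(v > 0 for v in vals) - sum(v < 0 for v in vals))
          return total

      assign = {g: opts[0] for g, opts in membership.items()}
      current, T = score(assign), 5.0
      for _ in range(2000):
          g = random.choice(list(genes))
          old = assign[g]
          assign[g] = random.choice(membership[g])
          new = score(assign)
          if new >= current or random.random() < math.exp((new - current) / T):
              current = new                     # accept improving (or occasionally worse) moves
          else:
              assign[g] = old                   # revert rejected move
          T *= 0.999                            # geometric cooling schedule
      print("final sign-consistency score:", current)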

  10. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-26

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...

  11. USARCENT AOR Contingency Base Waste Stream Analysis: An Analysis of Solid Waste Streams at Five Bases in the U. S. Army Central (USARCENT) Area of Responsibility

    DTIC Science & Technology

    2013-03-31

    certainly remain commingled with other solid waste. For example, some bases provided containers for segregation of recyclables including plastic and...prevalent types of solid waste are food (19.1% by average sample weight), wood (18.9%), and plastics (16.0%) based on analysis of bases in...within the interval shown. Food and wood wastes are the largest components of the average waste stream (both at ~19% by weight), followed by plastic

  12. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis

    DTIC Science & Technology

    1989-08-01

    Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis. Final Technical Report. Keywords: pattern recognition, blackboard-oriented symbolic processing, knowledge-based image analysis, image understanding, aerial imagery, urban area.

  13. Volumetric quantification of bone-implant contact using micro-computed tomography analysis based on region-based segmentation

    PubMed Central

    Kang, Sung-Won; Lee, Woo-Jin; Choi, Soon-Chul; Lee, Sam-Sun; Heo, Min-Suk; Huh, Kyung-Hoe

    2015-01-01

    Purpose We have developed a new method of segmenting the areas of absorbable implants and bone using region-based segmentation of micro-computed tomography (micro-CT) images, which allowed us to quantify volumetric bone-implant contact (VBIC) and volumetric absorption (VA). Materials and Methods The simple threshold technique generally used in micro-CT analysis cannot be used to segment the areas of absorbable implants and bone. Instead, a region-based segmentation method, a region-labeling method, and subsequent morphological operations were successively applied to micro-CT images. The three-dimensional VBIC and VA of the absorbable implant were then calculated over the entire volume of the implant. Two-dimensional (2D) bone-implant contact (BIC) and bone area (BA) were also measured based on the conventional histomorphometric method. Results VA and VBIC increased significantly as the healing period increased (p<0.05). VBIC values were significantly correlated with VA values (p<0.05) and with 2D BIC values (p<0.05). Conclusion It is possible to quantify VBIC and VA for absorbable implants with micro-CT analysis using a region-based segmentation method. PMID:25793178

  14. Graph-based normalization and whitening for non-linear data analysis.

    PubMed

    Aaron, Catherine

    2006-01-01

    In this paper we construct a graph-based normalization algorithm for non-linear data analysis. The principle of this algorithm is to get a spherical average neighborhood with unit radius. First we present a class of global dispersion measures used for "global normalization"; we then adapt these measures using a weighted graph to build a local normalization called "graph-based" normalization. Then we give details of the graph-based normalization algorithm and illustrate some results. In the second part we present a graph-based whitening algorithm built by analogy between the "global" and the "local" problem.
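
    The core idea of a "spherical average neighborhood with unit radius" can be sketched as follows: rescale the data so that the average distance to the k nearest neighbours equals one, either with a single global factor or with a per-point factor. This is a hedged illustration of the principle, not the paper's algorithm; the data, k, and the crude per-point scaling are assumptions.

      # Hedged sketch: global vs. crude local normalization to unit k-NN radius.
      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 2)) * np.array([5.0, 0.5])    # anisotropic toy data

      k = 10
      dist, _ = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
      avg_radius = dist[:, 1:].mean(axis=1)                    # mean distance to k neighbours (self excluded)

      X_global = X / avg_radius.mean()                         # one scale factor ("global normalization")
      X_local = X / avg_radius[:, None]                        # per-point scaling, illustration only

      print("mean k-NN radius before:", round(float(avg_radius.mean()), 3),
            "after global scaling:", round(float((avg_radius / avg_radius.mean()).mean()), 3))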

  15. A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.

    The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis are discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
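
    The graph-synthesis idea can be sketched as follows: jobs become vertices and weighted edges encode relationships such as shared compute nodes and temporal proximity. The job records and the weighting rule below are invented for illustration; the report's ontology and weighting are more elaborate.

      # Illustrative job-relationship graph built with networkx.
      import itertools
      import networkx as nx

      jobs = {
          "job1": {"nodes": {"n01", "n02"}, "start": 0,   "end": 50},
          "job2": {"nodes": {"n02", "n03"}, "start": 40,  "end": 90},
          "job3": {"nodes": {"n07"},        "start": 200, "end": 230},
      }

      G = nx.Graph()
      G.add_nodes_from(jobs)
      for a, b in itertools.combinations(jobs, 2):
          shared = len(jobs[a]["nodes"] & jobs[b]["nodes"])
          overlap = max(0, min(jobs[a]["end"], jobs[b]["end"]) - max(jobs[a]["start"], jobs[b]["start"]))
          weight = shared + 0.01 * overlap       # toy combination of the two relations
          if weight > 0:
              G.add_edge(a, b, weight=weight)

      print(list(G.edges(data=True)))            # job1-job2 linked via shared node n02 and 10 s overlap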

  16. Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation

    NASA Technical Reports Server (NTRS)

    Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.

    1998-01-01

    The all rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, outside of performing experiments or a full three dimensional analysis, there are no first order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full flow and gas generator configurations. Design of experiments methodology was used to construct a test matrix and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inject diameter ratio. A perfect gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full flow and gas generator engine cycles. Results were also found to be dependent upon the entire cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all rocket mode of operation.

  17. CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips

    PubMed Central

    Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

    Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide better understanding of their biology, phylogeny, virulence and taxonomy that may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes aimed to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

  18. Weighted functional linear regression models for gene-based association analysis.

    PubMed

    Belonogova, Nadezhda M; Svishcheva, Gulnara R; Wilson, James F; Campbell, Harry; Axenovich, Tatiana I

    2018-01-01

    Functional linear regression models are effectively used in gene-based association analysis of complex traits. These models combine information about individual genetic variants, taking into account their positions and reducing the influence of noise and/or observation errors. To increase the power of methods, where several differently informative components are combined, weights are introduced to give the advantage to more informative components. Allele-specific weights have been introduced to collapsing and kernel-based approaches to gene-based association analysis. Here we have for the first time introduced weights to functional linear regression models adapted for both independent and family samples. Using data simulated on the basis of GAW17 genotypes and weights defined by allele frequencies via the beta distribution, we demonstrated that type I errors correspond to declared values and that increasing the weights of causal variants allows the power of functional linear models to be increased. We applied the new method to real data on blood pressure from the ORCADES sample. Five of the six known genes with P < 0.1 in at least one analysis had lower P values with weighted models. Moreover, we found an association between diastolic blood pressure and the VMP1 gene (P = 8.18×10-6), when we used a weighted functional model. For this gene, the unweighted functional and weighted kernel-based models had P = 0.004 and 0.006, respectively. The new method has been implemented in the program package FREGAT, which is freely available at https://cran.r-project.org/web/packages/FREGAT/index.html.
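
    The weighting idea described above, weights defined by allele frequencies via the beta distribution, can be sketched in a few lines: per-variant weights are computed from minor allele frequencies (MAF) through a beta density so that rare variants receive larger weights. The beta parameters (1, 25) follow a common convention in the literature and are an assumption here, as are the MAF values.

      # Sketch of beta-density variant weights from minor allele frequencies.
      import numpy as np
      from scipy.stats import beta

      maf = np.array([0.002, 0.01, 0.05, 0.20, 0.45])
      weights = beta.pdf(maf, a=1, b=25)       # rare variants get the largest weights
      weights /= weights.sum()                 # normalise for comparability

      for f, w in zip(maf, weights):
          print(f"MAF={f:<6} weight={w:.3f}")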

  19. Visualization-based analysis of multiple response survey data

    NASA Astrophysics Data System (ADS)

    Timofeeva, Anastasiia

    2017-11-01

    During a survey, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because of the need to process multiple response variables. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, it is suggested to use similarity and association matrices. Some aggregate indicators of dissimilarity (similarity) are proposed based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources. This information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are of interest. To identify such differences, hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested. It is based on the partition of a consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between the population proportions. It turned out to be more suitable for the real problem being solved.
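
    A minimal sketch of the two aggregate indicators mentioned above follows, assuming a 0/1 response matrix (respondents by answer options): a co-occurrence (similarity) matrix and a normalised association matrix are built, then the determinant of the former and the largest eigenvalue of the latter are reported. The data and the exact matrix definitions are assumptions; the paper's formulations may differ.

      # Aggregate indicators for multiple-response data (illustrative definitions).
      import numpy as np

      rng = np.random.default_rng(2)
      R = (rng.random((500, 4)) < [0.6, 0.5, 0.3, 0.2]).astype(float)   # ticked options

      co = R.T @ R / len(R)                    # co-occurrence (similarity) matrix
      d = np.sqrt(np.diag(co))
      assoc = co / np.outer(d, d)              # normalised association matrix

      print("determinant of similarity matrix:", round(float(np.linalg.det(co)), 4))
      print("largest eigenvalue of association matrix:", round(float(np.linalg.eigvalsh(assoc).max()), 3))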

  20. An analysis of dinosaurian biogeography: evidence for the existence of vicariance and dispersal patterns caused by geological events.

    PubMed

    Upchurch, Paul; Hunn, Craig A; Norman, David B

    2002-03-22

    As the supercontinent Pangaea fragmented during the Mesozoic era, dinosaur faunas were divided into isolated populations living on separate continents. It has been predicted, therefore, that dinosaur distributions should display a branching ('vicariance') pattern that corresponds with the sequence and timing of continental break-up. Several recent studies, however, minimize the importance of plate tectonics and instead suggest that dispersal and regional extinction were the main controls on dinosaur biogeography. Here, in order to test the vicariance hypothesis, we apply a cladistic biogeographical method to a large dataset on dinosaur relationships and distributions. We also introduce a methodological refinement termed 'time-slicing', which is shown to be a key step in the detection of ancient biogeographical patterns. These analyses reveal biogeographical patterns that closely correlate with palaeogeography. The results provide the first statistically robust evidence that, from Middle Jurassic to mid-Cretaceous times, tectonic events had a major role in determining where and when particular dinosaur groups flourished. The fact that evolutionary trees for extinct organisms preserve such distribution patterns opens up a new and fruitful direction for palaeobiogeographical research.

  1. A Pilot Meta-Analysis of Computer-Based Scaffolding in STEM Education

    ERIC Educational Resources Information Center

    Belland, Brian R.; Walker, Andrew E.; Olsen, Megan Whitney; Leary, Heather

    2015-01-01

    This paper employs meta-analysis to determine the influence of computer-based scaffolding characteristics and study and test score quality on cognitive outcomes in science, technology, engineering, and mathematics education at the secondary, college, graduate, and adult levels. Results indicate that (a) computer-based scaffolding positively…

  2. An efficient current-based logic cell model for crosstalk delay analysis

    NASA Astrophysics Data System (ADS)

    Nazarian, Shahin; Das, Debasish

    2013-04-01

    Logic cell modelling is an important component in the analysis and design of CMOS integrated circuits, mostly due to nonlinear behaviour of CMOS cells with respect to the voltage signal at their input and output pins. A current-based model for CMOS logic cells is presented, which can be used for effective crosstalk noise and delta delay analysis in CMOS VLSI circuits. Existing current source models are expensive and need a new set of Spice-based characterisation, which is not compatible with typical EDA tools. In this article we present Imodel, a simple nonlinear logic cell model that can be derived from the typical cell libraries such as NLDM, with accuracy much higher than NLDM-based cell delay models. In fact, our experiments show an average error of 3% compared to Spice. This level of accuracy comes with a maximum runtime penalty of 19% compared to NLDM-based cell delay models on medium-sized industrial designs.

  3. Annotation Graphs: A Graph-Based Visualization for Meta-Analysis of Data Based on User-Authored Annotations.

    PubMed

    Zhao, Jian; Glueck, Michael; Breslav, Simon; Chevalier, Fanny; Khan, Azam

    2017-01-01

    User-authored annotations of data can support analysts in the activity of hypothesis generation and sensemaking, where it is not only critical to document key observations, but also to communicate insights between analysts. We present annotation graphs, a dynamic graph visualization that enables meta-analysis of data based on user-authored annotations. The annotation graph topology encodes annotation semantics, which describe the content of and relations between data selections, comments, and tags. We present a mixed-initiative approach to graph layout that integrates an analyst's manual manipulations with an automatic method based on similarity inferred from the annotation semantics. Various visual graph layout styles reveal different perspectives on the annotation semantics. Annotation graphs are implemented within C8, a system that supports authoring annotations during exploratory analysis of a dataset. We apply principles of Exploratory Sequential Data Analysis (ESDA) in designing C8, and further link these to an existing task typology in the visualization literature. We develop and evaluate the system through an iterative user-centered design process with three experts, situated in the domain of analyzing HCI experiment data. The results suggest that annotation graphs are effective as a method of visually extending user-authored annotations to data meta-analysis for discovery and organization of ideas.

  4. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  5. Marker Configuration Model-Based Roentgen Fluoroscopic Analysis.

    PubMed

    Garling, Eric H; Kaptein, Bart L; Geleijns, Koos; Nelissen, Rob G H H; Valstar, Edward R

    2005-04-01

    It remains unknown if and how the polyethylene bearing in mobile bearing knees moves during dynamic activities with respect to the tibial base plate. Marker Configuration Model-Based Roentgen Fluoroscopic Analysis (MCM-based RFA) uses a marker configuration model of inserted tantalum markers in order to accurately estimate the pose of an implant or bone using single-plane Roentgen or fluoroscopic images. The goal of this study is to assess the accuracy of MCM-based RFA in a standard fluoroscopic set-up using phantom experiments and to determine the error propagation with computer simulations. The experimental set-up of the phantom study was calibrated using a calibration box equipped with 600 tantalum markers, which corrected for image distortion and determined the focus position. In the computer simulation study, the influence of image distortion, MC-model accuracy, focus position, the relative distance between MC-models and MC-model configuration on the accuracy of MCM-based RFA was assessed. The phantom study established that the in-plane accuracy of MCM-based RFA is 0.1 mm and the out-of-plane accuracy is 0.9 mm. The rotational accuracy is 0.1 degrees. A ninth-order polynomial model was used to correct for image distortion. Marker-based RFA was estimated to have, in a worst case scenario, an in vivo translational accuracy of 0.14 mm (x-axis), 0.17 mm (y-axis), 1.9 mm (z-axis), respectively, and a rotational accuracy of 0.3 degrees. When using fluoroscopy to study kinematics, image distortion and the accuracy of models are important factors which influence the accuracy of the measurements. MCM-based RFA has the potential to be an accurate, clinically useful tool for studying kinematics after total joint replacement using standard equipment.

  6. Game-Based Digital Interventions for Depression Therapy: A Systematic Review and Meta-Analysis

    PubMed Central

    Theng, Yin-Leng; Foo, Schubert

    2014-01-01

    Abstract The aim of this study was to review the existing literature on game-based digital interventions for depression systematically and to examine their effectiveness through a meta-analysis of randomized controlled trials (RCTs). Database searching was conducted using specific search terms and inclusion criteria. A standard meta-analysis of the available RCT studies was also conducted with a random-effects model. The standardized mean difference (Cohen's d) was used to calculate the effect size of each study. Nineteen studies were included in the review, and 10 RCTs (eight studies) were included in the meta-analysis. Four types of game interventions—psycho-education and training, virtual reality exposure therapy, exercising, and entertainment—were identified, with various types of support delivered and populations targeted. The meta-analysis revealed a moderate effect size of the game interventions for depression therapy at posttreatment (d=−0.47 [95% CI −0.69 to −0.24]). A subgroup analysis showed that interventions based on psycho-education and training had a smaller effect than those based on the other forms, and that self-help interventions yielded better outcomes than supported interventions. A higher effect was achieved when a waiting list was used as the control. The review and meta-analysis support the effectiveness of game-based digital interventions for depression. More large-scale, high-quality RCT studies with sufficient long-term data for treatment evaluation are needed. PMID:24810933
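
    The pooling step behind such a meta-analysis can be sketched with a random-effects combination of per-study standardized mean differences. The DerSimonian-Laird estimator used below is a common choice but an assumption here, and the effect sizes and variances are invented; the review's actual data and software may differ.

      # Hedged sketch: DerSimonian-Laird random-effects pooling of Cohen's d values.
      import numpy as np

      d = np.array([-0.60, -0.35, -0.50, -0.20])    # per-study effect sizes (invented)
      v = np.array([0.05, 0.04, 0.06, 0.03])        # per-study sampling variances (invented)

      w_fixed = 1.0 / v
      d_fixed = np.sum(w_fixed * d) / w_fixed.sum()
      q = np.sum(w_fixed * (d - d_fixed) ** 2)                       # Cochran's Q
      c = w_fixed.sum() - np.sum(w_fixed ** 2) / w_fixed.sum()
      tau2 = max(0.0, (q - (len(d) - 1)) / c)                        # between-study variance

      w = 1.0 / (v + tau2)                                           # random-effects weights
      pooled = np.sum(w * d) / w.sum()
      se = np.sqrt(1.0 / w.sum())
      print(f"pooled d = {pooled:.2f} (95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f})")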

  7. CLUSFAVOR 5.0: hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles

    PubMed Central

    Peterson, Leif E

    2002-01-01

    CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
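
    The two analyses named above, UPGMA hierarchical clustering with a correlation distance and principal-component analysis of an expression matrix, can be illustrated with standard SciPy and scikit-learn calls. This is not CLUSFAVOR itself, and the gene-by-sample matrix is synthetic.

      # Illustrative UPGMA clustering and PCA of a synthetic expression matrix.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)
      expr = np.vstack([rng.normal(0, 1, (20, 8)),                          # 20 "flat" genes
                        rng.normal(0, 1, (20, 8)) + np.linspace(-2, 2, 8)]) # 20 trending genes

      dist = 1.0 - np.corrcoef(expr)                                        # correlation distance between genes
      condensed = dist[np.triu_indices_from(dist, k=1)]                     # condensed form expected by linkage
      Z = linkage(condensed, method="average")                              # UPGMA joining
      clusters = fcluster(Z, t=2, criterion="maxclust")

      pca = PCA(n_components=2).fit(expr)
      print("cluster sizes:", np.bincount(clusters)[1:],
            "| PC1 variance share:", round(float(pca.explained_variance_ratio_[0]), 2))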

  8. Linear discriminant analysis based on L1-norm maximization.

    PubMed

    Zhong, Fujin; Zhang, Jiashu

    2013-08-01

    Linear discriminant analysis (LDA) is a well-known dimensionality reduction technique, which is widely used for many purposes. However, conventional LDA is sensitive to outliers because its objective function is based on the distance criterion using L2-norm. This paper proposes a simple but effective robust LDA version based on L1-norm maximization, which learns a set of local optimal projection vectors by maximizing the ratio of the L1-norm-based between-class dispersion and the L1-norm-based within-class dispersion. The proposed method is theoretically proved to be feasible and robust to outliers while overcoming the singular problem of the within-class scatter matrix for conventional LDA. Experiments on artificial datasets, standard classification datasets and three popular image databases demonstrate the efficacy of the proposed method.

  9. A phylogeny of Astyanax (Characiformes: Characidae) in Central and North America.

    PubMed

    Schmitter-Soto, Juan J

    2016-05-06

    A phylogeny is presented for 34 species of Astyanax, 27 of them once included within A. aeneus or A. fasciatus in Central America and Mexico, based on 52 morphological characters (mostly osteological, but also pigmentation and meristics), with three outgroups. Monophyly is not supported for A. aeneus s. lat., as Brazilian species such as A. fasciatus s. str. and others also occur within that clade. There were only five resolved clades, three of them including both Brazilian and Central American species, one purely Nicaraguan, and one for central-northern Mexico and Texas. Coincidence with previous cladistic hypotheses is only partial. The genus Bramocharax Gill is not recovered, and thus confirmed as a synonym of Astyanax Baird & Girard. The findings point to a more complex biogeographic history of the region than usually recognized.

  10. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  11. Feature-space-based FMRI analysis using the optimal linear transformation.

    PubMed

    Sun, Fengrong; Morris, Drew; Lee, Wayne; Taylor, Margot J; Mills, Travis; Babyn, Paul S

    2010-09-01

    The optimal linear transformation (OLT), an image analysis technique of feature space, was first presented in the field of MRI. This paper proposes a method of extending OLT from MRI to functional MRI (fMRI) to improve the activation-detection performance over conventional approaches of fMRI analysis. In this method, first, ideal hemodynamic response time series for different stimuli were generated by convolving the theoretical hemodynamic response model with the stimulus timing. Second, constructing hypothetical signature vectors for different activity patterns of interest by virtue of the ideal hemodynamic responses, OLT was used to extract features of fMRI data. The resultant feature space had particular geometric clustering properties. It was then classified into different groups, each pertaining to an activity pattern of interest; the applied signature vector for each group was obtained by averaging. Third, using the applied signature vectors, OLT was applied again to generate fMRI composite images with high SNRs for the desired activity patterns. Simulations and a blocked fMRI experiment were employed for the method to be verified and compared with the general linear model (GLM)-based analysis. The simulation studies and the experimental results indicated the superiority of the proposed method over the GLM-based analysis in detecting brain activities.

  12. Job optimization in ATLAS TAG-based distributed analysis

    NASA Astrophysics Data System (ADS)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  13. Osteological evidence for sister group relationship between pseudo-toothed birds (Aves: Odontopterygiformes) and waterfowls (Anseriformes)

    NASA Astrophysics Data System (ADS)

    Bourdon, Estelle

    2005-12-01

    The phylogenetic affinities of the extinct pseudo-toothed birds have remained controversial. Some authors noted that they resemble both pelicans and allies (Pelecaniformes) and tube-nosed birds (Procellariiformes), but assigned them to a distinct taxon, the Odontopterygiformes. In most recent studies, the pseudo-toothed birds are referred to the family Pelagornithidae inside the Pelecaniformes. Here, I perform a cladistic analysis with five taxa of the pseudo-toothed birds including two undescribed new species from the Early Tertiary of Morocco. The present hypothesis strongly supports a sister group relationship of pseudo-toothed birds (Odontopterygiformes) and waterfowls (Anseriformes). The Odontoanserae (Odontopterygiformes plus Anseriformes) are the sister group of Neoaves. The placement of the landfowls (Galliformes) as the sister taxon of all other neognathous birds does not support the consensus view that the Galloanserae (Galliformes plus Anseriformes) are monophyletic.

  14. New species and phylogenetic relationships of the spider genus Coptoprepes using morphological and sequence data (Araneae: Anyphaenidae).

    PubMed

    Barone, Mariana L; Werenkraut, Victoria; Ramírez, Martín J

    2016-10-17

    We present evidence from the standard cytochrome c oxidase subunit I (COI) barcoding marker and from new collections showing that the males and females of C. ecotono Werenkraut & Ramírez were mismatched, and describe the female of that species for the first time. An undescribed male from Chile is assigned to the new species Coptoprepes laudani, together with the female that was previously thought to be C. ecotono. The matching of sexes is justified by a cladistic analysis of morphological and sequence data in combination. New locality data and barcoding sequences are provided for other species of Coptoprepes, all endemic to the temperate forests of Chile and adjacent Argentina. Although morphology and sequences are not conclusive on the relationships of Coptoprepes species, the sequence data suggest that the species without a retrolateral tibial apophysis may belong to an independent lineage.

  15. A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis

    PubMed Central

    Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan

    2009-01-01

    DNA methylation is an indispensible epigenetic modification of mammalian genomes. Consequently there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301

  16. MGAS: a powerful tool for multivariate gene-based genome-wide association analysis.

    PubMed

    Van der Sluis, Sophie; Dolan, Conor V; Li, Jiang; Song, Youqiang; Sham, Pak; Posthuma, Danielle; Li, Miao-Xin

    2015-04-01

    Standard genome-wide association studies, testing the association between one phenotype and a large number of single nucleotide polymorphisms (SNPs), are limited in two ways: (i) traits are often multivariate, and analysis of composite scores entails loss in statistical power, and (ii) gene-based analyses may be preferred, e.g. to decrease the multiple testing problem. Here we present a new method, multivariate gene-based association test by extended Simes procedure (MGAS), that allows gene-based testing of multivariate phenotypes in unrelated individuals. Through extensive simulation, we show that under most trait-generating genotype-phenotype models MGAS has superior statistical power to detect associated genes compared with gene-based analyses of univariate phenotypic composite scores (i.e. GATES, multiple regression), and multivariate analysis of variance (MANOVA). Re-analysis of metabolic data revealed 32 False Discovery Rate controlled genome-wide significant genes, and 12 regions harboring multiple genes; of these 44 regions, 30 were not reported in the original analysis. MGAS allows researchers to conduct their multivariate gene-based analyses efficiently, and without the loss of power that is often associated with incorrectly specified genotype-phenotype models. MGAS is freely available in KGG v3.0 (http://statgenpro.psychiatry.hku.hk/limx/kgg/download.php). Access to the metabolic dataset can be requested at dbGaP (https://dbgap.ncbi.nlm.nih.gov/). The R-simulation code is available from http://ctglab.nl/people/sophie_van_der_sluis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  17. [Why evidence-based medicine? 20 years of meta-analysis].

    PubMed

    Ceballos, C; Valdizán, J R; Artal, A; Almárcegui, C; Allepuz, C; García Campayo, J; Fernández Liesa, R; Giraldo, P; Puértolas, T

    2000-10-01

    Meta-analysis, described within evidence-based medicine, has become a frequent issue in recent medical literature. An exhaustive search of reported meta-analyses from all medical specialties is described, covering papers included in Medline or Embase between 1973 and 1998. A study of intra- and inter-reviewer reliability in the selection and classification of papers was performed. A descriptive analysis of the reported papers (frequency tables and graphics) is presented, including differences in the mean number of reported meta-analysis papers by medical specialty and year. 1,518 papers were selected and classified. The specialties most frequently found (45.91%) were: methodology (15.7%), psychiatry (11.79%), cardiology (10.01%) and oncology (8.36%). Inter-personal agreement was 0.93 in selecting papers and 0.72 in classifying them. Between 1977 and 1987 the overall mean number of reported meta-analysis studies (1.67 ± 4.10) was significantly lower than between 1988 and 1998 (49.54 ± 56.55) (p < 0.001). The global number of meta-analyses was positively correlated (p < 0.05) with the number of studies on fundamentals and methodology during the study period. The method used to identify meta-analysis reports can be considered adequate; however, the agreement in classifying them into medical specialties was lower. A progressive increase in the number of reported meta-analyses since 1977 can be demonstrated. The specialties with the greatest number of meta-analyses published in the literature were: psychiatry, oncology and cardiology. Diffusion of knowledge about the fundamentals and methodology of meta-analysis seems to have driven an increase in performing and reporting this kind of analysis.

  18. [Application of the computer-based respiratory sound analysis system based on Mel-frequency cepstral coefficient and dynamic time warping in healthy children].

    PubMed

    Yan, W Y; Li, L; Yang, Y G; Lin, X L; Wu, J Z

    2016-08-01

    We designed a computer-based respiratory sound analysis system to identify normal pediatric lung sounds, and set out to verify its validity. First, we downloaded standard lung sounds from a network database (website: http://www.easyauscultation.com/lung-sounds-reference-guide) and recorded 3 samples of abnormal lung sounds (rhonchi, wheeze and crackles) from three patients of the Department of Pediatrics, the First Affiliated Hospital of Xiamen University. We regarded such lung sounds as "reference lung sounds". The "test lung sounds" were recorded from 29 children from the Kindergarten of Xiamen University. We recorded lung sounds with a portable electronic stethoscope, and valid lung sounds were selected by manual identification. We introduced Mel-frequency cepstral coefficients (MFCC) to extract lung sound features and dynamic time warping (DTW) for signal classification. We had 39 standard lung sounds and recorded 58 test lung sounds. The computer-based respiratory sound analysis system was applied to 58 lung sound recognitions, with 52 correct identifications and 6 errors. Accuracy was 89.7%. Based on MFCC and DTW, our computer-based respiratory sound analysis system can effectively identify healthy lung sounds of children (accuracy of 89.7%), which demonstrates the reliability of the lung sound analysis system.
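
    The two building blocks named above, MFCC feature extraction and a DTW distance between recordings, can be sketched as follows. The signals are synthetic stand-ins rather than real lung sounds, and this is not the authors' system; it only illustrates the technique.

      # Minimal MFCC + DTW sketch on synthetic signals (illustrative only).
      import numpy as np
      import librosa
      from scipy.spatial.distance import cdist

      def mfcc_features(y, sr=8000, n_mfcc=13):
          return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T      # frames x coefficients

      def dtw_distance(A, B):
          """Classic dynamic-time-warping distance on frame-wise Euclidean costs."""
          cost = cdist(A, B)
          D = np.full((len(A) + 1, len(B) + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, len(A) + 1):
              for j in range(1, len(B) + 1):
                  D[i, j] = cost[i - 1, j - 1] + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[len(A), len(B)]

      sr = 8000
      t = np.linspace(0, 1, sr, endpoint=False)
      reference = np.sin(2 * np.pi * 200 * t)                  # stand-in "reference" sound
      test = np.sin(2 * np.pi * 205 * t[: int(0.9 * sr)])      # slightly different "test" sound
      print("DTW distance:", float(dtw_distance(mfcc_features(reference), mfcc_features(test))))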

  19. Inquiry-based Laboratory Activities on Drugs Analysis for High School Chemistry Learning

    NASA Astrophysics Data System (ADS)

    Rahmawati, I.; Sholichin, H.; Arifin, M.

    2017-09-01

    Laboratory activity is an important part of chemistry learning, but cookbook-style instructions are still commonly used. Activities conducted in that way, however, do not improve students' thinking skills, especially their creativity. This study aims to improve high school students' creativity through an inquiry-based laboratory activity on drug analysis. Acid-base titration involving a color-changing indicator was used as the method for drug analysis. The following tools were used to assess the activity's achievement: a creative thinking test on acid-base titration, creative attitude and action observation sheets, a questionnaire on inquiry-based lab activities, and interviews. The results showed that the inquiry-based laboratory activity improved students' creative thinking, creative attitudes and creative actions. The students reacted positively to this teaching strategy, as demonstrated by the questionnaire responses and interviews. This result is expected to help teachers overcome the shortcomings of other laboratory learning approaches.

  20. A method for data base management and analysis for wind tunnel data

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1987-01-01

    To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.

  1. Derivation of Continuum Models from An Agent-based Cancer Model: Optimization and Sensitivity Analysis.

    PubMed

    Voulgarelis, Dimitrios; Velayudhan, Ajoy; Smith, Frank

    2017-01-01

    Agent-based models provide a formidable tool for exploring the complex and emergent behaviour of biological systems and give accurate results, but with the drawback of requiring a lot of computational power and time for subsequent analysis. On the other hand, equation-based models can more easily be used for complex analysis on a much shorter timescale. This paper formulates an ordinary differential equation (ODE) and stochastic differential equation (SDE) model to capture the behaviour of an existing agent-based model of tumour cell reprogramming and applies it to the optimization of possible treatment as well as dosage sensitivity analysis. For certain values of the parameter space a close match between the equation-based and agent-based models is achieved. The need for division of labour between the two approaches is explored. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
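
    A hedged sketch of the equation-based side of such a comparison follows: a toy two-population ODE in which tumour cells are "reprogrammed" at a dose-dependent rate, integrated for several doses as a crude dosage sensitivity sweep. The equations, parameters and doses are invented for illustration and are not the paper's model.

      # Toy ODE model with a dose-dependent reprogramming rate (illustrative only).
      from scipy.integrate import solve_ivp

      def model(t, y, growth, reprog, dose):
          tumour, reprogrammed = y
          d_tumour = growth * tumour - reprog * dose * tumour
          d_reprog = reprog * dose * tumour
          return [d_tumour, d_reprog]

      for dose in (0.5, 2.0, 4.0):                    # crude dosage sensitivity sweep
          sol = solve_ivp(model, (0, 30), [1000.0, 0.0], args=(0.3, 0.1, dose), t_eval=[30])
          print(f"dose={dose}: tumour cells at day 30 = {sol.y[0, -1]:.0f}")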

  2. Morphological cladistic analysis of eight popular Olive (Olea europaea L.) cultivars grown in Saudi Arabia using Numerical Taxonomic System for personal computer to detect phyletic relationship and their proximate fruit composition

    PubMed Central

    Al-Ruqaie, I.; Al-Khalifah, N.S.; Shanavaskhan, A.E.

    2015-01-01

    Varietal identification of olives is an intricate and empirical exercise owing to the large number of synonyms and homonyms, intensive exchange of genotypes, presence of varietal clones and lack of proper certification in nurseries. A comparative study of the morphological characters of eight olive cultivars grown in Saudi Arabia was carried out; analysis using the NTSYSpc (Numerical Taxonomy System for personal computer) system segregated smaller fruits into one clade and the rest into two clades. Koroneiki, a Greek cultivar with small fruit, shared an arm with the Spanish variety Arbosana. Morphologic analysis using NTSYSpc revealed that the biometrics of leaves, fruits and seeds are reliable morphologic characters to distinguish between varieties, except for a few morphologically very similar olive cultivars. The proximate analysis showed significant variations in the protein, fiber, crude fat, ash and moisture content of different cultivars. The study also showed that neither the size of the fruit nor the fruit pulp thickness is a limiting factor determining the crude fat content of olives. PMID:26858547

  3. Independent Component Analysis applied to Ground-based observations

    NASA Astrophysics Data System (ADS)

    Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas

    2018-01-01

    Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independent of the planet's temperature profile. However, measurements of hot-Jupiter transits must achieve a high level of accuracy in the flux to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to remove systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, it has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61" Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.
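
    The ICA step can be illustrated on synthetic data: a transit-like dip mixed with a smooth airmass-like trend across several observed light curves is separated with FastICA. The signals and mixing matrix below are invented; the actual analysis works on the multi-wavelength, multi-star photometry described above.

      # Illustrative FastICA separation of a synthetic transit and trend.
      import numpy as np
      from sklearn.decomposition import FastICA

      t = np.linspace(0, 1, 500)
      transit = np.where((t > 0.45) & (t < 0.55), -1.0, 0.0)     # box-shaped transit signal
      trend = 0.8 * (t - 0.5) ** 2                               # airmass-like systematic
      rng = np.random.default_rng(4)

      mixing = np.array([[1.0, 0.6], [0.8, 1.2], [0.5, 0.9]])    # three observed light curves
      observed = np.vstack([transit, trend]).T @ mixing.T + 0.01 * rng.normal(size=(500, 3))

      sources = FastICA(n_components=2, random_state=0).fit_transform(observed)
      # The recovered components match the transit and trend up to sign and scale;
      # one of the two correlations below should be close to +/-1.
      print(round(float(np.corrcoef(sources[:, 0], transit)[0, 1]), 2),
            round(float(np.corrcoef(sources[:, 1], transit)[0, 1]), 2))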

  4. Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders

    PubMed Central

    Wang, Peng; Barrett, Frederick; Martin, Elizabeth; Milanova, Marina; Gur, Raquel E.; Gur, Ruben C.; Kohler, Christian; Verma, Ragini

    2008-01-01

    Deficits in emotional expression are prominent in several neuropsychiatric disorders, including schizophrenia. Available clinical facial expression evaluations provide subjective and qualitative measurements, which are based on static 2D images that do not capture the temporal dynamics and subtleties of expression changes. Therefore, there is a need for automated, objective and quantitative measurements of facial expressions captured using videos. This paper presents a computational framework that creates probabilistic expression profiles for video data and can potentially help to automatically quantify emotional expression differences between patients with neuropsychiatric disorders and healthy controls. Our method automatically detects and tracks facial landmarks in videos, and then extracts geometric features to characterize facial expression changes. To analyze temporal facial expression changes, we employ probabilistic classifiers that analyze facial expressions in individual frames, and then propagate the probabilities throughout the video to capture the temporal characteristics of facial expressions. The applications of our method to healthy controls and case studies of patients with schizophrenia and Asperger’s syndrome demonstrate the capability of the video-based expression analysis method in capturing subtleties of facial expression. Such results can pave the way for a video based method for quantitative analysis of facial expressions in clinical research of disorders that cause affective deficits. PMID:18045693

  5. GPU-based acceleration of computations in nonlinear finite element deformation analysis.

    PubMed

    Mafi, Ramin; Sirouspour, Shahin

    2014-03-01

    The physics of deformation for biological soft tissue is best described by nonlinear continuum mechanics-based models, which can then be discretized by the FEM for a numerical solution. However, the computational complexity of such models has limited their use in applications requiring real-time or fast response. In this work, we propose a graphics processing unit-based implementation of the FEM using implicit time integration for dynamic nonlinear deformation analysis. This is the most general formulation of the deformation analysis: it is valid for large deformations and strains and can account for material nonlinearities. The data-parallel nature and the intense arithmetic computations of nonlinear FEM equations make them particularly suitable for implementation on a parallel computing platform such as a graphics processing unit. In this work, we present and compare two different designs based on the matrix-free and conventional preconditioned conjugate gradients algorithms for solving the FEM equations arising in deformation analysis. The speedup achieved with the proposed parallel implementations of the algorithms will be instrumental in the development of advanced surgical simulators and medical image registration methods involving soft-tissue deformation. Copyright © 2013 John Wiley & Sons, Ltd.
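
    The matrix-free idea behind one of the two designs can be sketched on the CPU: the conjugate-gradient solver only needs a function that applies the (symmetric positive definite) system matrix to a vector, never the assembled matrix itself. The 1-D operator below is a toy stand-in, not the paper's FEM system; a GPU version would replace the NumPy kernels with device kernels.

      # Matrix-free conjugate gradient sketch on a toy tridiagonal SPD operator.
      import numpy as np

      def matvec(x):
          # Example operator: 1-D Laplacian-like stiffness action (2 on the diagonal, -1 off-diagonal).
          y = 2.0 * x
          y[:-1] -= x[1:]
          y[1:] -= x[:-1]
          return y

      def conjugate_gradient(apply_A, b, tol=1e-10, max_iter=1000):
          x = np.zeros_like(b)
          r = b - apply_A(x)
          p = r.copy()
          rs = r @ r
          for _ in range(max_iter):
              Ap = apply_A(p)
              alpha = rs / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs) * p
              rs = rs_new
          return x

      b = np.ones(100)
      x = conjugate_gradient(matvec, b)
      print("residual norm:", float(np.linalg.norm(matvec(x) - b)))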

  6. a Region-Based Multi-Scale Approach for Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Kavzoglu, T.; Yildiz Erdemir, M.; Tonbul, H.

    2016-06-01

    Within the last two decades, object-based image analysis (OBIA), which considers objects (i.e. groups of pixels) instead of pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating the optimal scale parameter is crucially important for increasing the classification accuracy, and it depends on image resolution, image object size and the characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened QuickBird-2 image. The first strategy estimates optimal scale parameters for the eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest neighbour classifier was applied in all segmentation experiments, and an equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). A comparison of region-based and image-based segmentation was carried out on the classified images, and it was found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.

  7. Geospatial analysis based on GIS integrated with LADAR.

    PubMed

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.

  8. Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.

    PubMed

    Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong

    2016-05-01

    This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not place any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure for calculating the DI for two finite time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both simulated data and experimental fMRI data, and compare the results with those of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC at quantifying the overall causal relationship.
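
    As a hedged illustration of the DI idea, the sketch below computes only a first-order plug-in approximation, I(X_{t-1}; Y_t | Y_{t-1}), from binned series; the paper's full estimator (with optimal bin-size selection and convergence guarantees) is more involved:

```python
# First-order plug-in approximation of directed information from histograms.
import numpy as np

def digitize(x, n_bins=8):
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def conditional_mutual_information(x, y, z, n_bins=8):
    """I(X; Y | Z) in bits from three discretized sequences."""
    joint, _ = np.histogramdd(np.stack([x, y, z], axis=1), bins=n_bins)
    p_xyz = joint / joint.sum()
    p_xz = p_xyz.sum(axis=1, keepdims=True)
    p_yz = p_xyz.sum(axis=0, keepdims=True)
    p_z = p_xyz.sum(axis=(0, 1), keepdims=True)
    mask = p_xyz > 0
    return float(np.sum(p_xyz[mask] *
                        np.log2((p_xyz * p_z)[mask] / (p_xz * p_yz)[mask])))

def directed_information_first_order(x, y, n_bins=8):
    xd, yd = digitize(x, n_bins), digitize(y, n_bins)
    return conditional_mutual_information(xd[:-1], yd[1:], yd[:-1], n_bins)

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
y = np.tanh(np.roll(x, 1)) + 0.3 * rng.normal(size=2000)   # nonlinear, lagged coupling
print(directed_information_first_order(x, y), directed_information_first_order(y, x))
```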

  9. Physics-based deformable organisms for medical image analysis

    NASA Astrophysics Data System (ADS)

    Hamarneh, Ghassan; McIntosh, Chris

    2005-04-01

    Previously, "Deformable organisms" were introduced as a novel paradigm for medical image analysis that uses artificial life modelling concepts. Deformable organisms were designed to complement the classical bottom-up deformable models methodologies (geometrical and physical layers), with top-down intelligent deformation control mechanisms (behavioral and cognitive layers). However, a true physical layer was absent and in order to complete medical image segmentation tasks, deformable organisms relied on pure geometry-based shape deformations guided by sensory data, prior structural knowledge, and expert-generated schedules of behaviors. In this paper we introduce the use of physics-based shape deformations within the deformable organisms framework yielding additional robustness by allowing intuitive real-time user guidance and interaction when necessary. We present the results of applying our physics-based deformable organisms, with an underlying dynamic spring-mass mesh model, to segmenting and labelling the corpus callosum in 2D midsagittal magnetic resonance images.

  10. An activity-based methodology for operations cost analysis

    NASA Technical Reports Server (NTRS)

    Korsmeyer, David; Bilby, Curt; Frizzell, R. A.

    1991-01-01

    This report describes an activity-based cost estimation method, proposed for the Space Exploration Initiative (SEI), as an alternative to NASA's traditional mass-based cost estimation method. A case study demonstrates how the activity-based cost estimation technique can be used to identify the operations that have a significant impact on costs over the life cycle of the SEI. The case study yielded an operations cost of $101 billion for the 20-year span of the lunar surface operations for the Option 5a program architecture. In addition, the results indicated that the support and training costs for the missions were the greatest contributors to the annual cost estimates. A cost-sensitivity analysis of the cultural and architectural drivers determined that the length of training and the amount of support associated with the ground support personnel for mission activities are the most significant cost contributors.

  11. GOMA: functional enrichment analysis tool based on GO modules

    PubMed Central

    Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213

  12. The impact of chimerism in DNA-based forensic sex determination analysis.

    PubMed

    George, Renjith; Donald, Preethy Mary; Nagraj, Sumanth Kumbargere; Idiculla, Jose Joy; Hj Ismail, Rashid

    2013-01-01

    Sex determination is the most important step in personal identification in forensic investigations. DNA-based sex determination analysis is comparatively more reliable than the other conventional methods of sex determination analysis. Advanced technology like real-time polymerase chain reaction (PCR) offers accurate and reproducible results and is at the level of legal acceptance. However, there are still situations, such as chimerism, where an individual possesses both male- and female-specific factors in their body. Sex determination analysis in such cases can give erroneous results. This paper discusses the phenomenon of chimerism and its impact on sex determination analysis in forensic investigations.

  13. Coordinate based random effect size meta-analysis of neuroimaging studies.

    PubMed

    Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J

    2017-06-01

    Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and reported t statistics or Z scores, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by a random-effects meta-analysis of reported effects performed cluster-wise using standard statistical methods and taking account of censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating, and even amplifying, the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
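
    The cluster-wise random-effects step can be pictured with a standard DerSimonian-Laird summary of per-study effects, as in the hedged sketch below; ClusterZ's handling of censoring, clustering of coordinates and FCDR control are not reproduced here:

```python
# Standard DerSimonian-Laird random-effects summary applied to one hypothetical cluster.
import numpy as np

def dersimonian_laird(effects, variances):
    """Return the random-effects pooled estimate, its variance, and tau^2."""
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances                       # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)    # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)             # between-study variance
    w_star = 1.0 / (variances + tau2)         # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    return pooled, 1.0 / np.sum(w_star), tau2

# Toy cluster: standardised effects reported by five studies (values are illustrative).
effects = [0.42, 0.55, 0.31, 0.60, 0.48]
variances = [0.010, 0.020, 0.015, 0.025, 0.012]
pooled, var, tau2 = dersimonian_laird(effects, variances)
z = pooled / np.sqrt(var)   # cluster-wise test statistic
```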

  14. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which covers basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We decided the index weights according to the grades, and evaluated the integrated ability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, along with how these two modules were implemented. Finally, we give the results of the system.

  15. Utilizing Problem-Based Learning in Qualitative Analysis Lab Experiments

    ERIC Educational Resources Information Center

    Hicks, Randall W.; Bevsek, Holly M.

    2012-01-01

    A series of qualitative analysis (QA) laboratory experiments utilizing a problem-based learning (PBL) module has been designed and implemented. The module guided students through the experiments under the guise of cleaning up a potentially contaminated water site as employees of an environmental chemistry laboratory. The main goal was the…

  16. Self-adaptive relevance feedback based on multilevel image content analysis

    NASA Astrophysics Data System (ADS)

    Gao, Yongying; Zhang, Yujin; Fu, Yu

    2001-01-01

    In current content-based image retrieval systems, it is generally accepted that obtaining high-level image features is key to improving querying. Among the related techniques, relevance feedback has become an active research topic because it uses information from the user to refine the query results. In practice, many methods have been proposed to achieve the goal of relevance feedback. In this paper, a new scheme for relevance feedback is proposed. Unlike previous methods for relevance feedback, our scheme provides a self-adaptive operation. First, based on multi-level image content analysis, the relevant images from the user can be automatically analyzed at different levels and the query can be modified according to the different analysis results. Second, to make it more convenient for the user, the relevance feedback procedure can be run with or without memory. To test the performance of the proposed method, a practical semantic-based image retrieval system has been established, and the query results obtained by our self-adaptive relevance feedback are given.

  17. Self-adaptive relevance feedback based on multilevel image content analysis

    NASA Astrophysics Data System (ADS)

    Gao, Yongying; Zhang, Yujin; Fu, Yu

    2000-12-01

    In current content-based image retrieval systems, it is generally accepted that obtaining high-level image features is key to improving querying. Among the related techniques, relevance feedback has become an active research topic because it uses information from the user to refine the query results. In practice, many methods have been proposed to achieve the goal of relevance feedback. In this paper, a new scheme for relevance feedback is proposed. Unlike previous methods for relevance feedback, our scheme provides a self-adaptive operation. First, based on multi-level image content analysis, the relevant images from the user can be automatically analyzed at different levels and the query can be modified according to the different analysis results. Second, to make it more convenient for the user, the relevance feedback procedure can be run with or without memory. To test the performance of the proposed method, a practical semantic-based image retrieval system has been established, and the query results obtained by our self-adaptive relevance feedback are given.

  18. Integrating clinicians, knowledge and data: expert-based cooperative analysis in healthcare decision support

    PubMed Central

    2010-01-01

    Background Decision support in health systems is a highly difficult task, due to the inherent complexity of the process and structures involved. Method This paper introduces a new hybrid methodology, Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge into data analysis methods and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) bench-marking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases, comparisons were made against classical procedures using qualitative explicit prior knowledge. Bayesian predictive validity measures were used for comparison with expert panel results. Overall agreement was tested by the Intraclass Correlation Coefficient in case "1" and kappa in both cases. Results EbCA is a new methodology composed of six steps: 1) data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here implicit knowledge (IK) might be elicited); 5) incorporation of elicited IK into the PKB, repeating until a satisfactory solution is reached; 6) post-processing of results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess the technical efficiency of small mental health areas and to the case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA, and had major implications for decision making in both cases. Discussion This paper presents EbCA

  19. Integrating clinicians, knowledge and data: expert-based cooperative analysis in healthcare decision support.

    PubMed

    Gibert, Karina; García-Alonso, Carlos; Salvador-Carulla, Luis

    2010-09-30

    Decision support in health systems is a highly difficult task, due to the inherent complexity of the process and structures involved. This paper introduces a new hybrid methodology, Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge into data analysis methods and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) bench-marking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases, comparisons were made against classical procedures using qualitative explicit prior knowledge. Bayesian predictive validity measures were used for comparison with expert panel results. Overall agreement was tested by the Intraclass Correlation Coefficient in case "1" and kappa in both cases. EbCA is a new methodology composed of six steps: 1) data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here implicit knowledge (IK) might be elicited); 5) incorporation of elicited IK into the PKB, repeating until a satisfactory solution is reached; 6) post-processing of results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess the technical efficiency of small mental health areas and to the case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA, and had major implications for decision making in both cases. This paper presents EbCA and shows the convenience of

  20. Students' Understanding of Acid, Base and Salt Reactions in Qualitative Analysis.

    ERIC Educational Resources Information Center

    Tan, Kim-Chwee Daniel; Goh, Ngoh-Khang; Chia, Lian-Sai; Treagust, David F.

    2003-01-01

    Uses a two-tier, multiple-choice diagnostic instrument to determine (n=915) grade 10 students' understanding of the acid, base, and salt reactions involved in basic qualitative analysis. Reports that many students did not understand the formation of precipitates and the complex salts, acid/salt-base reactions, and thermal decomposition involved in…

  1. The Giant Cretaceous Coelacanth (Actinistia, Sarcopterygii) Megalocoelacanthus dobiei Schwimmer, Stewart & Williams, 1994, and Its Bearing on Latimerioidei Interrelationships

    PubMed Central

    Dutel, Hugo; Maisey, John G.; Schwimmer, David R.; Janvier, Philippe; Herbin, Marc; Clément, Gaël

    2012-01-01

    We present a redescription of Megalocoelacanthus dobiei, a giant fossil coelacanth from Upper Cretaceous strata of North America. Megalocoelacanthus has been previously described on the basis of composite material that consisted of isolated elements. Consequently, many aspects of its anatomy have remained unknown as well as its phylogenetic relationships with other coelacanths. Previous studies have suggested that Megalocoelacanthus is closer to Latimeria and Macropoma than to Mawsonia. However, this assumption was based only on the overall similarity of few anatomical features, rather than on a phylogenetic character analysis. A new, and outstandingly preserved specimen from the Niobrara Formation in Kansas allows the detailed description of the skull of Megalocoelacanthus and elucidation of its phylogenetic relationships with other coelacanths. Although strongly flattened, the skull and jaws are well preserved and show many derived features that are shared with Latimeriidae such as Latimeria, Macropoma and Libys. Notably, the parietonasal shield is narrow and flanked by very large, continuous vacuities forming the supraorbital sensory line canal. Such an unusual morphology is also known in Libys. Some other features of Megalocoelacanthus, such as its large size and the absence of teeth are shared with the mawsoniid genera Mawsonia and Axelrodichthys. Our cladistic analysis supports the sister-group relationship of Megalocoelacanthus and Libys within Latimeriidae. This topology suggests that toothless, large-sized coelacanths evolved independently in both Latimeriidae and Mawsoniidae during the Mesozoic. Based on previous topologies and on ours, we then review the high-level taxonomy of Latimerioidei and propose new systematic phylogenetic definitions. PMID:23209614

  2. NURBS-Based Geometry for Integrated Structural Analysis

    NASA Technical Reports Server (NTRS)

    Oliver, James H.

    1997-01-01

    This grant was initiated in April 1993 and completed in September 1996. The primary goal of the project was to exploit the emerging de facto CAD standard of Non-Uniform Rational B-spline (NURBS) based curve and surface geometry to integrate and streamline the process of turbomachinery structural analysis. We focused our efforts on critical geometric modeling challenges typically posed by the requirements of structural analysts. We developed a suite of software tools that facilitate pre- and post-processing of NURBS-based turbomachinery blade models for finite element structural analyses. We also developed tools to facilitate the modeling of blades in their manufactured (or cold) state based on nominal operating shape and conditions. All of the software developed in the course of this research is written in the C++ language using the Iris Inventor 3D graphical interface tool-kit from Silicon Graphics. In addition to enhanced modularity, improved maintainability, and efficient prototype development, this design facilitates the re-use of code developed for other NASA projects and provides a uniform and professional 'look and feel' for all applications developed by the Iowa State Team.
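
    A minimal sketch of the NURBS geometry the report builds on (Cox-de Boor basis recursion and rational curve evaluation) is given below; it is illustrative Python, not the project's C++/Inventor tools:

```python
# NURBS curve evaluation: B-spline basis by the Cox-de Boor recursion, then a
# rational (weighted) combination of control points.
import numpy as np

def bspline_basis(i, p, u, knots):
    """i-th B-spline basis function of degree p evaluated at u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, degree, control_points, weights, knots):
    """Point on a NURBS curve: rational combination of weighted control points."""
    n = len(control_points)
    basis = np.array([bspline_basis(i, degree, u, knots) for i in range(n)])
    w = basis * weights
    return (w[:, None] * control_points).sum(axis=0) / w.sum()

# Quadratic NURBS quarter circle (a classic exact-conic example).
ctrl = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
wts = np.array([1.0, np.sqrt(2) / 2, 1.0])
knots = [0, 0, 0, 1, 1, 1]
pts = np.array([nurbs_point(u, 2, ctrl, wts, knots) for u in np.linspace(0, 0.999, 50)])
```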

  3. Comparing model-based and model-free analysis methods for QUASAR arterial spin labeling perfusion quantification.

    PubMed

    Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J

    2013-05-01

    Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.

  4. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    The oil pipelines network is one of the most important facilities for energy transportation, but an oil pipelines network accident may result in serious disasters. Analysis models for these accidents have been established mainly based on three methods: event trees, accident simulation and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors are considered in existing models, and a deployment rule for the factors has not been established. This paper proposes a probabilistic analysis model of oil pipelines networks based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipelines network accidents.
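
    The kind of probabilistic reasoning involved can be illustrated with a tiny discrete Bayesian network evaluated by enumeration; the nodes and probabilities below are hypothetical placeholders, not the paper's actual network structure or factors:

```python
# Tiny discrete Bayesian network, evaluated by brute-force enumeration.
# Nodes and conditional probabilities are invented purely for illustration.
from itertools import product

p_corrosion = {True: 0.05, False: 0.95}
p_damage = {True: 0.02, False: 0.98}                    # third-party damage
p_leak = {(True, True): 0.60, (True, False): 0.20,      # P(leak | corrosion, damage)
          (False, True): 0.30, (False, False): 0.01}
p_fire = {True: 0.10, False: 0.001}                     # P(fire | leak)

def joint(c, d, l, f):
    return (p_corrosion[c] * p_damage[d] *
            (p_leak[(c, d)] if l else 1 - p_leak[(c, d)]) *
            (p_fire[l] if f else 1 - p_fire[l]))

# Marginal probability of a fire, and P(corrosion | fire), by enumeration.
p_f = sum(joint(c, d, l, True) for c, d, l in product([True, False], repeat=3))
p_c_and_f = sum(joint(True, d, l, True) for d, l in product([True, False], repeat=2))
print(p_f, p_c_and_f / p_f)
```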

  5. Developing a Problem-Based Course Based on Needs Analysis to Enhance English Reading Ability of Thai Undergraduate Students

    ERIC Educational Resources Information Center

    Bosuwon, Takwa; Woodrow, Lindy

    2009-01-01

    This paper reports on a needs analysis underlying a proposed business English reading course using a problem-based learning approach designed to enhance English reading abilities of Thai undergraduate students. As part of a work in progress, the needs analysis survey was done prior to the course design with the major stakeholders in business and…

  6. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single-cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
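
    The nonhomogeneous Poisson model described (rate λ(t) = λ0·exp(kt)) can be sketched by thinning-based simulation and a summary of interevent times, as below; the parameter values are illustrative, not the paper's fitted values:

```python
# Thinning (Lewis/Ogata) simulation of a nonhomogeneous Poisson process with an
# exponentially increasing rate, followed by a summary of interevent times.
import numpy as np

def simulate_nhpp(lambda0, k, t_max, rng):
    """Events with rate lambda(t) = lambda0 * exp(k * t) on [0, t_max]."""
    lam_max = lambda0 * np.exp(k * t_max)     # rate is increasing, so its maximum is at t_max
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)   # candidate from the dominating homogeneous process
        if t > t_max:
            break
        if rng.uniform() < lambda0 * np.exp(k * t) / lam_max:
            events.append(t)                  # accept with probability lambda(t) / lam_max
    return np.array(events)

rng = np.random.default_rng(2)
events = simulate_nhpp(lambda0=0.05, k=0.04, t_max=48.0, rng=rng)  # illustrative, 48 h window
interevent = np.diff(events)
print(len(events), interevent.mean())
```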

  7. Simulation-based sensitivity analysis for non-ignorably missing data.

    PubMed

    Yin, Peng; Shi, Jian Q

    2017-01-01

    Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) may depend on assumptions or parameters (input) about the missing data, i.e. the missing data mechanism. We refer to models subject to this uncertainty as sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define some simple and interpretable statistical quantities to assess the sensitivity models and support evidence-based analysis. In this paper we propose a novel approach that investigates the plausibility of each missing data mechanism model assumption by comparing the simulated datasets from various MNAR models with the observed data non-parametrically, using K-nearest-neighbour distances. Some asymptotic theory has also been provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, to select plausible values and reject unlikely values, instead of considering all proposed values of the sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.

  8. Innovative technology-based interventions for autism spectrum disorders: a meta-analysis.

    PubMed

    Grynszpan, Ouriel; Weiss, Patrice L Tamar; Perez-Diaz, Fernando; Gal, Eynat

    2014-05-01

    This article reports the results of a meta-analysis of technology-based intervention studies for children with autism spectrum disorders. We conducted a systematic review of research that used a pre-post design to assess innovative technology interventions, including computer programs, virtual reality, and robotics. The selected studies provided interventions via a desktop computer, interactive DVD, shared active surface, and virtual reality. None employed robotics. The results provide evidence for the overall effectiveness of technology-based training. The overall mean effect size for posttests of controlled studies of children with autism spectrum disorders who received technology-based interventions was significantly different from zero and approached the medium magnitude, d = 0.47 (confidence interval: 0.08-0.86). The influence of age and IQ was not significant. Differences in training procedures are discussed in the light of the negative correlation that was found between the intervention durations and the studies' effect sizes. The results of this meta-analysis provide support for the continuing development, evaluation, and clinical usage of technology-based intervention for individuals with autism spectrum disorders.

  9. Towards a framework for agent-based image analysis of remote-sensing data.

    PubMed

    Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera

    2015-04-03

    Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: Especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects' properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets bear a high potential of transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which is exhaustively investigated in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA).

  10. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out user-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described.

  11. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  12. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
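
    The information-theoretic ranking criteria named above (information gain, gain ratio, symmetrical uncertainty, with equal-interval discretization) can be sketched in a few lines of NumPy; this is only a conceptual stand-in for the IMMAN/Java implementation:

```python
# Entropy-based feature ranking with equal-interval discretization.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def discretize(x, n_bins=5):
    edges = np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]   # equal-interval bins
    return np.digitize(x, edges)

def rank_features(X, y, n_bins=5):
    h_y = entropy(y)
    scores = []
    for j in range(X.shape[1]):
        f = discretize(X[:, j], n_bins)
        h_f = entropy(f)
        h_y_given_f = sum((f == v).mean() * entropy(y[f == v]) for v in np.unique(f))
        ig = h_y - h_y_given_f                                  # information gain
        gain_ratio = ig / h_f if h_f > 0 else 0.0
        su = 2 * ig / (h_y + h_f) if (h_y + h_f) > 0 else 0.0   # symmetrical uncertainty
        scores.append((j, ig, gain_ratio, su))
    return sorted(scores, key=lambda s: s[3], reverse=True)

rng = np.random.default_rng(3)
y = rng.integers(0, 2, 200)
X = np.column_stack([y + 0.3 * rng.normal(size=200), rng.normal(size=200)])
print(rank_features(X, y))
```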

  13. Web-Based Analysis for Student-Generated Complex Genetic Profiles

    ERIC Educational Resources Information Center

    Kass, David H.; LaRoe, Robert

    2007-01-01

    A simple, rapid method for generating complex genetic profiles using Alu-based markers was recently developed for students primarily at the undergraduate level to learn more about forensics and paternity analysis. On the basis of the Cold Spring Harbor Allele Server, which provides an excellent tool for analyzing a single Alu variant, we present a…

  14. Histogram analysis for smartphone-based rapid hematocrit determination

    PubMed Central

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based "Histogram" app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for the quantification of blood hematocrit under both equal and varying optical conditions. The rapid determination of blood hematocrits carries enormous information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  15. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    NASA Astrophysics Data System (ADS)

    Lee, Chien-Cheng; Huang, Shin-Sheng; Shih, Cheng-Yuan

    2010-12-01

    This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small sample size and ill-posed problems suffered from QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.
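
    The core RDA step, shrinking each class covariance towards the pooled covariance and towards a scaled identity so as to interpolate between QDA and LDA, is sketched below; the boosting wrapper, Gabor features and PSO parameter search from the paper are not reproduced:

```python
# Regularized discriminant analysis (RDA) sketch: Gaussian class models with a
# covariance that blends class, pooled, and identity terms via (lam, gamma).
import numpy as np

class RDA:
    def __init__(self, lam=0.5, gamma=0.1):
        self.lam, self.gamma = lam, gamma

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        n, d = X.shape
        pooled = np.zeros((d, d))
        self.means_, covs, self.priors_ = {}, {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            self.means_[c] = Xc.mean(axis=0)
            covs[c] = np.cov(Xc, rowvar=False)
            self.priors_[c] = len(Xc) / n
            pooled += covs[c] * (len(Xc) - 1)
        pooled /= (n - len(self.classes_))
        self.cov_inv_, self.logdet_ = {}, {}
        for c in self.classes_:
            cov = (1 - self.lam) * covs[c] + self.lam * pooled                          # shrink to pooled
            cov = (1 - self.gamma) * cov + self.gamma * np.trace(cov) / d * np.eye(d)   # ridge term
            self.cov_inv_[c] = np.linalg.inv(cov)
            self.logdet_[c] = np.linalg.slogdet(cov)[1]
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            diff = X - self.means_[c]
            maha = np.einsum('ij,jk,ik->i', diff, self.cov_inv_[c], diff)
            scores.append(-0.5 * (maha + self.logdet_[c]) + np.log(self.priors_[c]))
        return self.classes_[np.argmax(np.array(scores), axis=0)]

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(1.5, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)
print((RDA(lam=0.5, gamma=0.1).fit(X, y).predict(X) == y).mean())
```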

  16. Image-Based 3d Reconstruction and Analysis for Orthodontia

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of dental arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measuring tooth parameters and designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets placed on the teeth and a wire of a given shape clamped by these brackets, producing the forces necessary to move every tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and problems with applying the standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation, aimed at overcoming these disadvantages, is proposed. The proposed approach allows accurate measurement of the tooth parameters needed for adequate planning, design of the correct tooth positions, and monitoring of the treatment process. The developed technique applies photogrammetric methods to dental arch 3D model generation, bracket position determination and tooth movement analysis.

  17. Streaming support for data intensive cloud-based sequence analysis.

    PubMed

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.

  18. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    PubMed Central

    Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461

  19. Ultrasonic test of resistance spot welds based on wavelet package analysis.

    PubMed

    Liu, Jing; Xu, Guocheng; Gu, Xiaopeng; Zhou, Guanghao

    2015-02-01

    In this paper, ultrasonic testing of spot welds in stainless steel sheets is studied. It is indicated that traditional ultrasonic signal analysis in either the time domain or the frequency domain remains inadequate for evaluating the nugget diameter of spot welds. However, the method based on wavelet packet analysis in the time-frequency domain can easily distinguish the nugget from the corona bond by extracting high-frequency signals at different positions of the spot weld, thereby quantitatively evaluating the nugget diameter. The results of the ultrasonic test fit the actual measured values well. The mean of the normal distribution of the error statistics is 0.00187, and the standard deviation is 0.1392. Furthermore, the quality of the spot welds was evaluated, and it is shown that ultrasonic nondestructive testing based on wavelet packet analysis can be used to evaluate spot weld quality and is more reliable than a single destructive tensile test. Copyright © 2014 Elsevier B.V. All rights reserved.
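
    A hedged sketch of the wavelet-packet idea, assuming the PyWavelets package is available, is shown below: an A-scan is decomposed into frequency-ordered packet nodes and the high-band energy is compared between two synthetic traces standing in for nugget and corona-bond positions:

```python
# Wavelet packet decomposition of a 1-D signal and comparison of high-band energy.
# The signals are synthetic placeholders, not the paper's ultrasonic data.
import numpy as np
import pywt

def highband_energy(signal, wavelet="db4", level=4):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="freq")            # frequency-ordered leaf nodes
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    return energies[len(energies) // 2:].sum()           # energy in the upper half of the band

rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 2048, endpoint=False)
nugget_like = np.sin(2 * np.pi * 20 * t) + 0.05 * rng.normal(size=t.size)
corona_like = (np.sin(2 * np.pi * 20 * t) + 0.8 * np.sin(2 * np.pi * 400 * t)
               + 0.05 * rng.normal(size=t.size))
print(highband_energy(nugget_like), highband_energy(corona_like))
```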

  20. [Recognition of landscape characteristic scale based on two-dimension wavelet analysis].

    PubMed

    Gao, Yan-Ni; Chen, Wei; He, Xing-Yuan; Li, Xiao-Yu

    2010-06-01

    Three wavelet bases, i.e., Haar, Daubechies, and Symlet, were chosen to analyze the validity of two-dimensional wavelet analysis in recognizing the characteristic scales of the urban, peri-urban, and rural landscapes of Shenyang. Because the transform scale of the two-dimensional wavelet must be an integer power of 2, some characteristic scales cannot be accurately recognized. Therefore, the pixel resolution of the images was resampled to 3, 3.5, 4, and 4.5 m to densify the scales in the analysis. It was shown that two-dimensional wavelet analysis worked effectively in detecting characteristic scales. Haar, Daubechies, and Symlet were the optimal wavelet bases for the peri-urban, urban, and rural landscapes, respectively. Both the Haar basis and the Symlet basis performed well in recognizing the fine characteristic scale of the rural landscape and in detecting the boundary of the peri-urban landscape. The Daubechies basis and the Symlet basis could also be used to detect the boundaries of the urban and rural landscapes, respectively.

  1. A rule-based system for real-time analysis of control systems

    NASA Astrophysics Data System (ADS)

    Larson, Richard R.; Millard, D. Edward

    1992-10-01

    An approach to automate the real-time analysis of flight critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic of real-time data. This technique has been applied to ground verification and validation testing and flight test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.

  2. A rule-based system for real-time analysis of control systems

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.; Millard, D. Edward

    1992-01-01

    An approach to automate the real-time analysis of flight critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic of real-time data. This technique has been applied to ground verification and validation testing and flight test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.

  3. Priority of VHS Development Based in Potential Area using Principal Component Analysis

    NASA Astrophysics Data System (ADS)

    Meirawan, D.; Ana, A.; Saripudin, S.

    2018-02-01

    The current condition of VHS is still inadequate in quality, quantity and relevance. The purpose of this research is to analyse the development of VHS based on regional potential using principal component analysis (PCA) in Bandung, Indonesia. This study used descriptive qualitative data analysis, applying principal component reduction to secondary data. The method used is PCA with the Minitab statistical software. The results indicate that the areas with the lowest scores are the priority for the construction of new VHS, with programs of majors matched to the development of regional potential. Based on the PCA scores, the main priority for VHS development in Bandung is Saguling, which has the lowest PCA value of 416.92 in area 1, followed by Cihampelas with the lowest PCA value in region 2 and Padalarang with the lowest PCA value.
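
    The general PCA ranking workflow (standardize indicators, project onto principal components, rank regions by a composite score) can be sketched with scikit-learn as below; the region names and indicator matrix are synthetic placeholders, and the paper's own Minitab analysis is not reproduced:

```python
# PCA-based ranking of regions by a composite score; all input data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

region_names = ["Region A", "Region B", "Region C", "Region D"]   # hypothetical regions
indicators = np.array([                                           # rows: regions, cols: indicators
    [120, 35, 0.62, 14],
    [310, 80, 0.71, 40],
    [95,  22, 0.55, 10],
    [210, 60, 0.68, 25],
], dtype=float)

X = StandardScaler().fit_transform(indicators)
pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)
# Composite score: components weighted by their explained-variance ratio.
composite = scores @ pca.explained_variance_ratio_[:2]
priority = sorted(zip(region_names, composite), key=lambda r: r[1])  # lowest score = highest priority
print(priority)
```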

  4. Performance and analysis of MAC protocols based on application

    NASA Astrophysics Data System (ADS)

    Yadav, Ravi; Daniel, A. K.

    2018-04-01

    Wireless Sensor Networks (WSNs) are one of the most rapidly emerging technologies of recent decades. They cover a large application area, both civilian and military. A Wireless Sensor Network primarily consists of low-power, low-cost, multifunctional sensor nodes that collaborate and communicate via a wireless medium. The deployment of sensor nodes is ad hoc in nature, so the nodes organize themselves in such a way as to communicate with each other. These characteristics make WSNs a challenging research area. This paper gives an overview of the characteristics of WSNs, their architecture and contention-based MAC protocols. The paper presents an analysis of various protocols based on performance.

  5. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    PubMed Central

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-01-01

    This work presents a heart sound biometric system based on marginal spectrum analysis, which is a new feature extraction technique for identification purposes. This heart sound identification system is comprised of signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients result in a significant increase in the recognition rate of 94.40% compared with that of the traditional Fourier spectrum (84.32%) based on a database of 280 heart sounds from 40 participants. PMID:23429515

  6. Content and user-based music visual analysis

    NASA Astrophysics Data System (ADS)

    Guo, Xiaochun; Tang, Lei

    2015-12-01

    In recent years, people's ability to collect music has been greatly enhanced. Many people who prefer listening to music offline store thousands of songs on local storage or portable devices. However, their ability to deal with music information has not improved accordingly, which results in two problems. One is how to find favourite songs in a large music collection while satisfying different individuals. The other is how to compose a playlist quickly. To solve these problems, the authors propose a content- and user-based music visual analysis approach. We first developed a new recommendation algorithm based on the content of the music and the user's behaviour, which satisfies individual preferences. Then, we use visualization and interaction tools to illustrate the relationships between songs and help people compose a suitable playlist. At the end of this paper, a survey shows that our system is usable and effective.

  7. HISTORICAL ANALYSIS, A VALUABLE TOOL IN COMMUNITY-BASED ENVIRONMENTAL PROTECTION

    EPA Science Inventory

    A historical analysis of the ecological consequences of development can be a valuable tool in community-based environmental protection. These studies can engage the public in environmental issues and lead to informed decision making. Historical studies provide an understanding of...

  8. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  9. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  10. Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis

    NASA Astrophysics Data System (ADS)

    Chou, Hui-Yu; Yang, Jyh-Bin

    2017-10-01

    The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence; more information used in delay analysis usually produces more accurate and fairer analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As Building Information Modeling (BIM) techniques have developed rapidly, the use of BIM and 4D simulation techniques has been proposed and implemented, and obvious benefits have been achieved, especially in identifying and solving construction consequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be addressed by BIM techniques. The research results could provide a foundation for developing new approaches to resolving schedule delay disputes.

  11. Motif-based analysis of large nucleotide data sets using MEME-ChIP

    PubMed Central

    Ma, Wenxiu; Noble, William S; Bailey, Timothy L

    2014-01-01

    MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by CLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix-based discovery for high accuracy, and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP's interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928

  12. Design of microcontroller-based EMG and the analysis of EMG signals.

    PubMed

    Güler, Nihal Fatma; Hardalaç, Firat

    2002-04-01

    In this work, a microcontroller-based EMG system was designed and tested on 40 patients. With the patients at rest, fast Fourier transform (FFT) analysis was applied to EMG signals recorded from the right-leg peroneal region, and histograms were constructed from the results of the FFT analysis. The analysis shows that, for 30 patients, the amplitude of the fibrillation potentials of muscle fibres measured from the peroneal region is low and their duration is short, indicating degeneration of the motor nerves; the remaining 10 patients were found to be healthy.
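    The abstract describes recording EMG at rest and characterizing fibrillation potentials through FFT analysis and amplitude histograms. The sketch below is a minimal, hypothetical illustration of that kind of offline processing in Python; the sampling rate, bin count and synthetic signal are assumptions, not the authors' setup.

    ```python
    import numpy as np

    def emg_fft_histogram(emg, fs, n_bins=32):
        """FFT amplitude spectrum of an EMG segment plus a histogram of the
        spectral amplitudes (illustrative only, not the paper's firmware)."""
        emg = np.asarray(emg, dtype=float)
        emg = emg - emg.mean()                    # remove DC offset
        spectrum = np.abs(np.fft.rfft(emg))       # one-sided amplitude spectrum
        freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
        counts, edges = np.histogram(spectrum, bins=n_bins)
        return freqs, spectrum, counts, edges

    # Synthetic 1 s EMG-like trace sampled at 1 kHz (hypothetical data)
    fs = 1000
    t = np.arange(0, 1.0, 1.0 / fs)
    emg = 0.1 * np.random.randn(t.size) + 0.05 * np.sin(2 * np.pi * 80 * t)
    freqs, spectrum, counts, edges = emg_fft_histogram(emg, fs)
    ```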

  13. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic analysis method for OCT images intended for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was constructed. Subsequently, two kinds of quantitative methods, based on geometric features and on morphological features, were proposed, together with a retinal abnormality grading decision-making method used in the analysis and evaluation of multiple OCT images. The analysis process is illustrated in detail with four retinal OCT images of different abnormal degrees, and the final grading results verify that the method can distinguish abnormal severity and lesion regions. In a simulation with 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status: it obtains parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, allow abnormality judgment of the target image and provide a reference for disease diagnosis.

  14. Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools

    NASA Astrophysics Data System (ADS)

    Sánchez Pineda, A.

    2015-12-01

    We explore the potential of current web applications to create online interfaces that allow visualization, interaction and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services that allow students and researchers to perform fast and very useful cut-based analysis in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.

  15. Constructing storyboards based on hierarchical clustering analysis

    NASA Astrophysics Data System (ADS)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There are growing needs for quick preview of video contents, both to improve the accessibility of video archives and to reduce network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors derived from wavelet coefficients of video frames. Consistent reuse of the extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
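    As a rough illustration of the keyframe-selection step, the following sketch clusters per-frame feature vectors hierarchically and keeps the frame nearest each cluster centroid. The wavelet feature extraction itself is omitted, and the Ward linkage and random features are assumptions for the example, not the paper's configuration.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def select_keyframes(features, n_keyframes):
        """Hierarchically cluster per-frame feature vectors and return the index
        of the frame closest to each cluster centroid (one keyframe per cluster)."""
        Z = linkage(features, method="ward")
        labels = fcluster(Z, t=n_keyframes, criterion="maxclust")
        keyframes = []
        for c in np.unique(labels):
            idx = np.where(labels == c)[0]
            centroid = features[idx].mean(axis=0)
            d = np.linalg.norm(features[idx] - centroid, axis=1)
            keyframes.append(int(idx[np.argmin(d)]))
        return sorted(keyframes)

    # Hypothetical usage: 500 frames, 64-dimensional wavelet-derived features
    features = np.random.rand(500, 64)
    print(select_keyframes(features, n_keyframes=8))
    ```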

  16. Joint source based analysis of multiple brain structures in studying major depressive disorder

    NASA Astrophysics Data System (ADS)

    Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang

    2014-03-01

    We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and healthy controls. Moreover, in a leave-one-out cross-validation experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.

  17. Model-based Bayesian inference for ROC data analysis

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Bae, K. Ty

    2013-03-01

    This paper presents a study of model-based Bayesian inference applied to Receiver Operating Characteristic (ROC) data. The model is a simple version of a general non-linear regression model. Unlike the Dorfman model, it uses a probit link function with a binary (zero-one) covariate to express the binormal distributions in a single formula; the model also includes a scale parameter. Bayesian inference is implemented by a Markov Chain Monte Carlo (MCMC) method carried out with Bayesian inference Using Gibbs Sampling (BUGS). In contrast to classical statistical theory, the Bayesian approach treats model parameters as random variables characterized by prior distributions. With a substantial number of simulated samples generated by the sampling algorithm, posterior distributions of the parameters, and hence the parameters themselves, can be accurately estimated. MCMC-based BUGS adopts an Adaptive Rejection Sampling (ARS) protocol, which requires the probability density function (pdf) from which samples are drawn to be log-concave with respect to the targeted parameters. Our study corrects a common misconception and proves that the pdf of this regression model is log-concave with respect to its scale parameter. Therefore, the ARS requirement is satisfied, and a Gaussian prior, which is conjugate and possesses many analytic and computational advantages, is assigned to the scale parameter. A cohort of 20 simulated data sets and 20 simulations from each data set are used in our study. Output analysis and convergence diagnostics for the MCMC method are assessed with the CODA package. Models and methods using a continuous Gaussian prior and a discrete categorical prior are compared. Intensive simulations and performance measures are given to illustrate our practice in the framework of model-based Bayesian inference using the MCMC method.
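    As a much-simplified sketch of Bayesian estimation for a binormal ROC model, the code below samples the location and scale parameters with a random-walk Metropolis step and weak Gaussian priors. It stands in for the BUGS/ARS machinery described in the abstract and is not the authors' implementation; the priors, step size and simulated data are assumptions.

    ```python
    import numpy as np

    def log_post(mu, sigma, healthy, diseased, prior_sd=10.0):
        """Binormal ROC model: healthy scores ~ N(0, 1), diseased ~ N(mu, sigma^2),
        with weak Gaussian priors on mu and sigma (assumed for illustration)."""
        if sigma <= 0:
            return -np.inf
        ll = -0.5 * np.sum(healthy ** 2)
        ll += np.sum(-np.log(sigma) - 0.5 * ((diseased - mu) / sigma) ** 2)
        lp = -0.5 * (mu ** 2 + (sigma - 1.0) ** 2) / prior_sd ** 2
        return ll + lp

    def metropolis(healthy, diseased, n_iter=5000, step=0.1, seed=0):
        rng = np.random.default_rng(seed)
        mu, sigma = 1.0, 1.0
        lp = log_post(mu, sigma, healthy, diseased)
        samples = []
        for _ in range(n_iter):
            mu_p, sigma_p = mu + step * rng.normal(), sigma + step * rng.normal()
            lp_p = log_post(mu_p, sigma_p, healthy, diseased)
            if np.log(rng.uniform()) < lp_p - lp:      # Metropolis accept/reject
                mu, sigma, lp = mu_p, sigma_p, lp_p
            samples.append((mu, sigma))
        return np.array(samples)

    # Simulated rating data (hypothetical): 100 healthy and 100 diseased cases
    rng = np.random.default_rng(1)
    draws = metropolis(rng.normal(0, 1, 100), rng.normal(1.2, 1.4, 100))
    print(draws[1000:].mean(axis=0))                   # posterior means after burn-in
    ```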

  18. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings pre-existing knowledge into play, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transferred to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  19. Methods and approaches in the topology-based analysis of biological pathways

    PubMed Central

    Mitrea, Cristina; Taghavi, Zeinab; Bokanizad, Behzad; Hanoudi, Samer; Tagett, Rebecca; Donato, Michele; Voichiţa, Călin; Drăghici, Sorin

    2013-01-01

    The goal of pathway analysis is to identify the pathways significantly impacted in a given phenotype. Many current methods are based on algorithms that consider pathways as simple gene lists, dramatically under-utilizing the knowledge that such pathways are meant to capture. During the past few years, a plethora of methods claiming to incorporate various aspects of the pathway topology have been proposed. These topology-based methods, sometimes referred to as “third generation,” have the potential to better model the phenomena described by pathways. Although there is now a large variety of approaches used for this purpose, no review is currently available to offer guidance for potential users and developers. This review covers 22 such topology-based pathway analysis methods published in the last decade. We compare these methods based on: type of pathways analyzed (e.g., signaling or metabolic), input (subset of genes, all genes, fold changes, gene p-values, etc.), mathematical models, pathway scoring approaches, output (one or more pathway scores, p-values, etc.) and implementation (web-based, standalone, etc.). We identify and discuss challenges, arising both in methodology and in pathway representation, including inconsistent terminology, different data formats, lack of meaningful benchmarks, and the lack of tissue and condition specificity. PMID:24133454

  20. Using Computation Curriculum-Based Measurement Probes for Error Pattern Analysis

    ERIC Educational Resources Information Center

    Dennis, Minyi Shih; Calhoon, Mary Beth; Olson, Christopher L.; Williams, Cara

    2014-01-01

    This article describes how "curriculum-based measurement--computation" (CBM-C) mathematics probes can be used in combination with "error pattern analysis" (EPA) to pinpoint difficulties in basic computation skills for students who struggle with learning mathematics. Both assessment procedures provide ongoing assessment data…

  1. Towards a framework for agent-based image analysis of remote-sensing data

    PubMed Central

    Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera

    2015-01-01

    Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: Especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects’ properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets bear a high potential of transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which is exhaustively investigated in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA). PMID:27721916

  2. Web-based NGS data analysis using miRMaster: a large-scale meta-analysis of human miRNAs.

    PubMed

    Fehlmann, Tobias; Backes, Christina; Kahraman, Mustafa; Haas, Jan; Ludwig, Nicole; Posch, Andreas E; Würstle, Maximilian L; Hübenthal, Matthias; Franke, Andre; Meder, Benjamin; Meese, Eckart; Keller, Andreas

    2017-09-06

    The analysis of small RNA NGS data together with the discovery of new small RNAs is among the foremost challenges in life science. For the analysis of raw high-throughput sequencing data we implemented the fast, accurate and comprehensive web-based tool miRMaster. Our toolbox provides a wide range of modules for quantification of miRNAs and other non-coding RNAs, discovering new miRNAs, isomiRs, mutations, exogenous RNAs and motifs. Use-cases comprising hundreds of samples are processed in less than 5 h with an accuracy of 99.4%. An integrative analysis of small RNAs from 1836 data sets (20 billion reads) indicated that context-specific miRNAs (e.g. miRNAs present only in one or few different tissues / cell types) still remain to be discovered while broadly expressed miRNAs appear to be largely known. In total, our analysis of known and novel miRNAs indicated nearly 22 000 candidates of precursors with one or two mature forms. Based on these, we designed a custom microarray comprising 11 872 potential mature miRNAs to assess the quality of our prediction. MiRMaster is a convenient-to-use tool for the comprehensive and fast analysis of miRNA NGS data. In addition, our predicted miRNA candidates provided as custom array will allow researchers to perform in depth validation of candidates interesting to them. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. Hydrological landscape analysis based on digital elevation data

    NASA Astrophysics Data System (ADS)

    Seibert, J.; McGlynn, B.; Grabs, T.; Jensco, K.

    2008-12-01

    Topography is a major factor controlling both hydrological and soil processes at the landscape scale. While this is well accepted qualitatively, quantifying relationships between topography and spatial variations of hydrologically relevant variables at the landscape scale remains a challenging research topic. In this presentation, we describe hydrological landscape analysis (HLA) as a way to derive relevant topographic indices that describe the spatial variations of hydrological variables at the landscape scale. We demonstrate our HLA approach with four high-resolution digital elevation models (DEMs) from Sweden, Switzerland and Montana (USA). To investigate scale effects on HLA metrics, we compared DEMs of different resolutions. These LiDAR-derived DEMs of 3 m, 10 m, and 30 m resolution represent catchments of ~5 km2 ranging from low to high relief. A central feature of HLA is the flowpath-based analysis of topography and the separation of hillslopes, riparian areas, and the stream network. We included the following metrics: riparian area delineation, riparian buffer potential, separation of stream inflows into right- and left-bank components, travel time proxies based on flowpath distances and gradients to the channel, and, as a hydrologic analogue to the hypsometric curve, the distribution of elevations above the stream network (computed from the location where a given flow pathway enters the stream). Several of these indices depended clearly on DEM resolution, whereas the effect was minor for others. While the hypsometric curves were all S-shaped, the 'hillslope-hypsometric curves' had the shape of a power function with exponents less than 1. In a similar way we separated flow pathway lengths and gradients between hillslopes and streams and compared a topographic travel time proxy based on the integration of gradients along the flow pathways. Besides the comparison of HLA metrics for different catchments and DEM resolutions we present

  4. A program to form a multidisciplinary data base and analysis for dynamic systems

    NASA Technical Reports Server (NTRS)

    Taylor, L. W.; Suit, W. T.; Mayo, M. H.

    1984-01-01

    Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base and analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.

  5. Cloud-based data-proximate visualization and analysis

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2017-04-01

    The rise of cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge then becomes creating tools that are cloud-ready. The solution to this challenge is provided by Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, a netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and includes a brief discussion of the underlying technologies involved.

  6. Alternative model for administration and analysis of research-based assessments

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.

    2016-06-01

    Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.

  7. Smartphone-based colorimetric analysis for detection of saliva alcohol concentration.

    PubMed

    Jung, Youngkee; Kim, Jinhee; Awofeso, Olumide; Kim, Huisung; Regnier, Fred; Bae, Euiwon

    2015-11-01

    A simple device and associated analytical methods are reported. We provide objective and accurate determination of saliva alcohol concentrations using smartphone-based colorimetric imaging. The device utilizes any smartphone with a miniature attachment that positions the sample and provides constant illumination for sample imaging. Analyses of histograms based on channel imaging of red-green-blue (RGB) and hue-saturation-value (HSV) color space provide unambiguous determination of blood alcohol concentration from color changes on sample pads. A smartphone-based sample analysis by colorimetry was developed and tested with blind samples that matched with the training sets. This technology can be adapted to any smartphone and used to conduct color change assays.
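    A minimal sketch of the colorimetric idea follows: summarize the sample-pad region in RGB and HSV space and map one channel to concentration through a linear calibration. The calibration coefficients, chosen channel and random pad image are hypothetical and only illustrate the type of analysis the abstract describes.

    ```python
    import colorsys
    import numpy as np

    def pad_features(rgb_pixels):
        """Mean RGB (scaled to 0-1) and mean HSV of a sample-pad image region."""
        rgb = np.asarray(rgb_pixels, dtype=float).reshape(-1, 3) / 255.0
        mean_rgb = rgb.mean(axis=0)
        return mean_rgb, np.array(colorsys.rgb_to_hsv(*mean_rgb))

    def estimate_concentration(rgb_pixels, slope=-0.25, intercept=0.08):
        """Toy linear calibration from the green channel to alcohol concentration;
        slope and intercept are made-up placeholders, not measured values."""
        mean_rgb, _ = pad_features(rgb_pixels)
        return slope * mean_rgb[1] + intercept

    # Hypothetical 20x20 crop of the reagent pad
    pad = np.random.randint(80, 200, size=(20, 20, 3))
    print(estimate_concentration(pad))
    ```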

  8. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  9. Application of gas chromatography to analysis of spirit-based alcoholic beverages.

    PubMed

    Wiśniewska, Paulina; Śliwińska, Magdalena; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek

    2015-01-01

    Spirit-based beverages are alcoholic drinks; their production processes are dependent on the type and origin of raw materials. The composition of this complex matrix is difficult to analyze, and scientists commonly choose gas chromatography techniques for this reason. With a wide selection of extraction methods and detectors it is possible to provide qualitative and quantitative analysis for many chemical compounds with various functional groups. This article describes different types of gas chromatography techniques and their most commonly used associated extraction techniques (e.g., LLE, SPME, SPE, SFE, and SBME) and detectors (MS, TOFMS, FID, ECD, NPD, AED, O or EPD). Additionally, brief characteristics of internationally popular spirit-based beverages and application of gas chromatography to the analysis of selected alcoholic drinks are presented.

  10. Recurrence quantity analysis based on singular value decomposition

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2017-05-01

    The recurrence plot (RP) has become a powerful tool in many different sciences over the last three decades. To quantify the complexity and structure of an RP, recurrence quantification analysis (RQA) has been developed based on measures of recurrence density, diagonal lines, vertical lines and horizontal lines. This paper studies the RP from a new perspective, based on singular value decomposition. The principal singular value proportion (PSVP) is proposed as a new RQA measure: a larger PSVP indicates higher complexity for a system, whereas a smaller PSVP reflects a regular and stable system. Considering the advantage of this method in detecting the complexity and periodicity of systems, several simulated and real data experiments are chosen to examine the performance of this new RQA measure.
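    The sketch below re-implements the measure in a simplified form, assuming PSVP is the largest singular value of the binary recurrence matrix divided by the sum of all singular values; the embedding dimension, delay and recurrence threshold are illustrative choices rather than the paper's settings.

    ```python
    import numpy as np

    def psvp(series, dim=3, delay=1, eps=0.2):
        """Principal singular value proportion of a recurrence plot: embed the
        series, threshold pairwise distances into a binary recurrence matrix,
        then take s_max / sum(s) of its singular values."""
        x = np.asarray(series, dtype=float)
        n = len(x) - (dim - 1) * delay
        emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
        dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        rp = (dists <= eps).astype(float)            # recurrence plot
        s = np.linalg.svd(rp, compute_uv=False)      # singular values
        return s[0] / s.sum()

    # Compare a periodic signal with a noisy one (hypothetical example)
    t = np.linspace(0, 20 * np.pi, 400)
    print(psvp(np.sin(t)), psvp(np.random.randn(400)))
    ```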

  11. Topic model-based mass spectrometric data analysis in cancer biomarker discovery studies.

    PubMed

    Wang, Minkun; Tsai, Tsung-Heng; Di Poto, Cristina; Ferrarini, Alessia; Yu, Guoqiang; Ressom, Habtom W

    2016-08-18

    A fundamental challenge in the quantitation of biomolecules for cancer biomarker discovery is the heterogeneous nature of human biospecimens. Although this issue has been a subject of discussion in cancer genomic studies, it has not yet been rigorously investigated in mass spectrometry based proteomic and metabolomic studies. Purification of mass spectrometric data is highly desirable prior to subsequent analysis, e.g., quantitative comparison of the abundance of biomolecules in biological samples. We investigated topic models to computationally analyze mass spectrometric data, considering both integrated peak intensities and scan-level features, i.e., extracted ion chromatograms (EICs). Probabilistic generative models enable flexible representation of the data structure and infer sample-specific pure resources, and scan-level modeling helps alleviate information loss during data preprocessing. We evaluated the capability of the proposed models to capture mixture proportions of contaminants and cancer profiles on LC-MS based serum proteomic and GC-MS based tissue metabolomic datasets acquired from patients with hepatocellular carcinoma (HCC) and liver cirrhosis, as well as on synthetic data we generated based on the serum proteomic data. The results obtained from the synthetic data demonstrated that both intensity-level and scan-level purification models can accurately infer the mixture proportions and the underlying true cancerous sources with small average error ratios (<7 %) between estimation and ground truth. By applying the topic model-based purification to mass spectrometric data, we found more proteins and metabolites with significant changes between HCC cases and cirrhotic controls. Candidate biomarkers selected after purification yielded biologically meaningful pathway analysis results and improved disease discrimination power, in terms of the area under the ROC curve, compared with the results found prior to purification. We investigated topic model-based

  12. Parametric Design and Mechanical Analysis of Beams based on SINOVATION

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.

    2017-07-01

    In engineering practice, engineers need to carry out complicated calculations when the loads on a beam are complex; the analysis and calculation take a lot of time and the results can be unreliable. Therefore, VS2005 and ADK were used to develop software for beam design, based on the 3D CAD software SINOVATION, in the C++ programming language. The software can perform mechanical analysis and parameterized design of various types of beams and output design reports in HTML format. The efficiency and reliability of beam design are thereby improved.

  13. Sources and Nature of Cost Analysis Data Base Reference Manual.

    DTIC Science & Technology

    1983-07-01

    [Only fragments of the scanned report documentation page are recoverable: an interim report (update) reference manual, report number USAAVRADCOM TM 83-F-3, with sections including Data for Multiple Applications, a Glossary of Cost Analysis Terms, References, and a Bibliography.]

  14. Direct sequencing of mitochondrial DNA detects highly divergent haplotypes in blue marlin (Makaira nigricans).

    PubMed

    Finnerty, J R; Block, B A

    1992-06-01

    We were able to differentiate between species of billfish (Istiophoridae family) and to detect considerable intraspecific variation in the blue marlin (Makaira nigricans) by directly sequencing a polymerase chain reaction (PCR)-amplified, 612-bp fragment of the mitochondrial cytochrome b gene. Thirteen variable nucleotide sites separated blue marlin (n = 26) into 7 genotypes. On average, these genotypes differed by 5.7 base substitutions. A smaller sample of swordfish from an equally broad geographic distribution displayed relatively little intraspecific variation, with an average of 1.3 substitutions separating different genotypes. A cladistic analysis of blue marlin cytochrome b variants indicates two major divergent evolutionary lines within the species. The frequencies of these two major evolutionary lines differ significantly between Atlantic and Pacific ocean basins. This finding is important given that the Atlantic stocks of blue marlin are considered endangered. Migration from the Pacific can help replenish the numbers of blue marlin in the Atlantic, but the loss of certain mitochondrial DNA haplotypes in the Atlantic due to overfishing probably could not be remedied by an influx of Pacific fish because of their absence in the Pacific population. Fishery management strategies should attempt to preserve the genetic diversity within the species. The detection of DNA sequence polymorphism indicates the utility of PCR technology in pelagic fishery genetics.

  15. The phylogeny of mole crickets (Orthoptera: Gryllotalpoidea: Gryllotalpidae).

    PubMed

    Cadena-Castañeda, Oscar J

    2015-07-14

    The monophyly and phylogenetic relationships of the family Gryllotalpidae were investigated. Twenty-six in-group taxa, representing all known genera of Gryllotalpidae, were included in a cladistic analysis based on 89 morphological characters (including male genital characters). The different analyses of the resulting data matrix supported the monophyly of Scapteriscinae and Gryllotalpinae and of their internal groups. Subfamilies, tribes and genera of Gryllotalpidae are fully diagnosed and illustrated, and keys to their identification are provided. Four tribes are established (Indioscaptorini n. trib. (Scapteriscinae), Triamescaptorini n. trib., Gryllotalpellini n. trib. and Neocurtillini n. trib. (Gryllotalpinae)) and two others are fully delimited (Scapteriscini stat. rev. and Gryllotalpini stat. rev.). Two new genera (Neoscapteriscus n. gen. and Leptocurtilla n. gen.) are described, as well as seven new species: Gryllotalpella rehni n. sp., G. tindalei n. sp., G. lawrencei n. sp., Neocurtilla ingrischi n. sp., N. townsendi n. sp., Leptocurtilla juanmanueli n. sp. and L. chopardi n. sp. The following nomenclatural changes were made: all species previously placed in Scapteriscus s.l. are transferred to the new genus Neoscapteriscus, except Scapteriscus oxydactilus and S. headsi, which remain in Scapteriscus; Gryllotalpa chilensis reinst. stat.; and Leptocurtilla maranona n. comb. Finally, previous contributions on the phylogenetic relationships of mole crickets are contrasted with the results of this research.

  16. Oldest skeleton of a plesiadapiform provides additional evidence for an exclusively arboreal radiation of stem primates in the Palaeocene

    NASA Astrophysics Data System (ADS)

    Chester, Stephen G. B.; Williamson, Thomas E.; Bloch, Jonathan I.; Silcox, Mary T.; Sargis, Eric J.

    2017-05-01

    Palaechthonid plesiadapiforms from the Palaeocene of western North America have long been recognized as among the oldest and most primitive euarchontan mammals, a group that includes extant primates, colugos and treeshrews. Despite their relatively sparse fossil record, palaechthonids have played an important role in discussions surrounding adaptive scenarios for primate origins for nearly a half-century. Likewise, palaechthonids have been considered important for understanding relationships among plesiadapiforms, with members of the group proposed as plausible ancestors of Paromomyidae and Microsyopidae. Here, we describe a dentally associated partial skeleton of Torrejonia wilsoni from the early Palaeocene (approx. 62 Ma) of New Mexico, which is the oldest known plesiadapiform skeleton and the first postcranial elements recovered for a palaechthonid. Results from a cladistic analysis that includes new data from this skeleton suggest that palaechthonids are a paraphyletic group of stem primates, and that T. wilsoni is most closely related to paromomyids. New evidence from the appendicular skeleton of T. wilsoni fails to support an influential hypothesis based on inferences from craniodental morphology that palaechthonids were terrestrial. Instead, the postcranium of T. wilsoni indicates that it was similar to that of all other plesiadapiforms for which skeletons have been recovered in having distinct specializations consistent with arboreality.

  17. Structure-Based Phylogenetic Analysis of the Lipocalin Superfamily.

    PubMed

    Lakshmi, Balasubramanian; Mishra, Madhulika; Srinivasan, Narayanaswamy; Archunan, Govindaraju

    2015-01-01

    Lipocalins constitute a superfamily of extracellular proteins that are found in all three kingdoms of life. Although very divergent in their sequences and functions, they show remarkable similarity in their 3-D structures. Lipocalins bind and transport small hydrophobic molecules. Earlier sequence-based phylogenetic studies of lipocalins highlighted that they have a long evolutionary history; however, the molecular and structural basis of their functional diversity is not completely understood. The main objective of the present study is to understand the functional diversity of the lipocalins using a structure-based phylogenetic approach. The present study, with 39 protein domains from the lipocalin superfamily, suggests that the clusters of lipocalins obtained by structure-based phylogeny correspond well with their functional diversity. The detailed analysis of each of the clusters and sub-clusters reveals that the 39 lipocalin domains cluster according to their mode of ligand binding, even though the clustering was performed on the basis of gross domain structure. The outliers in the phylogenetic tree are often from single-member families. The structure-based phylogenetic approach also provides pointers for assigning putative functions to domains of unknown function in the lipocalin family. The approach employed in the present study can be used in the future for the functional identification of new lipocalin proteins and may be extended to other protein families whose members show poor sequence similarity but high structural similarity.

  18. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    PubMed

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-10-01

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.

  19. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture analysis based method for diagnosing the Basal Cell Carcinoma (BCC) skin cancer using optical images taken from the suspicious skin regions. We first extracted the Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images to BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
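    As a rough sketch of the feature-extraction stage, the code below computes a handful of grey-level co-occurrence (Haralick-type) texture features with scikit-image; the distances, angles and feature subset are assumptions, and the feature selection and MLP classification steps from the paper are omitted.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def cooccurrence_features(gray_image, distances=(1, 2), angles=(0, np.pi / 2)):
        """Contrast, homogeneity, energy and correlation from a GLCM of an
        8-bit grayscale lesion image (illustrative feature set only)."""
        img = np.asarray(gray_image, dtype=np.uint8)
        glcm = graycomatrix(img, distances=list(distances), angles=list(angles),
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "homogeneity", "energy", "correlation")
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    # Hypothetical 64x64 grayscale patch
    patch = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
    print(cooccurrence_features(patch).shape)   # 4 properties x 2 distances x 2 angles
    ```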

  20. Predicting epileptic seizures from scalp EEG based on attractor state analysis.

    PubMed

    Chu, Hyunho; Chung, Chun Kee; Jeong, Woorim; Cho, Kwang-Hyun

    2017-05-01

    Epilepsy is the second most common disease of the brain. Epilepsy makes it difficult for patients to live a normal life because it is difficult to predict when seizures will occur. If seizures could be predicted a reasonable period of time before their occurrence, epilepsy patients could take precautions against them and improve their safety and quality of life. In this paper, we investigate a novel seizure precursor based on attractor state analysis for seizure prediction. We analyze the transition process from the normal to the seizure attractor state and investigate a precursor phenomenon seen before reaching the seizure attractor state. From the results of this analysis, we define a quantified spectral measure in scalp EEG for seizure prediction. From scalp EEG recordings, the Fourier coefficients of six EEG frequency bands are extracted, and the defined spectral measure is computed from the coefficients for each half-overlapped 20-second-long window. The computed spectral measure is applied to seizure prediction using a low-complexity methodology. Within scalp EEG, we identified an early-warning indicator before an epileptic seizure occurs: approaching the bifurcation point that triggers the transition from the normal to the seizure state, the power spectral density of the low-frequency bands of the perturbation of an attractor in the EEG shows a relative increase. A low-complexity seizure prediction algorithm using this feature was evaluated on approximately 583 h of scalp EEG in which 143 seizures in 16 patients were recorded. With the test dataset, the proposed method showed high sensitivity (86.67%) with a false prediction rate of 0.367 per hour and an average prediction time of 45.3 min. A novel seizure prediction method using scalp EEG, based on attractor state analysis, shows potential for application with real epilepsy patients. This is the first study in which the seizure-precursor phenomenon of an epileptic seizure is investigated based on attractor-based
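    A simplified stand-in for the windowed spectral measure is sketched below: relative low-frequency power computed over half-overlapped 20-second windows. The band edges, window length and synthetic input are assumptions; the paper's actual attractor-based indicator is more elaborate.

    ```python
    import numpy as np

    def low_band_ratio(eeg, fs, win_sec=20.0, band=(0.5, 4.0)):
        """Relative power in a low-frequency band per half-overlapped window."""
        eeg = np.asarray(eeg, dtype=float)
        win = int(win_sec * fs)
        step = win // 2                                   # 50% overlap
        freqs = np.fft.rfftfreq(win, d=1.0 / fs)
        mask = (freqs >= band[0]) & (freqs < band[1])
        out = []
        for start in range(0, len(eeg) - win + 1, step):
            seg = eeg[start:start + win]
            power = np.abs(np.fft.rfft(seg - seg.mean())) ** 2
            out.append(power[mask].sum() / power.sum())
        return np.array(out)

    # Hypothetical 10-minute recording sampled at 256 Hz
    fs = 256
    sig = np.random.randn(10 * 60 * fs)
    print(low_band_ratio(sig, fs)[:5])
    ```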

  1. Mysql Data-Base Applications for Dst-Like Physics Analysis

    NASA Astrophysics Data System (ADS)

    Tsenov, Roumen

    2004-07-01

    The data and analysis model developed and being used in the HARP experiment for studying hadron production at CERN Proton Synchrotron is discussed. Emphasis is put on usage of data-base (DB) back-ends for persistent storing and retrieving "alive" C++ objects encapsulating raw and reconstructed data. Concepts of "Data Summary Tape" (DST) as a logical collection of DB-persistent data of different types, and of "intermediate DST" (iDST) as a physical "tag" of DST, are introduced. iDST level of persistency allows a powerful, DST-level of analysis to be performed by applications running on an isolated machine (even laptop) with no connection to the experiment's main data storage. Implementation of these concepts is considered.

  2. Economic analysis of transmission line engineering based on industrial engineering

    NASA Astrophysics Data System (ADS)

    Li, Yixuan

    2017-05-01

    Modern industrial engineering is applied to the technical and cost analysis of power transmission and transformation engineering, which can effectively reduce investment costs. First, the power transmission project is analyzed economically: based on the feasibility study of power transmission and transformation project investment, and on an economic analysis of the effect of the system, a proposal for the company's system cost management is put forward and the cost management system is optimized. Then, through cost analysis of the power transmission and transformation project, new situations arising in construction costs are identified, which is of guiding significance for further improving the cost management of power transmission and transformation projects. Finally, according to the present situation of power transmission project cost management, concrete measures to reduce the cost of power transmission projects are given from the two aspects of system optimization and technology optimization.

  3. Accuracy and Precision of Silicon Based Impression Media for Quantitative Areal Texture Analysis

    PubMed Central

    Goodall, Robert H.; Darras, Laurent P.; Purnell, Mark A.

    2015-01-01

    Areal surface texture analysis is becoming widespread across a diverse range of applications, from engineering to ecology. In many studies silicon based impression media are used to replicate surfaces, and the fidelity of replication defines the quality of data collected. However, while different investigators have used different impression media, the fidelity of surface replication has not been subjected to quantitative analysis based on areal texture data. Here we present the results of an analysis of the accuracy and precision with which different silicon based impression media of varying composition and viscosity replicate rough and smooth surfaces. Both accuracy and precision vary greatly between different media. High viscosity media tested show very low accuracy and precision, and most other compounds showed either the same pattern, or low accuracy and high precision, or low precision and high accuracy. Of the media tested, mid viscosity President Jet Regular Body and low viscosity President Jet Light Body (Coltène Whaledent) are the only compounds to show high levels of accuracy and precision on both surface types. Our results show that data acquired from different impression media are not comparable, supporting calls for greater standardisation of methods in areal texture analysis. PMID:25991505

  4. Imputation-Based Meta-Analysis of Severe Malaria in Three African Populations

    PubMed Central

    Band, Gavin; Le, Quang Si; Jostins, Luke; Pirinen, Matti; Kivinen, Katja; Jallow, Muminatou; Sisay-Joof, Fatoumatta; Bojang, Kalifa; Pinder, Margaret; Sirugo, Giorgio; Conway, David J.; Nyirongo, Vysaul; Kachala, David; Molyneux, Malcolm; Taylor, Terrie; Ndila, Carolyne; Peshu, Norbert; Marsh, Kevin; Williams, Thomas N.; Alcock, Daniel; Andrews, Robert; Edkins, Sarah; Gray, Emma; Hubbart, Christina; Jeffreys, Anna; Rowlands, Kate; Schuldt, Kathrin; Clark, Taane G.; Small, Kerrin S.; Teo, Yik Ying; Kwiatkowski, Dominic P.; Rockett, Kirk A.; Barrett, Jeffrey C.; Spencer, Chris C. A.

    2013-01-01

    Combining data from genome-wide association studies (GWAS) conducted at different locations, using genotype imputation and fixed-effects meta-analysis, has been a powerful approach for dissecting complex disease genetics in populations of European ancestry. Here we investigate the feasibility of applying the same approach in Africa, where genetic diversity, both within and between populations, is far more extensive. We analyse genome-wide data from approximately 5,000 individuals with severe malaria and 7,000 population controls from three different locations in Africa. Our results show that the standard approach is well powered to detect known malaria susceptibility loci when sample sizes are large, and that modern methods for association analysis can control the potential confounding effects of population structure. We show that the pattern of association around the haemoglobin S allele differs substantially across populations due to differences in haplotype structure. Motivated by these observations, we consider new approaches to association analysis that might prove valuable for multicentre GWAS in Africa: we relax the assumptions of SNP–based fixed effect analysis; we apply Bayesian approaches to allow for heterogeneity in the effect of an allele on risk across studies; and we introduce a region-based test to allow for heterogeneity in the location of causal alleles. PMID:23717212

  5. Spatial-temporal discriminant analysis for ERP-based brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2013-03-01

    Linear discriminant analysis (LDA) has been widely adopted to classify event-related potentials (ERPs) in brain-computer interfaces (BCIs). Good classification performance of an ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time, which may depress the system's practicability and cause user resistance to the BCI system. In this study, we introduce a spatial-temporal discriminant analysis (STDA) for ERP classification. As a multiway extension of LDA, the STDA method tries to maximize the discriminant information between target and nontarget classes by collaboratively finding two projection matrices, from the spatial and temporal dimensions; this effectively reduces the feature dimensionality in the discriminant analysis and hence significantly decreases the number of required training samples. The proposed STDA method was validated with dataset II of BCI Competition III and with a dataset recorded from our own experiments, and was compared with state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for validation. The superior classification performance when using few training samples shows that STDA is effective in reducing the system calibration time and improving classification accuracy, thereby enhancing the practicability of ERP-based BCIs.

  6. Time Series Analysis Based on Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
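    A minimal sketch of the running-window idea follows: two adjacent windows are compared with the Mann-Whitney U test and U is converted to a Z score. The normal approximation is used here in place of the Monte Carlo normalization mentioned in the abstract, and the window length is an assumption.

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    def running_mann_whitney_z(series, window):
        """Z-normalized Mann-Whitney U between adjacent moving windows."""
        x = np.asarray(series, dtype=float)
        n = window
        mean_u = n * n / 2.0
        sd_u = np.sqrt(n * n * (2 * n + 1) / 12.0)        # normal approximation
        z = []
        for i in range(len(x) - 2 * n + 1):
            u = mannwhitneyu(x[i:i + n], x[i + n:i + 2 * n],
                             alternative="two-sided").statistic
            z.append((u - mean_u) / sd_u)
        return np.array(z)

    # Hypothetical series with a shift in level halfway through
    series = np.concatenate([np.random.randn(100), np.random.randn(100) + 1.5])
    print(np.abs(running_mann_whitney_z(series, window=30)).max())
    ```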

  7. The Analysis of Image Segmentation Hierarchies with a Graph-based Knowledge Discovery System

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Cooke, Diane J.; Ketkar, Nikhil; Aksoy, Selim

    2008-01-01

    Currently available pixel-based analysis techniques do not effectively extract the information content from the increasingly available high spatial resolution remotely sensed imagery data. A general consensus is that object-based image analysis (OBIA) is required to effectively analyze this type of data. OBIA is usually a two-stage process; image segmentation followed by an analysis of the segmented objects. We are exploring an approach to OBIA in which hierarchical image segmentations provided by the Recursive Hierarchical Segmentation (RHSEG) software developed at NASA GSFC are analyzed by the Subdue graph-based knowledge discovery system developed by a team at Washington State University. In this paper we discuss our initial approach to representing the RHSEG-produced hierarchical image segmentations in a graphical form understandable by Subdue, and provide results on real and simulated data. We also discuss planned improvements designed to more effectively and completely convey the hierarchical segmentation information to Subdue and to improve processing efficiency.

  8. Image Based Biomarker of Breast Cancer Risk: Analysis of Risk Disparity among Minority Populations

    DTIC Science & Technology

    2013-03-01

    [Only fragments of the scanned report documentation page are recoverable. Title: Image Based Biomarker of Breast Cancer Risk: Analysis of Risk Disparity among Minority Populations; Principal Investigator: Fengshan Liu. The surviving abstract text reads: "identifying the prevalence of women with incomplete visualization of the breast. We developed a code to estimate the breast cancer risks using the" (truncated).]

  9. AVR Microcontroller-based automated technique for analysis of DC motors

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' using the AVR-series ATMega32 microcontroller. The card can be interfaced to a PC and calculates parameters such as motor losses and efficiency, and plots characteristics for dc motors. Presently, different tests and methods are available to evaluate motor parameters, but a single, universal, user-friendly automated set-up is discussed in this paper. This has been accomplished by designing data-acquisition and SCR bridge-firing hardware based on the AVR ATMega32 microcontroller. The hardware has the capability to drive phase-controlled rectifiers and acquire real-time values of current, voltage, temperature and motor speed. The various analyses feasible with the designed hardware are of immense importance for dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis and also for motor control applications.

  10. Analysis of Fat Intake Based on the US Department of ...

    EPA Pesticide Factsheets

    EPA released the final report, Analysis of Fat Intake Based on USDA’s 1994-1996, 1998 Continuing Survey of Food Intakes by Individuals (CSFII, Final Report). For this report, the EPA conducted an analysis of fat consumption across the U.S. population based on data derived from the U.S. Department of Agriculture's 1994-96, 1998 Continuing Survey of Food Intakes by Individuals (CSFII) and EPA's Food Commodity Intake Database (FCID). Percentiles of fat consumption were calculated on the basis of total mass and on a per-unit body-weight basis for 12 food categories and 98 demographic cohorts. In addition, dietary breakdown and fat intake percentiles were calculated for a subset of the sample population whose consumption of animal fats exceeded the 90th percentile within its age group. Many chemicals found in the environment tend to accumulate in fatty tissue. Assessing risks from these chemicals requires knowledge of dietary habits and the amount of fat present in various types of foods.

  11. Safety and efficacy of cell-based therapy on critical limb ischemia: A meta-analysis.

    PubMed

    Ai, Min; Yan, Chang-Fu; Xia, Fu-Chun; Zhou, Shuang-Lu; He, Jian; Li, Cui-Ping

    2016-06-01

    Critical limb ischemia (CLI) is a major health problem worldwide, affecting approximately 500-1000 people per million per annum. Cell-based therapy has given new hope for the treatment of limb ischemia. This study assessed the safety and efficacy of cell-based therapy for CLI treatment. We searched the PubMed, Embase and Cochrane databases through October 20, 2015, and selected controlled trials of cell-based therapy for CLI compared with cell-free treatment. We assessed the results by meta-analysis using a variety of outcome measures, as well as the association of mononuclear cell dosage with treatment effect by dose-response meta-analysis. Twenty-five trials were included. For the primary evaluation index, cell-based therapy significantly reduced the rate of major amputation (odds ratio [OR] 0.44, 95% confidence interval [CI] 0.32-0.60, P = 0.000) and significantly increased the rate of amputation-free survival (OR 2.80, 95% CI 1.70-4.61, P = 0.000). Trial sequential analysis indicated that an optimal sample size (n = 3374) is needed to detect a plausible treatment effect on all-cause mortality. Cell-based therapy significantly improves the ankle-brachial index, increases the rate of ulcer healing, increases the transcutaneous pressure of oxygen, reduces limb pain and improves movement ability. Subgroup analysis indicated that heterogeneity is caused by type of control, design bias and transplant route. In the dose-response analysis, there was no significant correlation between cell dosage and therapeutic effect. Cell-based therapy has a significant therapeutic effect on CLI, but randomized double-blind placebo-controlled trials are needed to improve the credibility of this conclusion. Assessment of all-cause mortality also requires a larger sample size to arrive at a strong conclusion. In the dose-response analysis, increasing the dosage of cell injections does not significantly improve the therapeutic effects of cell-based therapy. Copyright © 2016
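    For readers unfamiliar with how such pooled odds ratios are obtained, the sketch below performs a textbook inverse-variance fixed-effect meta-analysis on 2x2 trial counts. The input counts are hypothetical, and the review's own random-effects and trial-sequential analyses are not reproduced.

    ```python
    import numpy as np

    def fixed_effect_pooled_or(events_t, n_t, events_c, n_c):
        """Inverse-variance pooled odds ratio with a 95% CI from per-trial
        treatment/control event counts (fixed-effect model)."""
        a = np.asarray(events_t, float)
        b = np.asarray(n_t, float) - a
        c = np.asarray(events_c, float)
        d = np.asarray(n_c, float) - c
        log_or = np.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d     # variance of each log odds ratio
        w = 1 / var                             # inverse-variance weights
        pooled = np.sum(w * log_or) / np.sum(w)
        se = np.sqrt(1 / np.sum(w))
        return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

    # Hypothetical amputation counts from three trials (treated vs. control)
    print(fixed_effect_pooled_or([5, 8, 3], [50, 60, 40], [12, 15, 9], [48, 62, 41]))
    ```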

  12. Power quality analysis based on spatial correlation

    NASA Astrophysics Data System (ADS)

    Li, Jiangtao; Zhao, Gang; Liu, Haibo; Li, Fenghou; Liu, Xiaoli

    2018-03-01

    With industrialization and urbanization, the status of electricity in production and daily life is rising, so the prediction of power quality is of increasing potential significance. Traditional power quality analysis methods include power quality data compression, disturbance event pattern classification, and disturbance parameter calculation; under certain conditions, these methods can predict power quality. This paper analyses the temporal variation of power quality in one provincial power grid in China, and the spatial distribution of power quality is analyzed based on spatial autocorrelation. The paper aims to show that this geographic research approach is effective for mining the potential information in power quality data.

  13. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity.
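    The sketch below computes relative wavelet energies and a normalized wavelet entropy for one EEG epoch with PyWavelets; the wavelet family and decomposition level are illustrative choices, and the statistical-complexity measure discussed in the abstract is not included.

    ```python
    import numpy as np
    import pywt

    def wavelet_energy_entropy(signal, wavelet="db4", level=5):
        """Relative wavelet energies per decomposition band and the Shannon
        entropy of that energy distribution, normalized to [0, 1]."""
        coeffs = pywt.wavedec(np.asarray(signal, dtype=float), wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()                 # relative wavelet energies
        entropy = -np.sum(p * np.log(p + 1e-12))      # Shannon entropy over bands
        return p, entropy / np.log(len(p))

    # Hypothetical 4-second EEG epoch sampled at 256 Hz
    epoch = np.random.randn(4 * 256)
    rel_energy, wentropy = wavelet_energy_entropy(epoch)
    print(rel_energy.round(3), round(float(wentropy), 3))
    ```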

  14. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in the conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we elaborate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves lower computational time than rough set theory, by up to 3.9%. PMID:26928627

  15. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    PubMed

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares approaches. A Bayesian approach, implemented through Markov chain Monte Carlo (MCMC) analysis, can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
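
    The essence of the MCMC calibration described here is sampling a posterior over model parameters given a prior and a likelihood for the measured internal doses. The following toy Metropolis sampler, with an invented one-compartment model, prior and data, illustrates the mechanics; it is not the paper's bromate model.

    ```python
    # Toy Metropolis sampler calibrating the elimination rate of a one-compartment
    # kinetic model against synthetic concentration data. Everything is invented.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.array([0.5, 1, 2, 4, 8])
    data = 10 * np.exp(-0.4 * t) + rng.normal(0, 0.3, t.size)   # synthetic observations

    def log_post(k, sigma=0.3):
        if k <= 0:
            return -np.inf
        pred = 10 * np.exp(-k * t)
        log_lik = -0.5 * np.sum((data - pred) ** 2) / sigma**2
        log_prior = -0.5 * ((np.log(k) - np.log(0.5)) / 1.0) ** 2   # lognormal prior
        return log_lik + log_prior

    k, chain = 0.2, []
    for _ in range(5000):
        prop = k + rng.normal(0, 0.05)                 # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(k):
            k = prop                                   # accept
        chain.append(k)
    posterior = np.array(chain[1000:])                 # discard burn-in
    print(posterior.mean(), np.percentile(posterior, [2.5, 97.5]))
    ```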

  16. An Efficient Soft Set-Based Approach for Conflict Analysis.

    PubMed

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations and military operations. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we apply the proposed approach to a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves a computational time up to 3.9% lower than that of rough set theory.

  17. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    PubMed

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation
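
    Of the three MCDA methods, weighted linear combination (WLC) is the simplest to sketch, and the Monte Carlo step amounts to perturbing the criterion weights and recomputing the susceptibility map. The code below illustrates that idea with invented rasters and weights; it is not the authors' GIS workflow.

    ```python
    # Weighted linear combination over standardized criterion rasters, with Monte
    # Carlo perturbation of the weights to propagate weight uncertainty.
    import numpy as np

    rng = np.random.default_rng(42)
    slope, lithology, land_use = (rng.random((50, 50)) for _ in range(3))  # criteria in [0, 1]
    criteria = np.stack([slope, lithology, land_use])
    base_weights = np.array([0.5, 0.3, 0.2])

    maps = []
    for _ in range(200):                                   # Monte Carlo runs
        w = rng.normal(base_weights, 0.05).clip(min=0)
        w /= w.sum()                                       # weights must sum to 1
        maps.append(np.tensordot(w, criteria, axes=1))     # WLC susceptibility map
    maps = np.array(maps)
    mean_map, std_map = maps.mean(axis=0), maps.std(axis=0)  # susceptibility and its uncertainty
    ```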

  18. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC

  19. Conceptual bases of Christian, faith-based substance abuse rehabilitation programs: qualitative analysis of staff interviews.

    PubMed

    McCoy, Lisa K; Hermos, John A; Bokhour, Barbara G; Frayne, Susan M

    2004-09-01

    Faith-based substance abuse rehabilitation programs provide residential treatment for many substance abusers. To determine the key governing concepts of such programs, we conducted semi-structured interviews with a sample of eleven clinical and administrative staff referred to us by program directors at six Evangelical Christian, faith-based, residential rehabilitation programs representing two large, nationwide networks. Qualitative analysis using grounded theory methods examined how spirituality is incorporated into treatment and elicited key theories of addiction and recovery. Although the programs contain comprehensive secular components, their core activities are strongly rooted in a Christian belief system that informs their understanding of addiction and recovery and drives the treatment format. These governing conceptions, that addiction stems from attempts to fill a spiritual void through substance use and that recovery comes through salvation and a long-term relationship with God, provide an explicit, theory-driven model upon which the programs base their core treatment activities. Knowledge of these core concepts and practices should be helpful to clinicians in considering referrals to faith-based recovery programs.

  20. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  1. Effects-based strategy development through center of gravity and target system analysis

    NASA Astrophysics Data System (ADS)

    White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen

    2003-09-01

    This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.

  2. Space tug economic analysis study. Volume 2: Tug concepts analysis. Appendix: Tug design and performance data base

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.

  3. An insect-like mushroom body in a crustacean brain

    PubMed Central

    Wolff, Gabriella Hannah; Thoen, Hanne Halkinrud; Marshall, Justin; Sayre, Marcel E

    2017-01-01

    Mushroom bodies are the iconic learning and memory centers of insects. No previously described crustacean possesses a mushroom body as defined by strict morphological criteria although crustacean centers called hemiellipsoid bodies, which serve functions in sensory integration, have been viewed as evolutionarily convergent with mushroom bodies. Here, using key identifiers to characterize neural arrangements, we demonstrate insect-like mushroom bodies in stomatopod crustaceans (mantis shrimps). More than any other crustacean taxon, mantis shrimps display sophisticated behaviors relating to predation, spatial memory, and visual recognition comparable to those of insects. However, neuroanatomy-based cladistics suggesting close phylogenetic proximity of insects and stomatopod crustaceans conflicts with genomic evidence showing hexapods closely related to simple crustaceans called remipedes. We discuss whether corresponding anatomical phenotypes described here reflect the cerebral morphology of a common ancestor of Pancrustacea or an extraordinary example of convergent evolution. PMID:28949916

  4. Phylogeny of the order Choreotrichida (Ciliophora, Spirotricha, Oligotrichea) as inferred from morphology, ultrastructure, ontogenesis, and SSrRNA gene sequences

    PubMed Central

    Agatha, Sabine; Strüder-Kypke, Michaela C.

    2010-01-01

    The phylogeny within the order Choreotrichida is reconstructed using (i) morphologic, ontogenetic, and ultrastructural evidence for the cladistic approach and (ii) the small subunit ribosomal RNA (SSrRNA) gene sequences, including the new sequence of Rimostrombidium lacustris. The morphologic cladograms and the gene trees converge rather well for the Choreotrichida, demonstrating that hyaline and agglutinated loricae do not characterize distinct lineages, i.e., both lorica types can be associated with the most highly developed ciliary pattern. The position of Rimostrombidium lacustris within the family Strobilidiidae is corroborated by the genealogical analyses. The diagnosis of the genus Tintinnidium is improved, adding cytological features, and the genus is divided into two subgenera based on the structure of the somatic kineties. The diagnoses of the family Lohmanniellidae and the genus Lohmanniella are improved, and Rimostrombidium glacicolum Petz, Song and Wilbert, 1995 is affiliated. PMID:17166704

  5. A Diminutive New Tyrannosaur from the Top of the World

    PubMed Central

    Fiorillo, Anthony R.; Tykoski, Ronald S.

    2014-01-01

    Tyrannosaurid theropods were dominant terrestrial predators in Asia and western North America during the final stages of the Cretaceous. The known diversity of the group has dramatically increased in recent years with new finds, but overall understanding of tyrannosaurid ecology and evolution is based almost entirely on fossils from latitudes at or below southern Canada and central Asia. Remains of a new, relatively small tyrannosaurine were recovered from the earliest Late Maastrichtian (70-69 Ma) of the Prince Creek Formation on Alaska's North Slope. Cladistic analyses show the material represents a new tyrannosaurine species closely related to the highly derived Tarbosaurus+Tyrannosaurus clade. The new taxon inhabited a seasonally extreme high-latitude continental environment on the northernmost edge of Cretaceous North America. The discovery of the new form provides new insights into tyrannosaurid adaptability and evolution in an ancient greenhouse Arctic. PMID:24621577

  6. Bismuth-based electrochemical stripping analysis

    DOEpatents

    Wang, Joseph

    2004-01-27

    Method and apparatus for trace metal detection and analysis using bismuth-coated electrodes and electrochemical stripping analysis. Both anodic stripping voltammetry and adsorptive stripping analysis may be employed.

  7. First Miocene rodent from Lebanon provides the 'missing link' between Asian and African gundis (Rodentia: Ctenodactylidae).

    PubMed

    López-Antoñanzas, Raquel; Knoll, Fabien; Maksoud, Sibelle; Azar, Dany

    2015-08-07

    Ctenodactylinae (gundis) is a clade of rodents that experienced, in Miocene time, their greatest diversification and widest distribution. They expanded from the Far East, their area of origin, to Africa, which they entered from what would become the Arabian Peninsula. Questions concerning the origin of African Ctenodactylinae persist essentially because of a poor fossil record from the Miocene of Afro-Arabia. However, recent excavations in the Late Miocene of Lebanon have yielded a key taxon for our understanding of these issues. Proafricanomys libanensis nov. gen. nov. sp. shares a variety of dental characters with both the most primitive and derived members of the subfamily. A cladistic analysis demonstrates that this species is the sister taxon to a clade encompassing all but one of the African ctenodactylines, plus a southern European species of obvious African extraction. As such, Proafricanomys provides the 'missing link' between the Asian and African gundis.

  8. First Miocene rodent from Lebanon provides the 'missing link' between Asian and African gundis (Rodentia: Ctenodactylidae)

    PubMed Central

    López-Antoñanzas, Raquel; Knoll, Fabien; Maksoud, Sibelle; Azar, Dany

    2015-01-01

    Ctenodactylinae (gundis) is a clade of rodents that experienced, in Miocene time, their greatest diversification and widest distribution. They expanded from the Far East, their area of origin, to Africa, which they entered from what would become the Arabian Peninsula. Questions concerning the origin of African Ctenodactylinae persist essentially because of a poor fossil record from the Miocene of Afro-Arabia. However, recent excavations in the Late Miocene of Lebanon have yielded a key taxon for our understanding of these issues. Proafricanomys libanensis nov. gen. nov. sp. shares a variety of dental characters with both the most primitive and derived members of the subfamily. A cladistic analysis demonstrates that this species is the sister taxon to a clade encompassing all but one of the African ctenodactylines, plus a southern European species of obvious African extraction. As such, Proafricanomys provides the 'missing link' between the Asian and African gundis. PMID:26250050

  9. Trade-Off Analysis between Concerns Based on Aspect-Oriented Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Laurito, Abelyn Methanie R.; Takada, Shingo

    The identification of functional and non-functional concerns is an important activity during requirements analysis. However, there may be conflicts between the identified concerns, and they must be discovered and resolved through trade-off analysis. Aspect-Oriented Requirements Engineering (AORE) has trade-off analysis as one of its goals, but most AORE approaches do not actually offer support for trade-off analysis; they focus on describing concerns and generating their composition. This paper proposes an approach for trade-off analysis based on AORE using use cases and the Requirements Conflict Matrix (RCM) to represent compositions. RCM shows the positive or negative effect of non-functional concerns over use cases and other non-functional concerns. Our approach is implemented within a tool called E-UCEd (Extended Use Case Editor). We also show the results of evaluating our tool.

  10. White Matter Fiber-based Analysis of T1w/T2w Ratio Map.

    PubMed

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  11. White matter fiber-based analysis of T1w/T2w ratio map

    NASA Astrophysics Data System (ADS)

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D.; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    Purpose: To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. Background: The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. Methods: We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. Results: We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  12. A study of microstructural characteristics and differential thermal analysis of Ni-based superalloys

    NASA Technical Reports Server (NTRS)

    Aggarwal, M. D.; Lal, R. B.; Oyekenu, Samuel A.; Parr, Richard; Gentz, Stephen

    1989-01-01

    The objective of this work is to correlate the mechanical properties of the Ni-based superalloy MAR M246(Hf) used in the Space Shuttle Main Engine with its structural characteristics by systematic study of optical photomicrographs and differential thermal analysis. The authors developed a method of predicting the liquidus and solidus temperature of various nickel based superalloys (MAR-M247, Waspaloy, Udimet-41, polycrystalline and single crystals of CMSX-2 and CMSX-3) and comparing the predictions with the experimental differential thermal analysis (DTA) curves using Perkin-Elmer DTA 1700. The method of predicting these temperatures is based on the additive effect of the components dissolved in nickel. The results were compared with the experimental values.

  13. Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition

    NASA Astrophysics Data System (ADS)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso

    2005-04-01

    Human movement analysis is generally performed through the utilization of marker-based systems, which allow reconstructing, with high levels of accuracy, the trajectories of markers allocated on specific points of the human body. Marker-based systems, however, show some drawbacks that can be overcome by the use of video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre Decomposition, and a Principal Component Analysis Technique (PCA) is used to circumscribe the region of interest. Results obtained on both synthetic and experimental tests provide significant reduction of the computational costs, with no significant reduction of the tracking accuracy.

  14. Two-step rating-based 'double-faced applicability' test for sensory analysis of spread products as an alternative to descriptive analysis with trained panel.

    PubMed

    Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong

    2018-03-01

    Descriptive analysis with a trained sensory panel has thus far been the most well defined methodology to characterize various products. However, in practical terms, intensive training in descriptive analysis has been recognized as a serious defect. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested in terms of d'A for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Analysis on the workspace of palletizing robot based on AutoCAD

    NASA Astrophysics Data System (ADS)

    Li, Jin-quan; Zhang, Rui; Guan, Qi; Cui, Fang; Chen, Kuan

    2017-10-01

    In this paper, a four-degree-of-freedom articulated palletizing robot is used as the object of research. Based on an analysis of the overall configuration of the robot, the kinematic mathematical model is established by the D-H method to derive the workspace of the robot. To meet the needs of design and analysis, an AutoCAD-based 2D and 3D workspace simulation interface program for the palletizing robot is developed using AutoCAD secondary development technology and the AutoLisp language. Finally, using the AutoCAD plugin, the influence of the structural parameters on the shape and position of the workspace is analyzed by changing the parameters of the robot separately. This study lays the foundation for the design, control and planning of palletizing robots.
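
    The core computation behind such a workspace study is chaining Denavit-Hartenberg transforms for forward kinematics and sampling the joint space. The sketch below uses invented D-H parameters and joint limits for a hypothetical 4-DOF arm; it stands in for, and is not, the authors' AutoLisp program.

    ```python
    # Monte Carlo workspace approximation via standard Denavit-Hartenberg transforms.
    import numpy as np

    def dh(theta, d, a, alpha):
        """Standard Denavit-Hartenberg homogeneous transform."""
        ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
        return np.array([[ct, -st * ca,  st * sa, a * ct],
                         [st,  ct * ca, -ct * sa, a * st],
                         [0,        sa,       ca,      d],
                         [0,         0,        0,      1]])

    # Hypothetical 4-DOF articulated arm: (d, a, alpha) per joint, plus joint limits
    links = [(0.5, 0.1, np.pi / 2), (0.0, 0.8, 0.0), (0.0, 0.6, 0.0), (0.2, 0.0, 0.0)]
    limits = [(-np.pi, np.pi), (-np.pi / 2, np.pi / 2), (-np.pi / 2, np.pi / 2), (-np.pi, np.pi)]

    rng = np.random.default_rng(0)
    points = []
    for _ in range(20000):                              # sample random joint configurations
        T = np.eye(4)
        for (d, a, alpha), (lo, hi) in zip(links, limits):
            T = T @ dh(rng.uniform(lo, hi), d, a, alpha)
        points.append(T[:3, 3])                         # end-effector position
    points = np.array(points)                           # point cloud approximating the workspace
    ```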

  16. Performance of Koyna dam based on static and dynamic analysis

    NASA Astrophysics Data System (ADS)

    Azizan, Nik Zainab Nik; Majid, Taksiah A.; Nazri, Fadzli Mohamed; Maity, Damodar

    2017-10-01

    This paper discusses the performance of Koyna dam based on static pushover analysis (SPO) and incremental dynamic analysis (IDA). The SPO in this study considered two types of lateral load, inertial load and hydrodynamic load. The structure was analysed until damage appeared on the dam body. The IDA curves were developed based on seven ground motions with the following characteristics: (i) the distance from the epicenter is less than 15 km, (ii) the magnitude is equal to or greater than 5.5, and (iii) the PGA is equal to or greater than 0.15 g. All ground motions were converted to response spectra and scaled to the developed elastic response spectrum in order to match the characteristics of the ground motions to the soil type. The elastic response spectrum was developed for soil type B using Eurocode 8. The SPO and IDA methods are used to determine the limit states of the dam. The limit states proposed in this study are the yielding and ultimate states, identified based on the crack patterns produced on the structural model. The maximum crest displacements from both methods are compared to define the limit states of the dam. The crest displacement at the yielding state of Koyna dam is 23.84 mm and 44.91 mm at the ultimate state. The results can be used as a guideline for monitoring Koyna dam under seismic loading, considering both static and dynamic effects.

  17. Problem Based Learning: Cognitive and Metacognitive Processes during Problem Analysis.

    ERIC Educational Resources Information Center

    De Grave, W. S.; And Others

    1996-01-01

    To investigate whether problem-based learning leads to conceptual change, the cognitive and metacognitive processes of a group of medical students were studied during the problem analysis phase, and their verbal communication and thinking processes were analyzed. Stimulated recall of the thinking process during the discussion detected a conceptual…

  18. Constraints based analysis of extended cybernetic models.

    PubMed

    Mandli, Aravinda R; Venkatesh, Kareenhalli V; Modak, Jayant M

    2015-11-01

    The cybernetic modeling framework provides an interesting approach to model the regulatory phenomena occurring in microorganisms. In the present work, we adopt a constraints based approach to analyze the nonlinear behavior of the extended equations of the cybernetic model. We first show that the cybernetic model exhibits linear growth behavior under the constraint of no resource allocation for the induction of the key enzyme. We then quantify the maximum achievable specific growth rate of microorganisms on mixtures of substitutable substrates under various kinds of regulation and show its use in gaining an understanding of the regulatory strategies of microorganisms. Finally, we show that Saccharomyces cerevisiae exhibits suboptimal dynamic growth with a long diauxic lag phase when growing on a mixture of glucose and galactose and discuss its potential to achieve optimal growth with a significantly reduced diauxic lag period. The analysis carried out in the present study illustrates the utility of adopting a constraints based approach to understand the dynamic growth strategies of microorganisms. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Analysis of Computer Network Information Based on "Big Data"

    NASA Astrophysics Data System (ADS)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their own lives, but at the same time there are many network information problems that demand attention. This paper analyzes the information security of computer networks based on "big data" analysis and puts forward some solutions.

  20. Hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis method for mid-frequency analysis of built-up systems with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan

    2017-09-01

    Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing the evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by the focal element and basic probability assignment (BPA), and dealt with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by the probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burdens for the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.

  1. Hunter versus CIE color measurement systems for analysis of milk-based beverages.

    PubMed

    Cheng, Ni; Barbano, David M; Drake, Mary Anne

    2018-06-01

    The objective of our work was to determine the differences in sensitivity of Hunter and International Commission on Illumination (CIE) methods at 2 different viewer angles (2 and 10°) for measurement of whiteness, red/green, and blue/yellow color of milk-based beverages over a range of composition. Sixty combinations of milk-based beverages were formulated (2 replicates) with a range of fat level from 0.2 to 2%, true protein level from 3 to 5%, and casein as a percent of true protein from 5 to 80% to provide a wide range of milk-based beverage color. In addition, commercial skim, 1 and 2% fat high-temperature, short-time pasteurized fluid milks were analyzed. All beverage formulations were HTST pasteurized and cooled to 4°C before analysis. Color measurement viewer angle (2 vs. 10°) had very little effect on objective color measures of milk-based beverages with a wide range of composition for either the Hunter or CIE color measurement system. Temperature (4, 20, and 50°C) of color measurement had a large effect on the results of color measurement in both the Hunter and CIE measurement systems. The effect of milk beverage temperature on color measurement results was the largest for skim milk and the least for 2% fat milk. This highlights the need for proper control of beverage serving temperature for sensory panel analysis of milk-based beverages with very low fat content and for control of milk temperature when doing objective color analysis for quality control in manufacture of milk-based beverages. The Hunter system of color measurement was more sensitive to differences in whiteness among milk-based beverages than the CIE system, whereas the CIE system was much more sensitive to differences in yellowness among milk-based beverages. There was little difference between the Hunter and CIE system in sensitivity to green/red color of milk-based beverages. In defining milk-based beverage product specifications for objective color measures for dairy product
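
    For reference, the two color scales compared here are both computed from tristimulus XYZ values relative to a white point. The sketch below uses the commonly tabulated illuminant-C constants and an invented XYZ measurement; it is not the instrument software used in the study.

    ```python
    # Convert tristimulus XYZ to Hunter Lab and CIE L*a*b* (illuminant C, 2-degree observer).
    import numpy as np

    Xn, Yn, Zn = 98.07, 100.0, 118.22        # illuminant-C white point
    Ka, Kb = 175.0, 70.0                     # Hunter chromaticity constants for illuminant C

    def hunter_lab(X, Y, Z):
        yq = np.sqrt(Y / Yn)
        return 100 * yq, Ka * (X / Xn - Y / Yn) / yq, Kb * (Y / Yn - Z / Zn) / yq

    def cie_lab(X, Y, Z):
        def f(t):
            return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
        fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
        return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

    X, Y, Z = 80.0, 85.0, 95.0               # hypothetical milk-beverage measurement
    print(hunter_lab(X, Y, Z))
    print(cie_lab(X, Y, Z))
    ```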

  2. iScreen: Image-Based High-Content RNAi Screening Analysis Tools.

    PubMed

    Zhong, Rui; Dong, Xiaonan; Levine, Beth; Xie, Yang; Xiao, Guanghua

    2015-09-01

    High-throughput RNA interference (RNAi) screening has opened up a path to investigating functional genomics in a genome-wide pattern. However, such studies are often restricted to assays that have a single readout format. Recently, advanced image technologies have been coupled with high-throughput RNAi screening to develop high-content screening, in which one or more cell image(s), instead of a single readout, were generated from each well. This image-based high-content screening technology has led to genome-wide functional annotation in a wider spectrum of biological research studies, as well as in drug and target discovery, so that complex cellular phenotypes can be measured in a multiparametric format. Despite these advances, data analysis and visualization tools are still largely lacking for these types of experiments. Therefore, we developed iScreen (image-Based High-content RNAi Screening Analysis Tool), an R package for the statistical modeling and visualization of image-based high-content RNAi screening. Two case studies were used to demonstrate the capability and efficiency of the iScreen package. iScreen is available for download on CRAN (http://cran.cnr.berkeley.edu/web/packages/iScreen/index.html). The user manual is also available as a supplementary document. © 2014 Society for Laboratory Automation and Screening.

  3. Finite Element Analysis of a Copper Single Crystal Shape Memory Alloy-Based Endodontic Instruments

    NASA Astrophysics Data System (ADS)

    Vincent, Marin; Thiebaud, Frédéric; Bel Haj Khalifa, Saifeddine; Engels-Deutsch, Marc; Ben Zineb, Tarak

    2015-10-01

    The aim of the present paper is the development of endodontic Cu-based single crystal Shape Memory Alloy (SMA) instruments in order to eliminate the antimicrobial and mechanical deficiencies observed with conventional Nickel-Titanium (NiTi) SMA files. A thermomechanical constitutive law, already developed and implemented in a finite element code by our research group, is adopted for the simulation of the single crystal SMA behavior. The corresponding material parameters were identified starting from experimental results for a tensile test at room temperature. A computer-aided design geometry has been created and considered for a finite element structural analysis of the endodontic Cu-based single crystal SMA files. They are meshed with tetrahedral continuum elements to improve the computation time and the accuracy of results. The geometric parameters tested in this study are the length of the active blade, the rod length, the pitch, the taper, the tip diameter, and the rod diameter. For each set of adopted parameters, a finite element model is built and tested in combined bending-torsion loading in accordance with the ISO 3630-1 standard. The numerical analysis based on the finite element procedure allowed an optimal geometry suitable for Cu-based single crystal SMA endodontic files to be proposed. The same analysis was carried out for the classical NiTi SMA files and a comparison was made between the two kinds of files. It showed that Cu-based single crystal SMA files are less stiff than the NiTi files. The Cu-based endodontic files could be used to improve root canal treatments. However, the finite element analysis brought out the need for further investigation based on experiments.

  4. Automatic network coupling analysis for dynamical systems based on detailed kinetic models.

    PubMed

    Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich

    2005-10-01

    We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
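
    The step that determines the number of locally active modes is a singular value decomposition of a sensitivity matrix, with modes below an error tolerance discarded. The sketch below illustrates this on a random stand-in matrix; it is not the authors' algorithm.

    ```python
    # Count dominant dynamical modes from the SVD of a (stand-in) sensitivity matrix.
    import numpy as np

    rng = np.random.default_rng(3)
    S = rng.standard_normal((8, 8)) @ np.diag([5, 3, 1, 0.2, 0.05, 1e-3, 1e-4, 1e-5])

    U, sigma, Vt = np.linalg.svd(S)
    tol = 1e-2 * sigma[0]                       # relative error tolerance
    n_active = int(np.sum(sigma > tol))         # locally active modes -> minimal model dimension
    print(sigma.round(4), n_active)
    ```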

  5. Template based rotation: A method for functional connectivity analysis with a priori templates☆

    PubMed Central

    Schultz, Aaron P.; Chhatwal, Jasmeer P.; Huijbers, Willem; Hedden, Trey; van Dijk, Koene R.A.; McLaren, Donald G.; Ward, Andrew M.; Wigman, Sarah; Sperling, Reisa A.

    2014-01-01

    Functional connectivity magnetic resonance imaging (fcMRI) is a powerful tool for understanding the network level organization of the brain in research settings and is increasingly being used to study large-scale neuronal network degeneration in clinical trial settings. Presently, a variety of techniques, including seed-based correlation analysis and group independent components analysis (with either dual regression or back projection) are commonly employed to compute functional connectivity metrics. In the present report, we introduce template based rotation, a novel analytic approach optimized for use with a priori network parcellations, which may be particularly useful in clinical trial settings. Template based rotation was designed to leverage the stable spatial patterns of intrinsic connectivity derived from out-of-sample datasets by mapping data from novel sessions onto the previously defined a priori templates. We first demonstrate the feasibility of using previously defined a priori templates in connectivity analyses, and then compare the performance of template based rotation to seed based and dual regression methods by applying these analytic approaches to an fMRI dataset of normal young and elderly subjects. We observed that template based rotation and dual regression are approximately equivalent in detecting fcMRI differences between young and old subjects, demonstrating similar effect sizes for group differences and similar reliability metrics across 12 cortical networks. Both template based rotation and dual-regression demonstrated larger effect sizes and comparable reliabilities as compared to seed based correlation analysis, though all three methods yielded similar patterns of network differences. When performing inter-network and sub-network connectivity analyses, we observed that template based rotation offered greater flexibility, larger group differences, and more stable connectivity estimates as compared to dual regression and seed based

  6. Support vector machine learning-based fMRI data group analysis.

    PubMed

    Wang, Ze; Childress, Anna R; Wang, Jiongjiong; Detre, John A

    2007-07-15

    To explore the multivariate nature of fMRI data and to consider the inter-subject brain response discrepancies, a multivariate and brain response model-free method is fundamentally required. Two such methods are presented in this paper by integrating a machine learning algorithm, the support vector machine (SVM), and the random effect model. Without any brain response modeling, SVM was used to extract a whole brain spatial discriminance map (SDM), representing the brain response difference between the contrasted experimental conditions. Population inference was then obtained through the random effect analysis (RFX) or permutation testing (PMU) on the individual subjects' SDMs. Applied to arterial spin labeling (ASL) perfusion fMRI data, SDM RFX yielded lower false-positive rates in the null hypothesis test and higher detection sensitivity for synthetic activations with varying cluster size and activation strengths, compared to the univariate general linear model (GLM)-based RFX. For a sensory-motor ASL fMRI study, both SDM RFX and SDM PMU yielded similar activation patterns to GLM RFX and GLM PMU, respectively, but with higher t values and cluster extensions at the same significance level. Capitalizing on the absence of temporal noise correlation in ASL data, this study also incorporated PMU in the individual-level GLM and SVM analyses accompanied by group-level analysis through RFX or group-level PMU. Providing inferences on the probability of being activated or deactivated at each voxel, these individual-level PMU-based group analysis methods can be used to threshold the analysis results of GLM RFX, SDM RFX or SDM PMU.
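
    A linear SVM's weight vector can serve as the spatial discriminance map described here, and group-level inference can be approximated by permutation (sign-flipping) across subjects. The sketch below, on simulated voxel data with scikit-learn, illustrates that idea; it is not the authors' pipeline.

    ```python
    # Per-subject linear-SVM discriminance maps, then a group-level sign-flip permutation test.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_subjects, n_voxels = 12, 500

    def subject_sdm(seed):
        """Fit a linear SVM on one subject's two-condition data; return the weight map."""
        r = np.random.default_rng(seed)
        X = r.standard_normal((40, n_voxels))
        y = np.repeat([0, 1], 20)
        X[y == 1, :50] += 0.8                       # weak 'activation' in the first 50 voxels
        clf = SVC(kernel='linear').fit(X, y)
        return clf.coef_.ravel()

    sdms = np.array([subject_sdm(s) for s in range(n_subjects)])

    observed = sdms.mean(axis=0)                    # group-mean discriminance map
    null_max = []
    for _ in range(1000):                           # sign-flipping across subjects
        signs = rng.choice([-1, 1], size=n_subjects)[:, None]
        null_max.append(np.abs((signs * sdms).mean(axis=0)).max())
    threshold = np.percentile(null_max, 95)         # FWE-style threshold on the max statistic
    significant_voxels = np.abs(observed) > threshold
    print(significant_voxels.sum())
    ```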

  7. Aviation Logistics in U.S. Pacific Command: A Cost-Based Analysis and Comparative Advantage to Commercial Shipment

    DTIC Science & Technology

    2012-12-01

    Aviation Logistics in U.S. Pacific Command: A Cost-Based Analysis and Comparative Advantage to Commercial Shipment, by Tod B. Diffey and Matthew J. Beck, December 2012. This study provides a cost-based analysis and qualitative evaluation regarding the use of commercial agencies and/or United States Marine Corps ...

  8. GIS-based Landing-Site Analysis and Passive Decision Support

    NASA Astrophysics Data System (ADS)

    van Gasselt, Stephan; Nass, Andrea

    2016-04-01

    The increase of surface coverage and the availability and accessibility of planetary data allow researchers and engineers to remotely perform detailed studies on surface processes and properties, in particular on objects such as Mars and the Moon for which Terabytes of multi-temporal data at multiple spatial resolution levels have become available during the last 15 years. Orbiters, rovers and landers have been returning information and insights into the surface evolution of the terrestrial planets in unprecedented detail. While rover- and lander-based analyses are one major research aim to obtain ground truth, resource exploration or even potential establishment of bases using autonomous platforms are others, and they require detailed investigation of settings in order to identify spots on the surface that are suitable for spacecraft to land and operate safely and over a long period of time. What has been done using hardcopy material in the past is today being carried out using either in-house developments or off-the-shelf spatial information system technology which allows to manage, integrate and analyse data as well as visualize and create user-defined reports for performing assessments. Usually, such analyses can be broken down (manually) by considering scientific wishes, engineering boundary conditions, potential hazards and various tertiary constraints. We here (1) review standard tasks of landing site analyses, (2) discuss issues inherently related to the analysis using integrated spatial analysis systems and (3) demonstrate a modular analysis framework for integration of data and for the evaluation of results from individual tasks in order to support decisions for landing-site selection.

  9. Effectiveness of problem-based learning in Chinese pharmacy education: a meta-analysis.

    PubMed

    Zhou, Jiyin; Zhou, Shiwen; Huang, Chunji; Xu, Rufu; Zhang, Zuo; Zeng, Shengya; Qian, Guisheng

    2016-01-19

    This review provides a critical overview of problem-based learning (PBL) practices in Chinese pharmacy education. PBL has yet to be widely applied in pharmaceutical education in China. The results of those studies that have been conducted are published in Chinese and thus may not be easily accessible to international researchers. Therefore, this meta-analysis was carried out to review the effectiveness of PBL. Databases were searched for studies in accordance with the inclusion criteria. Two reviewers independently performed the study identification and data extraction. A meta-analysis was conducted using Revman 5.3 software. Sixteen randomized controlled trials were included. The meta-analysis revealed that PBL had a positive association with higher theoretical scores (SMD = 1.17, 95% CI [0.77, 11.57], P < 0.00001). The questionnaire results show that PBL methods are superior to conventional teaching methods in improving students' learning interest, independent analysis skills, scope of knowledge, self-study, team spirit, and oral expression. This meta-analysis indicates that PBL pedagogy is superior to traditional lecture-based teaching in Chinese pharmacy education. PBL methods could be an optional, supplementary method of pharmaceutical teaching in China. However, Chinese pharmacy colleges and universities should revise PBL curricula according to their own needs, which would maximize the effectiveness of PBL.

  10. Laparoscopic surgery skills evaluation: analysis based on accelerometers.

    PubMed

    Sánchez, Alexis; Rodríguez, Omaira; Sánchez, Renata; Benítez, Gustavo; Pena, Romina; Salamo, Oriana; Baez, Valentina

    2014-01-01

    Technical skills assessment is considered an important part of surgical training. Subjective assessment is not appropriate for training feedback, and there is now increased demand for objective assessment of surgical performance. Economy of movement has been proposed as an excellent alternative for this purpose. The investigators describe a readily available method to evaluate surgical skills through motion analysis using accelerometers in Apple's iPod Touch device. Two groups of individuals with different minimally invasive surgery skill levels (experts and novices) were evaluated. Each group was asked to perform a given task with an iPod Touch placed on the dominant-hand wrist. The Accelerometer Data Pro application makes it possible to obtain movement-related data detected by the accelerometers. Average acceleration and maximum acceleration for each axis (x, y, and z) were determined and compared. The analysis of average acceleration and maximum acceleration showed statistically significant differences between groups on both the y (P = .04, P = .03) and z (P = .04, P = .04) axes. This demonstrates the ability to distinguish between experts and novices. The analysis of the x axis showed no significant differences between groups, which could be explained by the fact that the task involves few movements on this axis. Accelerometer-based motion analysis is a useful tool to evaluate laparoscopic skill development of surgeons and should be used in training programs. Validation of this device in an in vivo setting is a research goal of the investigators' team.
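
    The reported metrics reduce to per-axis average and maximum acceleration, compared between groups. The sketch below illustrates this on invented accelerometer traces with a rank-sum test; it is not the investigators' iPod application.

    ```python
    # Per-axis acceleration metrics for expert vs. novice trials, compared non-parametrically.
    import numpy as np
    from scipy import stats

    def axis_metrics(acc):
        """acc: (n_samples, 3) array of x, y, z accelerations for one task trial."""
        return np.abs(acc).mean(axis=0), np.abs(acc).max(axis=0)

    rng = np.random.default_rng(7)
    experts = [axis_metrics(rng.normal(0, 0.4, (2000, 3)))[0] for _ in range(10)]
    novices = [axis_metrics(rng.normal(0, 0.7, (2000, 3)))[0] for _ in range(10)]

    for axis, name in enumerate("xyz"):
        u, p = stats.mannwhitneyu([e[axis] for e in experts],
                                  [n[axis] for n in novices])
        print(f"{name}-axis average acceleration: U = {u:.0f}, p = {p:.3f}")
    ```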

  11. Agricultural Production: Task Analysis for Livestock Production. Competency-Based Education.

    ERIC Educational Resources Information Center

    Henrico County Public Schools, Glen Allen, VA. Virginia Vocational Curriculum Center.

    This task analysis guide is intended to help teachers and administrators develop instructional materials and implement competency-based education in the agricultural production program. Section 1 contains a validated task inventory for the livestock production portion of agricultural production IV and V. Tasks are divided into six duty areas:…

  12. Life-Cycle Analysis and Inquiry-Based Learning in Chemistry Teaching

    ERIC Educational Resources Information Center

    Juntunen, Marianne; Aksela, Maija

    2013-01-01

    The purpose of this design research is to improve the quality of environmental literacy and sustainability education in chemistry teaching through combining a socio-scientific issue, life-cycle analysis (LCA), with inquiry-based learning (IBL). This first phase of the cyclic design research involved 20 inservice trained chemistry teachers from…

  13. Flow-based analysis using microfluidics-chemiluminescence systems.

    PubMed

    Al Lawati, Haider A J

    2013-01-01

    This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominately used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidics channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.

  14. Postpartum sexual health: a principle-based concept analysis.

    PubMed

    O'Malley, Deirdre; Higgins, Agnes; Smith, Valerie

    2015-10-01

    The aim of this study is to report an analysis of the concept of postpartum sexual health. Postpartum sexual health is a minimally understood concept, most often framed within physical/biological dimensions or as a 'checklist' task in postpartum information provision. This has the potential to leave women unprepared to manage transient or normative sexual health changes after childbirth. For meaningful discussions, clarity and understanding of postpartum sexual health is required. A principle-based method of concept analysis. The databases of PubMed, CINAHL, Maternity and Infant Care, PsychInfo, Web of Science, EMBASE, SCOPUS and Social Science Index were systematically searched, from their earliest dates, using a combination of key terms, including; 'sexual health', 'sexual function', 'dyspareunia', 'sexuality', 'sexual desire', 'sexual dysfunction', 'postnatal' and 'postpartum', resulting in a final included dataset of 91 studies. Using the principle-based approach, postpartum sexual health was analysed under the four philosophical principles of epistemological, pragmatic, linguistic and logical. Philosophically, postpartum sexual health is underdeveloped as a concept. A precise theoretical definition remains elusive and, presently, postpartum sexual health cannot be separated theoretically from sexuality and sexual function. Identified antecedents include an instrument free birth, an intact perineum and avoidance of episiotomy. Attributes include sexual arousal, desire, orgasm, sexual satisfaction and resumption of sexual intercourse. Outcomes are sexual satisfaction and a satisfying intimate relationship with one's partner. Postpartum sexual health is conceptually immature with limited applicability in current midwifery practice. © 2015 John Wiley & Sons Ltd.

  15. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity means that different locations in an image are of different importance in terms of perception. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest. The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign a different importance to each location in the image. However, none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROIs were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed to reconstruct the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
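
    One way to make a full-reference metric ROI-aware, offered here only as an assumption consistent with the paper's argument rather than as its method, is to weight the SSIM map inside and outside the region of interest differently. The sketch below uses scikit-image on synthetic images and a hypothetical mask.

    ```python
    # ROI-weighted SSIM: weight the local SSIM map more heavily inside the ROI.
    import numpy as np
    from skimage.metrics import structural_similarity

    rng = np.random.default_rng(0)
    reference = rng.random((128, 128))
    distorted = np.clip(reference + rng.normal(0, 0.05, reference.shape), 0, 1)

    roi = np.zeros_like(reference, dtype=bool)
    roi[32:96, 32:96] = True                          # hypothetical region of interest

    _, ssim_map = structural_similarity(reference, distorted,
                                        data_range=1.0, full=True)
    score = 0.8 * ssim_map[roi].mean() + 0.2 * ssim_map[~roi].mean()
    print(score)
    ```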

  16. Space-based solar power conversion and delivery systems (study), engineering analysis

    NASA Technical Reports Server (NTRS)

    Nathan, C. A.

    1975-01-01

    A systems analysis of synchronous orbit-based power generation and relay systems that could be operational in the 1990s is described, along with a comparison with earth-based systems to be operational in the same time frame. Operational and economic requirements for the orbiting systems, and near-term research activities that will be required to assure feasibility, development, launch, and operational capabilities of such systems in the post-1990 time frame, are examined.

  17. Wavelet decomposition based principal component analysis for face recognition using MATLAB

    NASA Astrophysics Data System (ADS)

    Sharma, Mahesh Kumar; Sharma, Shashikant; Leeprechanon, Nopbhorn; Ranjan, Aashish

    2016-03-01

    For the realization of face recognition systems, both static and real-time, algorithms such as principal component analysis, independent component analysis, linear discriminant analysis, neural networks and genetic algorithms have been used for decades. This paper discusses a wavelet-decomposition-based principal component analysis approach to face recognition. Principal component analysis is chosen over other algorithms for its relative simplicity, efficiency, and robustness. Face recognition here means identifying a person from facial images, and the method resembles factor analysis in some sense, i.e. extraction of the principal components of an image. Principal component analysis suffers from some drawbacks, mainly poor discriminatory power and, in particular, the large computational load of finding eigenvectors. These drawbacks can be greatly reduced by combining wavelet-transform decomposition for feature extraction with principal component analysis for pattern representation and classification, analyzing the facial images jointly in the spatial and frequency domains. The experimental results indicate that this face recognition method achieves a significant percentage improvement in recognition rate as well as better computational efficiency.
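
    The pipeline described above (wavelet decomposition for feature extraction, then PCA for representation) can be sketched as follows. This is a minimal illustration under stated assumptions: the libraries (PyWavelets, scikit-learn), the 'haar' wavelet, the single decomposition level and the number of components are choices made for the example, not fixed by the paper.

```python
# Minimal sketch of wavelet-decomposition-based PCA for face images.
import numpy as np
import pywt
from sklearn.decomposition import PCA


def wavelet_pca_features(images, n_components=50):
    """images: array of shape (n_samples, height, width), grayscale faces.

    n_components must not exceed min(n_samples, n_features) of the
    decomposed images; 50 is an arbitrary choice for illustration.
    """
    approx = []
    for img in images:
        # 2-D DWT: keep only the low-frequency approximation sub-band,
        # which roughly halves each spatial dimension of the image.
        cA, (cH, cV, cD) = pywt.dwt2(img, 'haar')
        approx.append(cA.ravel())
    X = np.asarray(approx)
    pca = PCA(n_components=n_components)
    return pca.fit_transform(X), pca   # projected features + fitted model
```

    A new face would be decomposed the same way and projected with the fitted model's transform method before matching (e.g. nearest neighbour in the reduced feature space).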

  18. Exploratory Model Analysis of the Space Based Infrared System (SBIRS) Low Global Scheduler Problem

    DTIC Science & Technology

    1999-12-01

    solution. The non-linear least squares model is defined as Y = f(θ, t), where θ = M-element parameter vector, Y = N-element vector of all data, t ...
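
    The surviving fragment of this abstract only defines a non-linear least-squares model Y = f(θ, t). As a hedged illustration of fitting such a model, the sketch below uses scipy.optimize.least_squares; the exponential form of f and the synthetic data are hypothetical, since the thesis fragment does not specify the actual model.

```python
# Hypothetical example of a non-linear least-squares fit Y = f(theta, t).
import numpy as np
from scipy.optimize import least_squares


def f(theta, t):
    # theta: M-element parameter vector; t: independent variable
    a, b, c = theta
    return a * np.exp(-b * t) + c


def residuals(theta, t, y):
    # residual vector between the N-element data vector Y and the model
    return f(theta, t) - y


t = np.linspace(0.0, 10.0, 50)
y = f([2.0, 0.5, 1.0], t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
fit = least_squares(residuals, x0=[1.0, 1.0, 0.0], args=(t, y))
print(fit.x)   # estimated parameter vector theta
```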

  19. Brain Based Learning in Science Education in Turkey: Descriptive Content and Meta Analysis of Dissertations

    ERIC Educational Resources Information Center

    Yasar, M. Diyaddin

    2017-01-01

    This study performed a content analysis and meta-analysis of dissertations on brain-based learning in science education, in order to identify the general trends and tendencies of brain-based learning in science education and to determine the effect of such studies on learners' achievement and attitude, with the ultimate aim of raising awareness…

  20. Computer-based video analysis identifies infants with absence of fidgety movements.

    PubMed

    Støen, Ragnhild; Songstad, Nils Thomas; Silberg, Inger Elisabeth; Fjørtoft, Toril; Jensenius, Alexander Refsum; Adde, Lars

    2017-10-01

    Background: Absence of fidgety movements (FMs) at 3 months' corrected age is a strong predictor of cerebral palsy (CP) in high-risk infants. This study evaluates the association between computer-based video analysis and the temporal organization of FMs assessed with the General Movement Assessment (GMA). Methods: Infants were eligible for this prospective cohort study if referred to a high-risk follow-up program in a participating hospital. Video recordings taken at 10-15 weeks post-term age were used for GMA and computer-based analysis. The variation of the spatial center of motion, derived from differences between subsequent video frames, was used for quantitative analysis. Results: Of 241 recordings from 150 infants, 48 (24.1%) were classified with absence of FMs or sporadic FMs using the GMA. The variation of the spatial center of motion (C_SD) during a recording was significantly lower in infants with normal (0.320; 95% confidence interval (CI) 0.309, 0.330) vs. absent or sporadic (0.380; 95% CI 0.361, 0.398) FMs (P<0.001). A triage model with C_SD thresholds chosen for a sensitivity of 90% and a specificity of 80% gave a 40% referral rate for GMA. Conclusion: Quantitative video analysis during the FMs' period can be used to triage infants at high risk of CP to early intervention or observational GMA.
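
    As a rough illustration of the quantitative idea described above (the spatial center of motion derived from between-frame differences, and its variation over a recording), the sketch below computes a C_SD-like quantity from a video file. OpenCV, the difference threshold and the normalisation by frame size are assumptions made for the example; this is not the authors' exact pipeline.

```python
# Illustrative sketch: variation of the spatial centre of motion computed
# from frame differences. Threshold and normalisation are assumptions.
import cv2
import numpy as np


def centre_of_motion_variation(video_path, diff_threshold=15):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise ValueError("could not read video: %s" % video_path)
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    centres = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev)
        prev = gray
        ys, xs = np.nonzero(diff > diff_threshold)
        if xs.size:
            # centroid of changed pixels, normalised to [0, 1] per axis
            centres.append((xs.mean() / gray.shape[1],
                            ys.mean() / gray.shape[0]))
    cap.release()
    centres = np.asarray(centres)
    # C_SD analogue: spread of the centre of motion over the recording
    return float(centres.std(axis=0).mean()) if centres.size else 0.0
```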